In the Eyes of IDEMIA’s Vigilance Plan 2023: New Perspectives on Data Protection Impact Assessment Obligations for Big Tech

Introduction

IDEMIA is a France-based multinational technology company that provides biometric solutions to States and private entities in sectors such as telecommunications, finance, banking, public security, travel and digital assets. The firm supplied Kenya with biometric voter kits for the 2017 general elections. It was also poised to supply the technology platform for the referendum that was expected to follow the BBI initiative, which ultimately did not materialize. In 2019, the government engaged the firm to deliver biometric solutions for the national digital identity project dubbed “Huduma Namba”.

In 2020, Kenyan Non-Governmental Organizations (NGOs) obtained court orders suspending implementation of Huduma Namba for failure to conduct a data protection impact assessment (DPIA). Despite this momentous win, the Nubian Rights Forum, the Kenya Human Rights Commission and Data Rights took the further bold step of filing a case against IDEMIA, which had supplied the technology hardware for the project.

The case against IDEMIA was filed in a Paris court in 2021 to further protect the rights of marginalized individuals in Kenya affected by digital identity technology. The lawsuit, which followed a partial victory in the Kenyan High Court, reflected the NGOs’ recognition of the full scope of the problem and of the legal avenues available to address its root causes. This article explores how this bold move has introduced new insights into the DPIA obligations of foreign tech giants, both internally and across their supply chains, which include contracting companies and governments. It also highlights the implications of these insights for ongoing advocacy efforts and the push for rights-respecting DPIAs in Kenya.

Data protection impact assessment obligation

The DPIA is a new obligation under Article 35 of the General Data Protection Regulation (GDPR), which the European Union adopted in 2016. The obligation is both a variant of the earlier privacy impact assessment practice and an improvement on the prior-checking regime under the former European Data Protection Directive of 1995, which the GDPR has now replaced.

In Kenya, the DPIA obligation has been adopted into the National Data Protection Policy and the Data Protection Act 2019. The obligation applies to data controllers and data processors alike. While a DPIA is recommended as good practice in data processing and as part of data protection by design and by default, it becomes mandatory when controllers and processors engage in high-risk processing. The categories of processing activities deemed by law to give rise to high risk are listed in the Data Protection (General) Regulations 2021 and the Office of the Data Protection Commissioner’s Guidance Note on Data Protection Impact Assessment 2022. For the purposes of the present case, it is important to note that the processing of biometric data, data of vulnerable individuals, and other sensitive personal data, all common in digital identity projects, falls within the categories of processing operations that must undergo a DPIA.

Shifting focus to business and DPIA obligations

A previous blog article analyzed the import of the decision of the High Court of Kenya in Ex Parte Katiba Institute & Another, in which the Court annulled the government’s decision to implement the digital identity project. The central issue in that discussion was the Court’s finding that the Kenyan government had failed to conduct a DPIA for the project and thus neglected to put in place sufficient safeguards for privacy and related rights. The article highlighted how the Court established that the DPIA requirement is part of the State’s duty to uphold the right to privacy, and concluded that the ruling set a unique legal precedent by retroactively applying the DPIA obligation to the actions of the State.

The situation is somewhat different for companies, particularly foreign ones, that supply technologies to Kenya. In cases involving human rights abuses related to digital projects, the foreign companies responsible for developing the technologies are rarely joined as parties to the legal proceedings. An example is the Bernard Murage case, in which a petition challenging thin-SIM technology was filed without including Taisys Technologies, despite the Taiwan-based company’s role in manufacturing and supplying the technology and the thin-SIMs. The same has been true of other recent cases challenging the digital identity project dubbed “Maisha Namba”. The only exception is the investigations into the activities of Worldcoin in Kenya.

This blog turns attention to IDEMIA, a company whose controversy differs in key ways from that of Worldcoin. The analysis that follows is significant for two main reasons. First, it expands the understanding of DPIA obligations for companies providing digital technologies in Kenya. A broader perspective is crucial, especially given that these companies are seldom held accountable for failing to conduct DPIAs. Second, it offers deeper insights into the DPIA obligations of foreign companies within the broader context of business and human rights. This context is relevant owing to the growing state-business partnerships in the development, testing, and deployment of technologies.

Allegations levelled against IDEMIA

In 2022, three NGOs took a bold step to change the direction of addressing data injustices in Kenya. The first was the Kenya Human Rights Commission, a flagship organization that promotes rights-centred governance at all levels. The second was the Nubian Rights Forum, which describes itself as the voice for the voiceless and the oppressed. The third was Data Rights, a European NGO that defends, enforces, and advances data rights in Europe.

The three NGOs formed a unique North-South alliance to challenge IDEMIA’s activities in the implementation of Huduma Namba. In 2020, after the company failed to respond adequately to formal notices, the NGOs filed a court case against it under the French supply chain law known as the Duty of Vigilance Law 2017. The NGOs alleged that IDEMIA failed to identify the human rights risks associated with its digital identity products when it supplied hardware for the Kenyan National Integrated Identity Management System (NIIMS) between 2019 and 2021. They further alleged that IDEMIA did not adequately assess or mitigate the human rights impact of NIIMS, and raised concerns that these shortcomings could result in significant data processing harms.

The grievances levelled by the NGOs concerned the discharge of obligations contained in the following human rights instruments:

  1. Pillar II of the United Nations Guiding Principles on Business and Human Rights 2011 (UNGPs). The mantra of this pillar requires business enterprises such as IDEMIA to respect human rights and to address adverse human rights impacts with which they are involved through their own activities or as a result of their business relationships. In line with this requirement, Principle 17 of the UNGPs calls on companies such as IDEMIA to conduct human rights due diligence (HRDD). This process involves assessing both actual and potential human rights impacts, taking necessary action to mitigate or prevent those impacts, and transparently communicating the steps taken to address them.

  2. The United Nations High Commissioner for Human Rights Report on the Right to Privacy in the Digital Age 2014 states that businesses like IDEMIA risk being complicit in human rights violations by States that use the technology they supply for surveillance. The report, adopted by the United Nations General Assembly, mandates that businesses implement adequate safeguards to mitigate such risks. Paragraph 44 specifically highlights that one of these safeguards should be the adoption of a clear policy statement demonstrating the company’s commitment to respecting human rights across all its operations. Additional safeguards include establishing mechanisms for meaningful transparency about potential risks and ensuring provisions for remediation where the assessment identifies harm.

  3. The United Nations High Commissioner for Human Rights Report on the Right to Privacy in the Digital Age 2018 adopts the mantra of Pillar II of the UNGPs and the scope of its application to business activities and relationships. Paragraph 42 of the report makes clear that the role of businesses is distinct from the responsibilities of States. Businesses therefore cannot justify their actions by citing the human rights record of the State receiving their technology, even if that State has a history of not respecting rights, including the right to privacy. The 2018 report also refines the concept of “risk of complicity” introduced in the 2014 report by proposing a “strict liability regime”, under which businesses that manufacture and sell technologies enabling unlawful or arbitrary intrusions are considered complicit in contributing to negative human rights impacts.

  4. The United Nations High Commissioner for Human Rights Report on the Right to Privacy in the Digital Age 2021 also affirms the HRDD obligations stated in Pillar II of the UNGPs. It makes a distinct contribution to the discourse by emphasizing, in paragraph 49, that due diligence should specifically consider the interests, circumstances, and potential impacts on vulnerable individuals, minorities, and marginalized groups.

Parisian court proceedings and determination

At the time of filing the case, the Nubian Rights Forum and the Katiba Institute, both Kenyan NGOs, had secured orders from the High Court in Nairobi stopping the government’s implementation of NIIMS and the Huduma Namba project. This followed the High Court’s declaration that the project was unconstitutional and illegal for failure to conduct a DPIA. The NGOs understood that domestic court rulings alone would not ensure IDEMIA’s commitment to respecting the privacy and related rights of citizens in Kenya. They recognized that true justice could be achieved only if IDEMIA also restructured its internal procedures and systems to effectively assess and mitigate the potential negative impacts of its technologies and business practices. With this in mind, the NGOs asked the Paris court to compel IDEMIA to take appropriate measures to evaluate and reduce the risks associated with its products. They also sought an order requiring IDEMIA to revise its Vigilance Plan to meet the relevant legal standards, some of which are outlined above.

Ultimately, the proceedings were referred to mediation. At the conclusion of the process, a significant outcome was achieved, which the Kenya Human Rights Commission hailed as a victory. IDEMIA agreed to revise its Vigilance Plan, a key framework guiding its due diligence processes, to enhance safeguards in digital identity management.

Further look at IDEMIA’s Vigilance Plan

Views on the adequacy of the Vigilance Plan vary. However, an examination of the IDEMIA Vigilance Plan 2023, the revised version developed after the conclusion of the mediation process, reveals the substantial influence of the Paris proceedings. The updated Vigilance Plan clearly demonstrates that:

  1. IDEMIA has established a Human Rights Committee tasked with reviewing the company’s business activities and their impact on human rights. This is in keeping with the corporate responsibility to respect human rights under Pillar II of the UNGPs and the reports of the United Nations High Commissioner for Human Rights.
  2. In section 6.2 of the Plan, IDEMIA reaffirms its commitment to upholding and promoting human rights in its business operations. Specifically, IDEMIA emphasizes its goal of adhering to established human rights standards, including the International Bill of Human Rights and the UNGPs.
  3. It is also remarkable that IDEMIA has committed to comply with the standards of protection of human and peoples’ rights under the African Charter on Human and Peoples’ Rights 1981. This is a massive win, as African instruments make unique and distinctive contributions to protecting human rights, including the recognition of collective rights, new rights and the right to sustainable development, all of which are relevant for effectively protecting privacy and related rights in the African context. Furthermore, section 6.2.1 of the revised Vigilance Plan emphasizes the company’s commitment to ensuring its projects comply with data privacy by default. This is key to embedding the DPIA obligation from the design stage and throughout the technology life cycle, since impact assessment is part of data privacy by default.
  4. Regarding digital identity products, the revised Vigilance Plan 2023 now recommends that systems involving the collection and use of biometric data undergo a DPIA. It envisages that such a DPIA should coexist with a human rights impact assessment. This hybrid approach aligns with what the Danish Institute for Human Rights recommends, and with what the Kenyan coalition of civil society and community-based organizations has been calling for in its demand for rights-respecting DPIAs during the implementation of digital identity projects.

Despite these key learnings, the NGOs that filed the case had mixed reactions, pointing to inadequacies in IDEMIA’s revised Vigilance Plan. This criticism may be justified. For example, section 6.2.1 of the revised Vigilance Plan 2023 does not explain how IDEMIA will ensure that this safeguard is implemented by the businesses with which it has relationships. That omission may be a lost opportunity for the company to describe how it intends to use its leverage in business relationships to reduce risks to privacy and related human rights. However, this should not overshadow some remarkable successes in influencing perspectives on implementing DPIA obligations in cross-border contexts. Notable highlights of these successes are set out below as key learnings.

Key Learnings

  1. Effective implementation of DPIA obligations requires leveraging the significant influence that big tech companies have within the supply chain. This underscores the importance of aligning human rights impact assessment requirements with DPIA obligations, ensuring that the former can bolster and reinforce the latter.

  2. Building on the previous point, business and human rights frameworks such as the UN Guiding Principles (UNGPs), national action plans, and supply chain regulations serve as valuable tools in advocating for the effective implementation of DPIAs by foreign companies that design, manufacture, or supply digital technologies.

  3. Business and human rights frameworks can complement national legislation in enforcing effective DPIA obligations, as demonstrated in the lawsuit against IDEMIA in the Paris court. Three specific conditions support this complementarity in the Kenyan context. First, Kenya has adopted the UNGPs through its 2019 National Action Plan on Business and Human Rights and was, in fact, the first country on the African continent to do so. Second, the standards in the UNGPs can be relied upon in disputes before courts or other dispute resolution fora. Lastly, Kenyan courts have accepted that foreign companies can be successfully sued in Kenya.

  4. NGOs play a critical role in advocating for the effective implementation of big tech’s DPIA obligations. As the IDEMIA case demonstrates, coordinated efforts and cross-border collaborations, such as North-South alliances, are essential to fulfilling this role. There, such advocacy led to a clear commitment to conduct DPIAs in accordance with both international and African regional human rights standards, as outlined in the African Charter on Human and Peoples’ Rights, which is a significant achievement.

Conclusion

The NGOs that filed an additional case against IDEMIA in the Paris court recognized that it was insufficient for the Kenyan courts to hold the State accountable for failing to conduct a DPIA. They understood that the lack of an effective DPIA was a broader issue rooted in the supply chain, particularly affecting multinational technology companies. To address this, they focused on holding IDEMIA accountable for its HRDD obligations, as the responsibility to respect human rights ultimately rested with the company. The revision of IDEMIA’s Vigilance Plan following the conclusion of the proceedings highlighted new opportunities for progress. Ultimately, the case has provided a fresh perspective on the understanding of DPIA obligations within the context of existing and evolving business and human rights frameworks.

Image used is from Yahoo Images.
