The Impact of the AI Divide on Vulnerable Groups

In the 21st century, models backed by big data and trained to make decision-making more efficient are creeping into daily life and common usage1. In industry and in government, Artificial Intelligence, which can be defined as a ‘cross-disciplinary approach to understanding, modeling, and replicating intelligence and cognitive processes by invoking various computational, mathematical, logical, mechanical, and even biological principles and devices’2, is being deployed to reach better conclusions, faster. The promise that AI holds is, however, marred by its equal ability to create new inequalities and perpetuate existing ones3. This danger is an offshoot of the broader inequalities already present in the Information and Communications Technology (ICT) sphere, more commonly known as the digital divide4.

The digital divide refers to the gap between those who can access ICTs and those who cannot5. The divide6 essentially underscores the different barriers to participation in ICTs faced by different groups, primarily due to a lack of access. Access is a prerequisite for the effective use of technology, of which AI forms a major part; the two are inextricably linked in that lacking access to ICTs in general limits access to AI specifically.

Those who are affected by the new and perpetuated inequalities resulting from the digital divide, and consequently the AI divide, are classified as vulnerable groups7. ‘Vulnerability’ denotes disproportionate harm to certain persons or groups depending on a manifold set of potential and actual risks8. These include marginalization, social exclusion, limited opportunities, abuse, prejudice, and discrimination9. Understanding ‘vulnerability’ as a concept requires a two-strand approach. The first is through pre-defined categories developed from groups observed to be at risk in various situations; these are set out in international and national policy documents and legal instruments. For instance, Article 20 of the Constitution of Kenya states that the State shall give priority to vulnerable groups in allocating resources, and identifies, among others, women, the elderly, persons with disabilities, children and the youth as vulnerable groups10. The second is to interrogate the interaction of social, political, economic, and psychological systems and harms, and the hierarchy of their beneficiaries11. For instance, a 2017 report on vulnerable and marginalised groups by the Ministry of Education expands the definition to include ex-combatants, internally displaced people and HIV/AIDS-affected individuals, and qualifies the categorisation of women as a vulnerable group by adding that their vulnerability applies only in some communities or societies12. In her book Vulnerable Groups in Health and Social Care, Mary Larkin widens the definition further to include those with mental illnesses, single parents and the homeless13.

In defining ‘vulnerable groups’ as the term applies to Artificial Intelligence, the first point of reference is the digital divide. AI-specific vulnerable groups risk being locked out of an ever-growing, ever more fundamental digital world. Barriers to digital access, whether barriers to entry or barriers to participation, connote a lack of access to the world at large, and differ based on factors including poor infrastructure, lack of technical know-how, social norms and structural power relations14. UNICEF has, for instance, identified children as a vulnerable group due to their exposure to AI that lacks adequate safety measures. The elderly are also particularly vulnerable where human services are replaced by care machines; this may increasingly harm their dignity, autonomy and privacy, and may also lead to their greater social exclusion.

This growing AI disequilibrium can be tackled across three main domains in which vulnerable groups are placed, with each domain feeding into the next:

  1. The Use Divide
  2. The Literacy Gap
  3. The Bias Problem

The Use Divide

This refers to inequalities resulting from a lack of the infrastructure that promotes connectivity, leading to a disparity between those who can use AI and those who cannot. In Africa, only 28.2% of people have access to the internet, and data costs vary per country, putting connectivity out of reach for those who are socio-economically disadvantaged or who have low connectivity because of their location. This hampers progress in AI in several ways. Firstly, it widens existing gaps in tech literacy, as those in places with the underlying infrastructure needed to maintain connectivity have the online resources to access opportunities and networks that encourage AI development15. Secondly, it disadvantages developers and software engineers who lack consistent connectivity, forcing them to incur higher costs in developing models or to seek alternative employment16. Thirdly, it puts AI developers in Africa at a competitive disadvantage by limiting the technically trained workforce, allowing bigger tech companies abroad to draw their data and resources from Africa, develop AI specific to African needs, and sell it back to Africa17.

The shortage of technological support for AI development has a significant financial impact on Africa, as it leaves the region at the mercy of developers largely based in the Global North. The consequences of this gap were evident in a PwC report which estimated that AI will expand the global economy by around 16 trillion dollars, with China and North America receiving 70% of the economic benefits. Meanwhile, Africa, Latin America and Asia are projected to receive less than 6% of the gains.

To counter this, international donors as well as governments in Africa are backing AI projects intended to assist vulnerable groups. For instance, NIOTEK, a hardware and Internet of Things start-up, was awarded a US$6,000 grant to develop and secure a database of volunteers who assist the elderly and vulnerable people living alone, in order to help lower their risk of infection from the novel coronavirus. Additionally, initiatives such as Project Loon are developing expanded connectivity solutions, using high-altitude internet balloons to reduce reliance on traditional infrastructure18.

The Literacy Gap

The literacy gap refers to language and digital literacy barriers. Owing to the default use of English as the language of the internet, as well as the dominance of AI development stemming from English-speaking countries, those who do not speak English are quite literally left out of the conversation. This limits their capacity to participate meaningfully in the development and deployment of AI and renders them vulnerable.

A lack of digital literacy raises risks in AI, such as a lack of informed consent, as persons who need to access essential services have no choice but to surrender their information for data collection19. It also means that decisions about using their data to effectively meet their needs are taken out of their hands. For example, vulnerable groups in Yemen were unable to receive food deliveries from the World Food Programme (WFP) after Yemen’s Houthi rebels opposed the implementation of a biometric registration system intended to better monitor the supply chain and reduce the diversion of food away from those populations. Ultimately, this worsened the crisis for the vulnerable groups.

Improving physical access to digital technologies is widely viewed as the main way of reducing the use divide20. Physical access is usually computed as ‘the number of PCs per capita as well as telephone lines per thousand’21. A clear example of this is the laptop initiative rolled out by the current government, which sought to provide every child in primary school with a laptop, so that each child in the country had an equal opportunity to gain digital literacy. Digital infrastructure can also serve traditionally underserved groups, for instance through accessible digital learning materials for children with disabilities. Powered by funding from the UNICEF Innovation Fund, Kenya became the first country to pilot the accessible digital textbook: a digitised textbook in the EPUB 3 format, designed following Universal Design for Learning principles and provided with multimedia overlays to support children with and without disabilities in their learning.
Natural Language Processing is also helping to address English hegemony in online services and activities. Masakhane is an open-source AI project using neural machine translation to translate African languages.
The literacy gap also renders the digitally illiterate vulnerable because they are at risk of losing their jobs to the automation brought on by Artificial Intelligence. They are primarily affected if they are not educated in the skills needed to transition or adapt to highly automated workplaces, rendering those who lack opportunities to retrain and re-skill vulnerable22.

The Bias Problem

AI has been shown to reflect the bias inherent in its training data, widening existing inequalities and placing vulnerable groups doubly at risk. AI models boast statistical rigor, yet the data used to train them often consists of stand-in data or proxies, which frequently produce results that are incorrect yet self-affirming. The models create a vicious feedback loop that manifests in unequal outcomes for different groups of people23. In one instance, this can mean being charged higher fees for a loan because the model’s data (which includes the consumer’s zip code and the behaviour of people in their age group) deems them a high risk, which in turn may result in the default of that loan. The lack of adequate feedback mechanisms to check the flaws in such models deepens the problem, as they are presumed adequate despite their negative and incorrect assumptions24.
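This feedback loop can be illustrated with a small, entirely hypothetical simulation (the functions, figures and neighbourhood labels below are invented for illustration, not drawn from any real lending model): a model prices loans off a proxy, the neighbourhood’s historical default rate, rather than individual behaviour; costlier loans produce more defaults; and those defaults become next year’s proxy data, so a small initial gap between two neighbourhoods widens with every round.

```python
# Hypothetical sketch of a proxy-driven feedback loop in loan pricing.
# All functions and numbers here are invented for illustration only.

def interest_rate(neighbourhood_default_rate):
    # The model prices the loan off a proxy (the neighbourhood's
    # historical default rate), not the individual borrower.
    return 0.05 + neighbourhood_default_rate

def default_probability(rate):
    # Costlier loans are harder to repay, so defaults rise with the rate.
    return min(1.0, rate * 1.2)

def simulate(initial_rates, years=5):
    # Each year's observed defaults become next year's proxy data.
    rates = dict(initial_rates)
    for _ in range(years):
        rates = {n: default_probability(interest_rate(r))
                 for n, r in rates.items()}
    return rates

# A small initial gap between two neighbourhoods...
start = {"A": 0.05, "B": 0.10}
end = simulate(start)

# ...widens every round: the model's own pricing manufactures the risk
# it claims to measure.
print(start, "->", end)
```

Run standalone, the gap between the two neighbourhoods more than doubles over five rounds even though no individual borrower’s behaviour was ever measured, which is the self-affirming property the passage above describes.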

For instance, a machine learning technology designed to diagnose Alzheimer’s from auditory tests was found to contain bias: it only worked for English speakers of a particular Canadian dialect, raising concern that AI diagnostic tools could, in the long term, have fatal consequences for non-white persons.


The advent of ICTs has increased the potential for both the inclusion and the exclusion of vulnerable groups. Artificial Intelligence can be leveraged for the benefit of vulnerable groups, but under three conditions. First, that primary importance be given to bridging the digital divide, so that ICT for all becomes the baseline for involving vulnerable groups in the ICT space. Secondly, that vulnerability be addressed by curbing mechanisms designed to oppress certain groups or exclude them from full social, economic and political participation. Lastly, that in the use, design and deployment of Artificial Intelligence, vulnerable groups be placed at the forefront, gaps in data be addressed, and adequate problem-solving be applied to ensure that AI does not further existing inequalities.

1 Verity A, Wright J, ‘Artificial Intelligence principles for vulnerable populations in humanitarian contexts’, accessed on 26 September 2020.

2 Frankish K Ramsey W, ‘The Cambridge Handbook of Artificial Intelligence’, Cambridge University Press, United Kingdom, 2014, 12.

3 Wilson, III. E.J. (2004). The Information Revolution and Developing Countries. Cambridge, MA: The MIT Press.

4 Chang BL, Bakken S, Brown SS, et al. Bridging the digital divide: reaching vulnerable populations. J Am Med Inform Assoc. 2004, 11(6), 448.

5 Chang BL, Bakken S, Brown SS, et al. Bridging the digital divide: reaching vulnerable populations. J Am Med Inform Assoc. 2004, 11(6), 450.

6 It’s important to note that the digital divide is also accepted as a multidimensional phenomenon which includes the global digital divide, the social divide, and the democratic divide. See Norris, P., 2001. Digital Divide. Civic Engagement, Information Poverty, and the Internet Worldwide. Cambridge University Press, New York.

7 Ippolito F and Sanchez S, Protecting Vulnerable Groups: The European Human Rights Framework, Hart Publishing, Oxford and Portland, Oregon, 2014, 37.

8 Ippolito F and Sanchez S, Protecting Vulnerable Groups: The European Human Rights Framework, Hart Publishing, Oxford and Portland, Oregon, 2014, 37.

9 Warschauer M, Demystifying the Digital Divide, Scientific American, 289(2), 2003, 44.

10 Article 20, Constitution of Kenya (2010).

11 Fineman M, ‘The vulnerable subject: anchoring equality in the human condition’, Yale Journal of Law and Feminism, 20(1), 2008, 2.

12 For instance the Brasilia Regulations Regarding Access to Justice for Vulnerable People defines ‘vulnerable people’ as ‘those who, due to reasons of age, gender, physical or mental state, or due to social, economic, ethnic and/or cultural circumstances, find it especially difficult to fully exercise their rights before the justice system as recognized to them by law’.

13 Larkin M, Vulnerable Groups in Health and Social Care, Sage Publications, 3.

14 Verity A, Wright J, ‘Artificial Intelligence principles for vulnerable populations in humanitarian contexts’, accessed on 26 September 2020.

15 Sarkar A, Pick JB and Johnson J, ‘Africa’s digital divide: Geography, policy, and implications’, 2015 Regional Conference of the International Telecommunications Society (ITS): “The Intelligent World: Realizing Hopes, Overcoming Challenges”, Los Angeles, 25-28 October 2015, International Telecommunications Society (ITS), Calgary.

16 Fuchs C and Korak E, ‘Africa and the Digital Divide’, ICT&S Center for Advanced Studies and Research in Information and Communication Technologies and Society, 2006, 100-103.

17 Pollitzer E, ‘Creating a better future: four scenarios for how digital technologies could change the world’, Journal of International Affairs, 72(1), 77.

18 Bhandari, V. (2020). Improving internet connectivity during Covid-19. Digital Pathways at Oxford Paper Series; no. 4. Oxford, United Kingdom

20 Nemer, David. (2015). From Digital Divide to Digital Inclusion and Beyond. The Journal of Community Informatics. 11.

21 Wilson, III. E.J. (2004). The Information Revolution and Developing Countries. Cambridge, MA: The MIT Press.

22 Kenny C, Automation and AI Implications for African Development Prospects?, Center for Global Development, 2019, 2.

23 O’Neil C, Weapons of Math Destruction, Crown Publishing, New York, 2016, 9-20.

24 O’Neil C, Weapons of Math Destruction, Crown Publishing, New York, 2016, 9-20.
