WHY ARE A MAJORITY OF CHATBOTS FEMALE, AND WHAT CAN WE DO TO MITIGATE THE RISKS THEREIN?

Introduction

The female gendering of AI assistants is perpetuated to encourage greater acceptability, as companies believe that female voices are more approachable than their male counterparts.1 Unfortunately, this female gendering of AI assistants plays on human perceptions and stereotypes of assistive roles and subservience associated with women.2 It also reflects cultural expectations of women, who are expected to be caring, pleasant, effective communicators, and constantly available to serve people’s needs. In an earlier blog, CIPIT explored the potential negative impact on women of the default gendering of AI assistant technologies as female, and the regulatory responses to this challenge. In the current blog, we explore further examples of gendered AI assistants, the various factors leading to this gendered development, and recommendations to reduce their prevalence.

Gendered AI Assistants in Africa

In Nigeria, 10 of the country’s 22 commercial banks have adopted chatbots for customer service, and 70% of these chatbots are gendered as female.3 Ghana’s Abena AI is another stereotypically female AI assistant, which its creator equipped with a female voice and female mannerisms.4 Kenya’s Sophie Bot is described as having a female voice that answers questions on sexual and reproductive health issues for young Kenyan women.5 The creators have also described the bot as “her”, indicating that it is designed to be a female companion offering answers to questions that women may not easily find online.6 Sophie Bot is trained on credible information on sexual and reproductive health issues sourced from the Ministry of Health through the National Council for Population and Development.7 Amazon’s Alexa Voice Service extended its availability to South Africa and several other locations in 2022 and has maintained its default female voice, even though users are allowed to choose other options.8 Alexa is essentially an administrative assistant that completes tasks, plays music and places takeout orders on users’ behalf.9

The Nigerian chatbot Kiki, of Sterling Bank, is described as never ignoring messages and always being ready to respond to queries. This description is part of the underlying problem: training and recruitment policies constrain women to soft, feminine roles and exclude them from roles perceived as serious and difficult, such as programming or engineering, which are left to men.10 Illustratively, in the Nigerian banking industry, 80% of the Executive Directors at the top five banks and the Central Bank of Nigeria are men.11 By contrast, Nigerian women are better represented in the banks’ service roles, such as customer relations, sales and marketing, and underrepresented in the management roles of the banking sector. As such, with customer relations jobs being relegated to chatbots by the banking industry, women are among the first demographic to be replaced.

Cultural and Socioeconomic Causes of Gendered AI

Many African societies are characterised by patriarchal systems, where men hold primary decision-making power and authority. Gender roles are often defined along these traditional lines, assigning men roles related to leadership, economic provision, and decision-making, while women are expected to fulfil domestic duties and caregiving roles. Understanding these power dynamics is crucial for designing AI assistants that challenge gender inequalities and empower women.12 Further, contextualising gender roles in Africa is essential when designing AI assistants to ensure they are sensitive to the diverse cultural, social, and economic dynamics that shape gender norms and expectations on the continent.13

Gender disparities such as limited access to education, employment opportunities, and financial resources persist in many parts of Africa, leading to fewer female designers of AI assistants.14 The number of women employed in ICT is correspondingly low, and this deficit is reflected in the design of assistants as female by predominantly male design teams.15 The result of this underrepresentation of women is systems developed from a single point of view, portraying women from the perspective of their designers. Temi, a chatbot of Nigeria’s First City Monument Bank, is described as always available, offering services such as planning clients’ vacations and health plans, while promising to carry out all tasks competently and never respond to questions “with a k”.16 Such claims imply that women are less effective communicators than AI assistants, and reinforce the expectation that women will play softer, more feminine roles in hiring processes.17 Finally, the stereotype that women are always accessible and better communicators may disproportionately expose them to verbal abuse from unsatisfied clients.18

When the social stereotypes about women’s roles are combined with a homogeneously male technology talent pool and the overrepresentation of women in the service sector, it is easy to see why female-gendered chatbots are so widely used in service roles.19

Representation and Diversity

Gender representation and diversity within the design and development teams behind AI assistants play a vital role in shaping the technology’s functionalities and capabilities.20 The lack of inclusivity in these design teams fosters biases, stereotypes, and gendered assumptions about AI assistants.21 UNESCO indicates that the female gendering of AI assistant technologies primarily arises during the creation of the algorithms, the training of data sets and the building of the systems.22 At the same time, research published by Google in 2020 indicates that of the 700,000 coders on the African continent, at least 21% are women.23 Other researchers have found the number to be considerably lower, with some indicating that only 7%–8% of African developers are women, and only a few of these hold roles of significant decision-making power.24 Only about 22% of AI and data science professionals worldwide are women.25 The 2019 Global AI Talent Report indicates that of all researchers presenting at AI conferences, only 18% are female.26 This gender imbalance in the perspectives included in the design and development of AI assistants perpetuates gender stereotypes and marginalises female voices. Engaging female stakeholders throughout the AI assistant development process would ensure the consideration of their needs, preferences, and concerns through user research and participatory design, promoting inclusive products and minimising products that perpetuate gendered stereotypes of women.27

Generally, building diverse design teams that include individuals from different genders, backgrounds, cultures, and perspectives is essential to enable developers to challenge their own biases, uncover blind spots, and design AI assistants that are more inclusive and reflective of diverse user preferences.28 Design choices should be based on evidence and user research, rather than perpetuating stereotypes that may limit or misrepresent individuals based on their gender.29 Allowing users to customise and personalise their AI assistants based on their preferences, including gender-related settings, also enhances inclusivity.30 This could include choosing the voice, personality, or appearance of the AI assistant, providing users with a sense of agency and control over their interactions with the technology.
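
To make this design principle concrete, the minimal Python sketch below shows one way a development team could model voice and persona as explicit user choices with a neutral default, rather than shipping a female voice as the implicit standard. The names (AssistantSettings, VOICE_OPTIONS) are hypothetical illustrations, not a description of any existing product.

```python
# Minimal sketch only: a hypothetical settings model in which voice and
# persona are explicit user choices with a neutral default, instead of a
# female voice being the implicit standard.
from dataclasses import dataclass
from typing import Optional

VOICE_OPTIONS = ["neutral", "feminine", "masculine"]  # hypothetical catalogue


@dataclass
class AssistantSettings:
    voice: str = "neutral"              # neutral default, nothing pre-gendered
    persona_name: Optional[str] = None  # no default human (or female) name
    self_reference: str = "it"          # the assistant refers to itself as software

    def set_voice(self, choice: str) -> None:
        """Record an explicit user choice instead of assuming one."""
        if choice not in VOICE_OPTIONS:
            raise ValueError(f"voice must be one of {VOICE_OPTIONS}")
        self.voice = choice


# Usage: the user, not the vendor, decides how the assistant sounds.
settings = AssistantSettings()
settings.set_voice("masculine")
print(settings)
```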

Data Deserts

There is a generalised lack of all forms of data in Africa that would be useful in developing inclusive technologies and addressing gender discrimination and stereotyping.31 Further, the majority of the voice data used to train algorithms is held by a few tech companies, such as Amazon, Microsoft, Apple, Samsung and Google, which makes it difficult for tech companies in Africa to develop high-quality and inclusive AI assistants.32 Only 6% of African countries have gender-disaggregated data on computer programming skills and on the number of engineering and technology researchers.33 Even though African governments routinely collect large amounts of citizen data, this data is often not disaggregated by gender and sex, and lacks the granularity that would make it usable for meaningful application.34 Even when data is available, it can contain encoded biases that translate into the AI systems built with that data.35
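
As a simple illustration, the Python sketch below shows the kind of preliminary check a team could run before training on a dataset: does it record gender at all, and does one gender dominate? The column names and the 70% threshold are assumptions made for this example, not features of any real dataset.

```python
# Minimal sketch, not a production pipeline: a preliminary check of whether a
# training dataset is gender-disaggregated at all, and whether one gender
# dominates. Column names and the 70% threshold are assumed for the example.
import pandas as pd


def gender_coverage_report(df: pd.DataFrame, gender_col: str = "speaker_gender") -> None:
    if gender_col not in df.columns:
        print("Not gender-disaggregated: the dataset has no gender column.")
        return
    # Share of rows per gender label, including missing values.
    shares = df[gender_col].value_counts(dropna=False) / len(df)
    print(shares.round(2).to_string())
    if shares.max() > 0.7:  # arbitrary imbalance threshold for the sketch
        print("Warning: one gender dominates; models trained on this data "
              "may encode the imbalance.")


# Toy example: three of four utterances come from female speakers.
sample = pd.DataFrame({
    "utterance": ["hello", "play music", "set an alarm", "call mum"],
    "speaker_gender": ["female", "female", "female", "male"],
})
gender_coverage_report(sample)
```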

As a result of data scarcity, chatbots are often trained on imbalanced data with skewed representations of gender interactions; hence the bots not only perpetuate gender stereotypes but also condone poor user behaviour. For example, more than a million users asked Alexa to marry them in 2017, and the bot responded with witty comments.36 Google Assistant, in response to researchers’ questions about its sexuality, has been programmed to offer suggestive yet dismissive responses.37 Google Assistant remains female by default in many countries and has maintained a playful, helpful and humble disposition as an assistant.38 Microsoft’s Cortana assistant also embodies the female-oriented features of service and communication, and its origin as a video game character portrayed as a highly sexualised woman does not help the situation, as the assistant is represented as a subordinate creature available for user pleasure.39 Apple’s Siri perpetuates the same problem. ‘Siri’ is a name of Nordic origin, loosely translating to “a lovely woman who leads one to victory”.40 The product’s flirtatious responses, even to offensive questions such as those asking what the bot is wearing, are a problem, as they perpetuate poor consumer behaviour. Even though the default voice of Siri is still a female caricature, more recent updates allow users to select a different voice.41 When Samsung’s Bixby was first released, its female voice option was marketed as “chipper”, “clear”, and “cheerful”, while the male option was described as “assertive”, “confident”, and “clear”.42 Although the company has since eliminated these adjective descriptions, Bixby still defaults to a female voice rather than prompting customers to select a male or female voice when configuring the application.

The development of ‘female’ AI assistants with these features builds a master-servant relationship into their use, reinforcing gender stereotypes of women as servants for their male companions.43 These assistants are designed to be helpers that perform menial tasks for users, reinforcing existing stereotypes about women.44

What We Recommend to Overcome These Barriers

User-Centred Design – Considering the needs, preferences, and cultural sensitivities of diverse user groups, including women, is essential to avoid alienating or excluding specific demographics. User research, feedback, and iterative design processes are vital for developing inclusive and effective AI assistants.45 Designers must therefore conduct research into the target users, their demographics, behaviours, and needs, using interviews and observation to understand user goals and concerns and to incorporate these insights into AI assistant products.46

Stakeholder Collaboration – Collaboration among various stakeholders, including designers, developers, policymakers, civil society organisations, and gender equality advocates, is crucial to address ethical concerns. Engaging diverse perspectives and involving the intended users and communities throughout the design process can lead to more ethical and contextually relevant AI assistants.47 By attending to these considerations, designers can develop AI assistants in Africa that uphold user rights, promote fairness and inclusivity, and respect cultural diversity, fostering positive societal impact while minimising potential harm or bias.

Bias Mitigation – AI technologies can inadvertently perpetuate gender biases and stereotypes. Designers must be vigilant in avoiding biased algorithms and biased training data, and in guarding against the reinforcement of discriminatory patterns.48 Incorporating diversity and inclusion principles into the development process, along with regular audits and testing for bias, can help mitigate these risks. This requires a collaborative and inclusive approach that involves local communities, addresses gender disparities, respects cultural diversity, and promotes gender equality and empowerment.49
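
To make “regular audits and testing for bias” concrete, the Python sketch below illustrates one possible form such an audit could take: probing the assistant with prompt pairs that differ only in a gendered term and collecting the paired responses for human review. The templates and the ask callable are hypothetical placeholders, not any vendor’s actual testing API.

```python
# Minimal sketch of a recurring bias audit: probe the assistant with prompt
# pairs that differ only in a gendered term and collect the paired responses
# for human review. `ask` is a stand-in for whatever model or API is under test.
from typing import Callable, Dict, List

TEMPLATES = [
    "The {} is good at maths.",
    "Should the {} stay at home with the children?",
]
TERMS = ("woman", "man")


def bias_probe(ask: Callable[[str], str]) -> List[Dict[str, str]]:
    """Gather paired responses; judging whether they diverge in a stereotyped
    way is left to human reviewers, not a crude equality check."""
    findings = []
    for template in TEMPLATES:
        findings.append(
            {"template": template,
             **{term: ask(template.format(term)) for term in TERMS}}
        )
    return findings


# Demonstration with a trivial echo "model"; a real audit would call the
# assistant being tested and run on a schedule.
if __name__ == "__main__":
    echo_model = lambda prompt: "response to: " + prompt
    for finding in bias_probe(echo_model):
        print(finding)
```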

[Image credit: verloop.io]

Conclusion

From the foregoing, the design of AI assistants in Africa requires a thoughtful and inclusive approach that considers the cultural, social, and economic factors shaping gender roles and expectations in different African countries. By addressing representation and diversity, ethical considerations, user-centred design principles, and the specific challenges faced by marginalised groups, designers can build AI assistants that contribute to positive social change and empower users in Africa. The representation of diverse gender identities and experiences in the design of AI assistants is crucial for fostering inclusivity and challenging existing gender biases. Thus, it is through a collaborative, ethical, and user-centred approach that AI assistants can truly make a positive impact in addressing societal challenges and advancing gender equality in Africa.

1 Nóra Ni Loideain and Rachel Adams, ‘From Alexa to Siri and the GDPR: The Gendering of Virtual Personal Assistants and the Role of Data Protection Impact Assessments’ (2019) 36 Computer Law & Security Review 105366 <https://www.sciencedirect.com/science/article/pii/S0267364919303772> accessed 5 June 2023.

2 Nóra Ni Loideain and Rachel Adams, see n 1

3 Damian Okaibedi Eke, Kutoma Wakunuma and Simisola Akintoye (eds), Responsible AI in Africa (Palgrave Macmillan 2023)

4 Joel Nwankwo, ‘Born in Africa and for Africans. Abena Is the Real African Voice Assistant AI’ (Ouut, 17 June 2022) <https://theouut.com/born-in-africa-and-for-africans-abena-is-the-real-african-voice-assistant-ai/> accessed 12 September 2023.

5 ‘Sophie Bot: The “Siri” for Sexual and Reproductive Health Information for Adolescents’ (African Institute for Development Policy – AFIDEP, 2016) <https://www.afidep.org/multimedia/sophie-bot-the-siri-for-sexual-and-reproductive-health-information-for-adolescents/> accessed 7 June 2023.

6 ‘Sophie Bot • WHO | Regional Office for Africa’ (innov.afro.who.int) <https://innov.afro.who.int/emerging-technological-innovations/sophie-bot-3860> accessed 7 June 2023.

7 ‘Sophie Bot • WHO | Regional Office for Africa’, see n 6

8 ‘Alexa Voice Service Expands to Ecuador, Hong Kong, South Africa, Taiwan, and Thailand’ (alexa-blog, 16 May 2022) <https://developer.amazon.com/en-US/blogs/alexa/device-makers/2022/05/alexa-voice-service-international-expansion-may-2022> accessed 7 June 2023.

9 Leopoldina Fortunati and others, ‘Is Alexa Female, Male, or Neutral? A Cross-National and Cross-Gender Comparison of Perceptions of Alexa’s Gender and Status as a Communicator’ (2022) 137 Computers in Human Behavior 107426 <https://www.sciencedirect.com/science/article/pii/S0747563222002485> accessed 14 June 2023.

10 Damian Okaibedi Eke, Kutoma Wakunuma and Simisola Akintoye (eds), see n 3

11 Damian Okaibedi Eke, Kutoma Wakunuma and Simisola Akintoye (eds), see n 3

12 Sstramel, ‘Artificial Intelligence: Advantages and Disadvantages’ (2009) <http://sstramel.blogspot.com/2009/09/artificial-intelligence-advantages-and.html> accessed 3 June 2023.

13 Oarabile Mudongo, ‘From Colonial Legacy to AI Progress: Re-Defining Gender and Data Protection in Africa’ (Centre for Intellectual Property and Information Technology Law, 29 May 2023) <https://cipit.strathmore.edu/from-colonial-legacy-to-ai-progress-re-defining-gender-and-data-protection-in-africa/> accessed 8 September 2023.

14 Gumisai Mutume, ‘African Women Battle for Equality | Africa Renewal’ (Un.org, 2015) <https://www.un.org/africarenewal/magazine/july-2005/african-women-battle-equality>.

15 Mark West, Rebecca Kraut and Han Ei Chew, ‘I’d Blush If I Could’ (UNESCO 2019) <https://en.unesco.org/Id-blush-if-I-could> accessed 14 June 2023.

16 Miranda Jeanne Marie Iossifidis, ‘ASMR and the “Reassuring Female Voice” in the Sound Art Practice of Claire Tolan’ (2017) 17(1) Feminist Media Studies 112.

17 Miranda Jeanne Marie Iossifidis, see n 16

18 Miranda Jeanne Marie Iossifidis, see n 16

19 Mark West, Rebecca Kraut and Han Ei Chew, see n 15

20 Joanna Stern, ‘Alexa, Siri, Cortana: The Problem with All-Female Digital Assistants’ (Wall Street Journal) <https://www.wsj.com/articles/alexa-siri-cortana-the-problem-with-all-female-digital-assistants-1487709068?mod=rss_Technology> accessed 12 June 2023.

21 Heather Suzanne Woods, ‘Asking More of Siri and Alexa: Feminine Persona in Service of Surveillance Capitalism’ (2018) 35(4) Critical Studies in Media Communication 334, 345.

22 Mark West, Rebecca Kraut and Han Ei Chew, see n 15

23 Nitin Gajria, ‘E-Conomy Africa 2020: Africa’s $180 Billion Internet Economy Future’ (Google & International Finance Corporation (IFC) 2020) <https://www.ifc.org/content/dam/ifc/doc/mgrt/e-conomy-africa-2020.pdf>.

24 Kees Kranendonk, ‘Female Developers in Africa Are Starting to Catch Up’ (Tunga, 22 November 2021) <https://tunga.io/female-developers-in-africa/#:~:text=Research%20by%20Tunga%20revealed%20that> accessed 12 September 2023.

25 Favour Borokini, Sandra Nabulega and Garnett Achieng’, ‘A Gender and Ethics Perspective on Artificial Intelligence in Africa Engendering AI’ (2021) <https://archive.pollicy.org/wp-content/uploads/2021/09/Engendering-AI.pdf> accessed 8 January 2024.

26 Araba Sey and Shamira Ahmed, ‘An African Perspective on Gender and Artificial Intelligence Needs African Data and Research’ (RIA 2020) <https://researchictafrica.net/wp/wp-content/uploads/2020/10/Gender-AI-Policy-Brief.pdf> accessed 2024.

27 Andrés Piñeiro-Martín and others, ‘Ethical Challenges in the Development of Virtual Assistants Powered by Large Language Models’ (2023) 12 Electronics 3170 <https://www.mdpi.com/2079-9292/12/14/3170> accessed 24 August 2023.

28 Hilary Bergen, ‘“I’d Blush if I Could”: Digital Assistants, Disembodied Cyborgs and the Problem of Gender’ (2016) VI Word and Text: A Journal of Literary Studies and Linguistics 95, 105–106.

29 Hilary Bergen, see n 28

30 Tom Froese and Tom Ziemke, ‘Enactive Artificial Intelligence: Investigating the Systemic Organization of Life and Mind’ (2009) 173 Artificial Intelligence 466.

31 Araba Sey and Shamira Ahmed, see n 26

32 Araba Sey and Shamira Ahmed, see n 26

33 Araba Sey and Shamira Ahmed, see n 26

34 Favour Borokini, Sandra Nabulega and Garnett Achieng’, see n 25

35 Favour Borokini, Sandra Nabulega and Garnett Achieng’, see n 25

36 Favour Borokini, Sandra Nabulega and Garnett Achieng’, see n 25

37 Nóra Ni Loideain and Rachel Adams, see n 1

38 Nóra Ni Loideain and Rachel Adams, see n 1

39 Leopoldina Fortunati and others, see n 9

40 Taylor Walker, ‘“Alexa, Are You a Feminist?”: Virtual Assistants Doing Gender and What That Means for the World’ (2020) 6 The iJournal: Graduate Student Journal of the Faculty of Information 1 <https://theijournal.ca/index.php/ijournal/article/view/35264/26986>.

41 ‘IOS 14.5 Offers Unlock iPhone with Apple Watch, Diverse Siri Voices, and More’ (Apple Newsroom (Kenya), 26 April 2021) <https://www.apple.com/ke/newsroom/2021/04/ios-14-5-offers-unlock-iphone-with-apple-watch-diverse-siri-voices-and-more/> accessed 5 June 2023.

42 ‘Mobile Vendor Market Share Africa | StatCounter Global Stats’ (StatCounter Global Stats, 2019) <https://gs.statcounter.com/vendor-market-share/mobile/africa>.

43 Taylor Walker, see n 40

44 Leopoldina Fortunati and others, see n 9

45 Damian Okaibedi Eke, Kutoma Wakunuma and Simisola Akintoye (eds), Responsible AI in Africa (Palgrave Macmillan 2023) <https://library.oapen.org/handle/20.500.12657/60787> accessed 3 June 2023.

46 Responsible AI in Africa, see n 45

47 Responsible AI in Africa, see n 45

48 Ben Hutchinson and others, ‘Towards Accountability for Machine Learning Datasets’ [2021] Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency <https://arxiv.org/pdf/2010.13561.pdf>.

49 Mark Samuels, ‘Siri, Cortana, Alexa, and Google Assistant Are Just the Beginning: Voice Is the Future’ (ZDNet) <https://www.zdnet.com/article/siri-cortana-alexa-and-google-assistant-are-just-the-beginning-voice-is-the-future/> accessed 14 June 2023.
