Abstract
Artificial Intelligence (AI) is supporting decisions in ways that increasingly affect humans in many aspects of their lives. Both autonomous and decision-support systems applying AI algorithms and data-driven models are used for decisions about justice, education, and physical and psychological health, and to provide or deny access to credit, healthcare, and other essential resources, in all aspects of daily life, in increasingly ubiquitous and sometimes ambiguous ways. Too often, systems are built without considering human factors associated with their use, such as gender bias. The need for clarity about the correct way to employ such systems is an increasingly critical aspect of design, implementation, and presentation. Models and systems often produce results that are difficult to interpret, and they are praised or blamed as good or bad, whereas what is actually good or bad is the design of such tools and the training needed to integrate them with human values. This chapter aims to discuss the most evident issues about gender bias in AI and to explore possible solutions for the impact of AI and decision-support algorithms on humans, with a focus on how to integrate gender-balance principles into data sets, AI agents, and scientific research in general.
Notes
- 1.
At the moment this page is written, the world is facing a critical step backwards in gender equality, with Afghanistan under renewed Taliban rule. Clear data about this part of the world are still not available. ISIS is bombing Kabul's airport to force people to remain under Sharia rule, under which women are denied what other countries consider human rights. Students and researchers accepted at foreign universities cannot exit the country. Women are abandoning their children into the arms of the US army and European ambassadors, hoping for them to be transported out of Afghanistan.
- 2.
Note of the author: It is such a difficult situation to live in that I had trouble sleeping while writing these paragraphs; the situation is worrying, with a dark past and future, and it can be emotionally challenging for any woman in the field even to speak about it. Many studies and surveys are circulating to track such issues, all of them anonymous, with the aim of protecting freedom of speech.
Copyright information
© 2023 Springer Nature Switzerland AG
About this chapter
Cite this chapter
Franzoni, V. (2023). Gender Differences and Bias in Artificial Intelligence. In: Vallverdú, J. (eds) Gender in AI and Robotics. Intelligent Systems Reference Library, vol 235. Springer, Cham. https://doi.org/10.1007/978-3-031-21606-0_2
DOI: https://doi.org/10.1007/978-3-031-21606-0_2
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-21605-3
Online ISBN: 978-3-031-21606-0
eBook Packages: Intelligent Technologies and Robotics (R0)