Diversity and Inclusion in Artificial Intelligence

Law and Artificial Intelligence

Part of the book series: Information Technology and Law Series (ITLS, volume 35)

Abstract

Discrimination and bias are inherent problems of many AI applications, as seen, for instance, in face recognition systems that do not recognize dark-skinned women and content moderation tools that silence drag queens online. These outcomes may derive from limited datasets that do not fully represent society as a whole or from the Western, male-dominated configuration of the AI scientific community. Although it is a pressing issue, understanding how AI systems can replicate and amplify inequalities and injustice among underrepresented communities is still in its infancy in the social science and technical communities. This chapter contributes to filling this gap by exploring the research question: what do diversity and inclusion mean in the context of AI? It reviews the literature on diversity and inclusion in AI to unearth the underpinnings of the topic and identify key concepts, research gaps, and evidence sources to inform practice and policymaking in this area. Attention is directed to three levels of the AI development process: the technical, the community, and the target-user level. The latter is expanded upon with concrete examples of communities usually overlooked in the development of AI, such as women, the LGBTQ+ community, senior citizens, and disabled persons. Sex and gender diversity considerations emerge as the most at risk in AI applications and practices and are therefore the focus here. To help mitigate the risks that missing sex and gender considerations in AI could pose for society, the chapter closes by proposing gendering algorithms, more diverse design teams, and more inclusive and explicit guiding policies. Overall, this chapter argues that by integrating diversity and inclusion considerations, AI systems can be created to be more attuned to all-inclusive societal needs, respect fundamental rights, and represent contemporary values in modern societies.


Notes

  1. Schönberger 2019; Wisskirchen et al. 2017; Righetti et al. 2019.
  2. Noble 2018; Raji and Buolamwini 2019; Fosch-Villaronga et al. 2021.
  3. Raji and Buolamwini 2019; Gomes et al. 2019.
  4. Zhao et al. 2017.
  5. Roopaei et al. 2021.
  6. Willson 2017; Noble 2018; Ito 2019.
  7. Liu 2021; Giger et al. 2019.
  8. Danielescu 2020.
  9. West et al. 2019; Rahman and Billionniere 2021.
  10. Cirillo et al. 2020.
  11. Park and Woo 2019.
  12. Richardson 2016.
  13. Schiebinger 2014; Tannenbaum et al. 2019.
  14. Ekmekçioğlu et al. 2021.
  15. Groom 2021.
  16. Freire et al. 2020.
  17. See Article 5 of the Convention on the Elimination of All Forms of Discrimination against Women and Article 8(1)(b) of the Convention on the Rights of Persons with Disabilities.
  18. Cirillo et al. 2020.
  19. See Lexico’s definition at https://www.lexico.com/definition/diversity.
  20. Mitchell et al. 2020.
  21. O’Neil 2016; Caliskan et al. 2017.
  22. Fosch-Villaronga et al. 2021.
  23. See https://www.legislation.govt.nz/act/public/1995/0016/latest/DLM359369.html.
  24. See https://www.acon.org.au/wp-content/uploads/2019/07/TGD_Language-Guide.pdf.
  25. See https://www.glaad.org/reference/transgender.
  26. Ibidem.
  27. Fosch-Villaronga et al. 2021.
  28. Custers 2013.
  29. O’Neil 2016; Hamidi et al. 2018.
  30. See https://www.ohchr.org/en/issues/women/wrgs/pages/genderstereotypes.aspx.
  31. Sink et al. 2018.
  32. Burdge 2007; Howansky et al. 2021.
  33. Freire et al. 2020.
  34. See https://www.journals.elsevier.com/artificial-intelligence/editorial-board.
  35. Cech and Waidzunas 2021; Tao 2018.
  36. Whisnant 2012.
  37. Gibney 2019.
  38. Rathenau Institute 2021.
  39. Schiebinger 2014.
  40. Poulsen et al. 2020.
  41. Nature Editorial 2018.
  42. Haraway 2006; Bray 2007; Wajcman 2007.
  43. Faulkner 2001.
  44. O'Riordan and Phillips 2007.
  45. Oudshoorn and Pinch 2003.
  46. Page et al. 2009.
  47. Vida 2021.
  48. Oudshoorn et al. 2004.
  49. MoMa 2021.
  50. Moscoso-Porras et al. 2019.
  51. Whittaker et al. 2019.
  52. Bragg et al. 2019.
  53. Goggin and Newell 2003.
  54. United Nations 1993.
  55. Temmerman et al. 2014.
  56. Maxwell et al. 2006; Roussel 2013.
  57. Fosch-Villaronga and Poulsen 2021.
  58. Cech and Waidzunas 2021.
  59. Gomes et al. 2019.
  60. Strengers and Kennedy 2020.
  61. See https://www.gatebox.ai/en/hikari.
  62. Liu 2021.
  63. See https://www.gatebox.ai/en/hikari.
  64. Liu 2021.
  65. Giger et al. 2019.
  66. See https://clubfirst.org/product/sona-2-5-covid-19-robot/.
  67. See http://www.xiaoice.com/.
  68. See https://www.apple.com/au/siri/.
  69. See https://assistant.google.com/.
  70. Liu 2021.
  71. Strengers and Kennedy 2020.
  72. See India Times (2020) Covid-19: Jaipur Hospital Turns To Robots To Take Care Of Coronavirus Patients, https://navbharattimes.indiatimes.com/video/news/covid-19-jaipur-hospital-turns-to-robots-to-take-care-of-coronavirus-patients/videoshow/74818092.cms.
  73. Fosch-Villaronga et al. 2021.
  74. Park and Woo 2019.
  75. Park and Woo 2019.
  76. Nosek et al. 2002a; Caliskan et al. 2017; Nosek et al. 2002b.
  77. Zhao et al. 2017.
  78. Zhao et al. 2017.
  79. Sun et al. 2019; Zhou et al. 2019.
  80. Campa et al. 2019.
  81. Dupré et al. 2020; Fosch-Villaronga et al. 2021.
  82. Buolamwini and Gebru 2018; Font and Costa-jussà 2019; McDuff et al. 2019; Torralba and Efros 2011.
  83. Hamidi et al. 2018.
  84. Keyes 2018; Fosch-Villaronga et al. 2021.
  85. Fosch-Villaronga et al. 2021.
  86. Fosch-Villaronga 2019a, b.
  87. Hamidi et al. 2018; Büchi et al. 2020; Nišević et al. 2021.
  88. Yu et al. 2018; Ahuja 2019.
  89. Cirillo et al. 2020.
  90. Topol 2019.
  91. Esteva et al. 2017.
  92. Wapner 2018.
  93. Topol 2019.
  94. Fosch-Villaronga et al. 2021.
  95. Kamiran et al. 2013; Cirillo et al. 2020.
  96. Fosch-Villaronga and Poulsen 2021.
  97. Fosch-Villaronga and Poulsen 2021.
  98. Jecker 2020.
  99. Richardson 2016.
  100. Scheutz and Arnold 2016.
  101. Scheutz and Arnold 2016.
  102. Fosch-Villaronga and Poulsen 2021.
  103. Fosch-Villaronga and Poulsen 2021.
  104. Behrendt 2018.
  105. Zara et al. 2021.
  106. Cirillo et al. 2020.
  107. McGregor et al. 2016.
  108. Cirillo et al. 2020.
  109. Jenkins et al. 2016.
  110. Keyes 2018.
  111. Kamiran et al. 2013.
  112. Geyik et al. 2019.
  113. Sommers 2006; Rock and Grant 2016.
  114. Díaz-García et al. 2013.
  115. Phillips et al. 2009.
  116. Friedman and Hendry 2019; Friedman et al. 2006.
  117. Poulsen et al. 2020.
  118. CSIRO 2019.
  119. Queer in AI 2019.
  120. Jenkins et al. 2016.
  121. Keyes 2018.
  122. Friedman and Hendry 2019; Friedman et al. 2006.
  123. Fosch-Villaronga and Özcan 2020.
  124. Carr 2011.
  125. Jobin et al. 2019.
  126. Martinetti et al. 2021. See Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data; Regulation (EU) 2017/745 of the European Parliament and of the Council of 5 April 2017 on medical devices; and Directive 2006/42/EC of the European Parliament and of the Council of 17 May 2006 on machinery.
  127. AI Act 2021.
  128. Stahl and Coeckelbergh 2016.
  129. European Commission 2012.
  130. Raji and Buolamwini 2019; Gomes et al. 2019.
  131. Zhao et al. 2017.
  132. Roopaei et al. 2021.
  133. Mitchell et al. 2020.
  134. Raji and Buolamwini 2019.
  135. Willson 2017; Noble 2018; Ito 2019.
  136. Some initiatives have started to explore these topics in the Netherlands. See, for instance, the ‘Gendering Algorithms’ initiative started at Leiden University (https://www.genderingalgorithms.org/) or the ‘Diversity and Inclusion for Embodied AI’ initiative started by the 4TU Federation and Leiden University (https://www.dei4eai.com/).
  137. Mitchell et al. 2020.

References

  • Addlakha R et al (2017) Disability and sexuality: Claiming sexual and reproductive rights. Reproductive Health Matters https://doi.org/10.1080/09688080.2017.1336375
  • Ahuja A S (2019) The impact of artificial intelligence in medicine on the future role of the physician. PeerJ 7:e7702
  • Behrendt M (2018) Reflections on moral challenges posed by a therapeutic childlike sexbot. In: Cheok A, Levy D (eds) LSR 2017: Love and Sex with Robots. Springer, Cham, pp 96–113
  • Bragg D et al (2019) Sign language recognition, generation, and translation: An interdisciplinary perspective. In: Proceedings of the 21st International ACM SIGACCESS Conference on Computers and Accessibility. ACM, New York, pp 16–31
  • Bray F (2007) Gender and technology. Annu. Rev. Anthropol. https://doi.org/10.1146/annurev.anthro.36.081406.094328
  • Büchi M, Fosch-Villaronga E, Lutz C, Tamò-Larrieux A, Velidi S, Viljoen S (2020) The chilling effects of algorithmic profiling: Mapping the issues. Computer Law & Security Review 36:105367
  • Buolamwini J, Gebru T (2018) Gender shades: Intersectional accuracy disparities in commercial gender classification. In: Proceedings of the First Conference on Fairness, Accountability and Transparency. PMLR, pp 77–91
  • Burdge B J (2007) Bending gender, ending gender: Theoretical foundations for social work practice with the transgender community. Social Work 52:243–250
  • Caliskan A et al (2017) Semantics derived automatically from language corpora contain humanlike biases. Science https://doi.org/10.1126/science.aal4230
  • Campa S et al (2019) Deep & machine learning approaches to analyzing gender representations in journalism. https://web.stanford.edu/class/archive/cs/cs224n/cs224n.1194/reports/custom/15787612.pdf
  • Carr N (2011) The Shallows: What the Internet is doing to our brains
  • Cech E A, Waidzunas T J (2021) Systemic inequalities for LGBTQ professionals in STEM. Science Advances https://doi.org/10.1126/sciadv.abe0933
  • Cirillo D et al (2020) Sex and gender differences and biases in artificial intelligence for biomedicine and healthcare. NPJ Digital Medicine https://doi.org/10.1038/s41746-020-0288-5
  • Commonwealth Scientific and Industrial Research Organisation (CSIRO) (2019) Diversity & inclusion at the robotics and autonomous systems group. https://research.csiro.au/robotics/diversity-inclusion-at-the-robotics-and-autonomous-systems-group/
  • Custers B (2013) Data dilemmas in the information society: Introduction and overview. In: Custers B et al (eds) Discrimination and Privacy in the Information Society. Springer, Berlin, pp 3–26
  • Danielescu A (2020) Eschewing gender stereotypes in voice assistants to promote inclusion. In: Torres M I et al (eds) Proceedings of the 2nd Conference on Conversational User Interfaces. ACM, New York, pp 1–3
  • Di Nucci E (2017) Sex robots and the rights of the disabled. In: Danaher J, McArthur N (eds) Robot Sex: Social and Ethical Implications. MIT Press, Cambridge, pp 73–88
  • Díaz-García C, González-Moreno A, Saez-Martinez F J (2013) Gender diversity within R&D teams: Its impact on radicalness of innovation. Innovation 15(2):149–160
  • Döring N et al (2020) Design, use, and effects of sex dolls and sex robots: Scoping review. Journal of Medical Internet Research https://doi.org/10.2196/18551
  • Dupré D, Krumhuber E G, Küster D, McKeown G J (2020) A performance comparison of eight commercially available automatic classifiers for facial affect recognition. PLoS ONE 15(4):e0231968
  • Ekmekçioğlu O et al (2021) Women in nuclear medicine. Eur. J. Nucl. Med. Mol. Imaging https://doi.org/10.1007/s00259-021-05418-9
  • Esteva A et al (2017) Dermatologist-level classification of skin cancer with deep neural networks. Nature https://doi.org/10.1038/nature21056
  • European Commission (2012) Options for strengthening responsible research & innovation. https://ec.europa.eu/research/science-society/document_library/pdf_06/options-for-strengthening_en.pdf
  • Faulkner W (2001) The technology question in feminism: A view from feminist technology studies. Women's Studies International Forum https://doi.org/10.1016/S0277-5395(00)00166-7
  • Font J E, Costa-jussà M R (2019) Equalizing gender bias in neural machine translation with word embeddings techniques. In: Costa-jussà M R et al (eds) Proceedings of the 1st Workshop on Gender Bias in Natural Language Processing. Association for Computational Linguistics, Stroudsburg, pp 147–154
  • Fosch-Villaronga E (2019a) Robots, healthcare, and the law: Regulating automation in personal care. Routledge, Abingdon
  • Fosch-Villaronga E (2019b) “I love you,” said the robot: Boundaries of the use of emotions in human-robot interactions. In: Ayanoğlu H, Duarte E (eds) Emotional design in human-robot interaction. Springer, Cham, pp 93–110
  • Fosch-Villaronga E, Özcan B (2020) The progressive intertwinement between design, human needs and the regulation of care technology: The case of lower-limb exoskeletons. International Journal of Social Robotics 12(4):959–972
  • Fosch-Villaronga E, Poulsen A (2020) Sex care robots. Paladyn, Journal of Behavioral Robotics https://doi.org/10.1515/pjbr-2020-0001
  • Fosch-Villaronga E, Poulsen A (2021) Sex robots in care: Setting the stage for a discussion on the potential use of sexual robot technologies for persons with disabilities. In: Companion of the 2021 ACM/IEEE International Conference on Human-Robot Interaction. ACM, New York, pp 1–9
  • Fosch-Villaronga E et al (2021) A little bird told me your gender: Gender inferences in social media. Information Processing & Management https://doi.org/10.1016/j.ipm.2021.102541
  • Freire A et al (2020) Measuring diversity of artificial intelligence conferences. arXiv preprint. https://arxiv.org/abs/2001.07038
  • Friedman B, Hendry D G (2019) Value sensitive design: Shaping technology with moral imagination. MIT Press, Cambridge
  • Friedman B et al (2006) Value sensitive design and information systems. In: Zhang P, Galletta D (eds) Human-computer interaction and management information systems: Foundations. M. E. Sharpe, New York, pp 348–372
  • Gartrell A et al (2017) “We do not dare to love”: Women with disabilities’ sexual and reproductive health and rights in rural Cambodia. Reproductive Health Matters https://doi.org/10.1080/09688080.2017.1332447
  • Geyik S C et al (2019) Fairness-aware ranking in search & recommendation systems with application to LinkedIn talent search. In: Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. ACM, New York, pp 2221–2231
  • Gibney E (2019) Discrimination drives LGBT+ scientists to think about quitting. Nature. https://www.nature.com/articles/d41586-019-02013-9
  • Giger J-C et al (2019) Humanization of robots: Is it really such a good idea? Hum. Behav. & Emerg. Tech. https://doi.org/10.1002/hbe2.147
  • Goggin G, Newell C (2003) Digital disability: The social construction of disability in new media. Rowman & Littlefield, Lanham
  • Gomes A et al (2019) Drag queens and artificial intelligence: Should computers decide what is ‘toxic’ on the internet? Internet Lab. http://www.internetlab.org.br/en/freedom-of-expression/drag-queens-and-artificial-intelligence-should-computers-decide-what-is-toxic-on-the-internet/
  • Groom J R (2021) Diversity in science requires mentoring for all, by all. Nat. Immunol. https://doi.org/10.1038/s41590-021-00999-x
  • Hamidi F et al (2018) Gender recognition or gender reductionism? The social implications of embedded gender recognition systems. In: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. ACM, New York, pp 1–3
  • Hao K (2019) Facebook's ad-serving algorithm discriminates by gender and race. MIT Technology Review. https://www.technologyreview.com/2019/04/05/1175/facebook-algorithm-discriminates-ai-bias/
  • Haraway D (2006) A cyborg manifesto: Science, technology, and socialist-feminism in the late 20th century. In: Weiss J et al (eds) The International Handbook of Virtual Learning Environments. Springer, Dordrecht, pp 118–158
  • Higgins A et al (2006) Sexual health education for people with mental health problems: What can we learn from the literature? Journal of Psychiatric and Mental Health Nursing https://doi.org/10.1111/j.1365-2850.2006.01016.x
  • Holder C et al (2016) Robotics and law: Key legal and regulatory implications of the robotics age (part II of II). Computer Law & Security Review https://doi.org/10.1016/j.clsr.2016.05.011
  • Howansky K et al (2021) (Trans)gender stereotypes and the self: Content and consequences of gender identity stereotypes. Self and Identity https://doi.org/10.1080/15298868.2019.1617191
  • International Federation of Robotics (2018) Executive summary world robotics 2018 service robots. https://ifr.org/downloads/press2018/Executive_Summary_WR_Service_Robots_2018.pdf
  • Ito J (2019) Supposedly ‘fair’ algorithms can perpetuate discrimination. MIT Media Lab. https://www.media.mit.edu/articles/supposedly-fair-algorithms-can-perpetuate-discrimination/
  • Jecker N S (2020) Nothing to be ashamed of: Sex robots for older adults with disabilities. Journal of Medical Ethics https://doi.org/10.1136/medethics-2020-106645
  • Jenkins H et al (2016) Participatory culture in a networked era: A conversation on youth, learning, commerce, and politics. Polity Press, Cambridge
  • Jobin A, Ienca M, Vayena E (2019) The global landscape of AI ethics guidelines. Nat Mach Intell 1(9):389–399
  • Kamiran F et al (2013) Techniques for discrimination-free predictive models. In: Custers B H M et al (eds) Discrimination and Privacy in the Information Society. Springer, Heidelberg, pp 223–239
  • Keyes O (2018) The misgendering machines: Trans/HCI implications of automatic gender recognition. Proceedings of the ACM on Human-Computer Interaction https://doi.org/10.1145/3274357
  • Liu J (2021) Social robots as the bride? Understanding the construction of gender in a Japanese social robot product. Human-Machine Communication https://doi.org/10.30658/hmc.2.5
  • Martinetti A, Chemweno P K, Nizamis K, Fosch-Villaronga E (2021) Redefining safety in light of human-robot interaction: A critical review of current standards and regulations. Front Chem Eng 32
  • Maxwell J et al (2006) A health handbook for women with disabilities. Hesperian, Berkeley
  • McCann E (2003) Exploring sexual and relationship possibilities for people with psychosis – A review of the literature. Journal of Psychiatric and Mental Health Nursing https://doi.org/10.1046/j.1365-2850.2003.00635.x
  • McDuff D et al (2019) Characterizing bias in classifiers using generative models. In: Wallach H et al (eds) Proceedings of the 33rd Conference on Neural Information Processing Systems. Curran Associates, New York, pp 1–12
  • McGregor A J et al (2016) How to study the impact of sex and gender in medical research: A review of resources. Biol. Sex Differ. https://doi.org/10.1186/s13293-016-0099-1
  • Mitchell M et al (2020) Diversity and inclusion metrics in subset selection. In: Proceedings of the AAAI/ACM Conference on AI, Ethics, and Society. ACM, New York, pp 117–123
  • MoMa (2021) Design innovations for women. Design store. https://store.moma.org/design-innovations-for-women.html
  • Moscoso-Porras M et al (2019) Access barriers to medical facilities for people with physical disabilities: The case of Peru. Cadernos de Saúde Pública https://doi.org/10.1590/0102-311x00050417
  • Nature Editorial (2018) Science benefits from diversity. Nature 558:5–6. https://www.nature.com/articles/d41586-018-05326-3
  • Nišević M et al (2021) Understanding the legal bases for automated decision-making under the GDPR. In: Kostas E, Leenes R (eds) Research Handbook on EU Data Protection. Hart Publishing, Oxford [forthcoming]
  • Noble S U (2018) Algorithms of oppression: How search engines reinforce racism. NYU Press, New York
  • Nosek B A et al (2002a) Harvesting implicit group attitudes and beliefs from a demonstration web site. Group Dynamics: Theory, Research, and Practice https://doi.org/10.1037/1089-2699.6.1.101
  • Nosek B A et al (2002b) Math = male, me = female, therefore math ≠ me. Journal of Personality and Social Psychology https://doi.org/10.1037/0022-3514.83.1.44
  • Ntoutsi E et al (2020) Bias in data-driven artificial intelligence systems: An introductory survey. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery https://doi.org/10.1002/widm.1356
  • O'Neil C (2016) Weapons of math destruction: How big data increases inequality and threatens democracy. Crown, New York
  • O'Riordan K, Phillips D J (2007) Queer online: Media technology & sexuality. Peter Lang Publishing, Bern
  • Oudshoorn N, Pinch T (2003) How users matter: The co-construction of users and technology. MIT Press, Cambridge
  • Oudshoorn N et al (2004) Configuring the user as everybody: Gender and design cultures in information and communication technologies. Science, Technology, & Human Values https://doi.org/10.1177/0162243903259190
  • Page M et al (2009) The blue blazer club: Masculine hegemony in science, technology, engineering, and math fields. Forum on Public Policy Online 2009:1–23
  • Park S, Woo J (2019) Gender classification using sentiment analysis and deep learning in a health web forum. Applied Sciences https://doi.org/10.3390/app9061249
  • Perry B L, Wright E R (2006) The sexual partnerships of people with serious mental illness. Journal of Sex Research https://doi.org/10.1080/00224490609552312
  • Phillips K W, Liljenquist K A, Neale M A (2009) Is the pain worth the gain? The advantages and liabilities of agreeing with socially distinct newcomers. Personality and Social Psychology Bulletin 35(3):336–350
  • Poulsen A et al (2020) Queering machines. Nature Machine Intelligence https://doi.org/10.1038/s42256-020-0157-6
  • Prince A E, Schwarcz D (2020) Proxy discrimination in the age of artificial intelligence and big data. Iowa Law Review 105:1257–1318
  • Queer in AI (2019) Queer in AI. https://sites.google.com/view/queer-in-ai/
  • Quinn C, Browne G (2009) Sexuality of people living with a mental illness: A collaborative challenge for mental health nurses. International Journal of Mental Health Nursing https://doi.org/10.1111/j.1447-0349.2009.00598.x
  • Rahman F, Billionniere E (2021) Re-entering computing through emerging technology: Current state and special issue introduction. ACM Trans. Comput. Educ. https://doi.org/10.1145/3446840
  • Raji I D, Buolamwini J (2019) Actionable auditing: Investigating the impact of publicly naming biased performance results of commercial AI products. In: Proceedings of the 2019 AAAI/ACM Conference on AI, Ethics, and Society. ACM, New York, pp 429–435
  • Rathenau Institute (2021) Women in Academia. https://www.rathenau.nl/en/science-figures/personnel/women-science/women-academia
  • Richardson K (2016) The asymmetrical 'relationship': Parallels between prostitution and the development of sex robots. ACM SIGCAS Computers and Society https://doi.org/10.1145/2874239.2874281
  • Righetti L et al (2019) Unintended consequences of biased robotic and artificial intelligence systems [ethical, legal, and societal issues]. IEEE Robotics & Automation Magazine https://doi.org/10.1109/MRA.2019.2926996
  • Rock D, Grant H (2016) Why diverse teams are smarter. Harvard Business Review 4(4):2–5
  • Roopaei M et al (2021) Women in AI: Barriers and solutions. In: Proceedings of the 2021 IEEE World AI IoT Congress (AIIoT). IEEE, New York, pp 497–503
  • Roussel S (2013) Seeking Sexual Surrogates. The New York Times. https://www.nytimes.com/video/world/europe/100000002304193/seeking-sexual-surrogates.html [video]
  • Scheutz M, Arnold T (2016) Are we ready for sex robots? In: Proceedings of the 11th ACM/IEEE International Conference on Human-Robot Interaction. IEEE, New York, pp 351–358
  • Schiebinger L (2014) Scientific research must take gender into account. Nature 507:9. https://doi.org/10.1038/507009a
  • Schönberger D (2019) Artificial intelligence in healthcare: A critical analysis of the legal and ethical implications. International Journal of Law and Information Technology https://doi.org/10.1093/ijlit/eaz004
  • Schwalbe N, Wahl B (2020) Artificial intelligence and the future of global health. The Lancet https://doi.org/10.1016/S0140-6736(20)30226-9
  • Servais L (2006) Sexual health care in persons with intellectual disabilities. Mental Retardation and Developmental Disabilities Research Reviews https://doi.org/10.1002/mrdd.20093
  • Sink A, Mastro D, Dragojevic M (2018) Competent or warm? A stereotype content model approach to understanding perceptions of masculine and effeminate gay television characters. Journalism & Mass Communication Quarterly 95(3):588–606
  • Sommers S R (2006) On racial diversity and group decision making: Identifying multiple effects of racial composition on jury deliberations. Journal of Personality and Social Psychology 90(4):597
  • Søraa R A (2017) Mechanical genders: How do humans gender robots? Gender, Technology and Development https://doi.org/10.1080/09718524.2017.1385320
  • Sparrow R (2021) Sex robot fantasies. Journal of Medical Ethics https://doi.org/10.1136/medethics-2020-106932
  • Stahl B C, Coeckelbergh M (2016) Ethics of healthcare robotics: Towards responsible research and innovation. Robotics and Autonomous Systems 86:152–161
  • STOA (2018) Assistive technologies for people with disabilities. https://www.europarl.europa.eu/RegData/etudes/IDAN/2018/603218/EPRS_IDA(2018)603218_EN.pdf
  • Strengers Y, Kennedy J (2020) The smart wife: Why Siri, Alexa, and other smart home devices need a feminist reboot. MIT Press
  • Sun T et al (2019) Mitigating gender bias in natural language processing: Literature review. In: Korhonen A et al (eds) Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Association for Computational Linguistics, Stroudsburg, pp 1630–1640
  • Tannenbaum C, Ellis R P, Eyssel F, Zou J, Schiebinger L (2019) Sex and gender analysis improves science and engineering. Nature 575(7781):137–146
  • Tao Y (2018) Earnings of academic scientists and engineers: Intersectionality of gender and race/ethnicity effects. American Behavioral Scientist https://doi.org/10.1177/0002764218768870
  • Temmerman M et al (2014) Sexual and reproductive health and rights: A global development, health, and human rights priority. The Lancet https://doi.org/10.1016/S0140-6736(14)61190-9
  • Topol E J (2019) High-performance medicine: The convergence of human and artificial intelligence. Nature Medicine https://doi.org/10.1038/s41591-018-0300-7
  • Torralba A, Efros A A (2011) Unbiased look at dataset bias. In: Proceedings of the Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, New York, pp 1521–1528
  • United Nations (1993) Standard rules on the equalization of opportunities for persons with disabilities. https://www.un.org/disabilities/documents/gadocs/standardrules.pdf
  • United Nations (2007) Convention on the Rights of Persons with Disabilities and Optional Protocol. https://www.un.org/development/desa/disabilities/convention-on-the-rights-of-persons-with-disabilities.html
  • Urry K, Chur-Hansen A (2020) Who decides when people can have sex? Australian mental health clinicians’ perceptions of sexuality and autonomy. Journal of Health Psychology https://doi.org/10.1177/1359105318790026
  • Vaughan C et al (2015) W-DARE: A three-year program of participatory action research to improve the sexual and reproductive health of women with disabilities in the Philippines. BMC Public Health https://doi.org/10.1186/s12889-015-2308-y
  • Vida B (2021) Policy framing and resistance: Gender mainstreaming in Horizon 2020. European Journal of Women’s Studies https://doi.org/10.1177/1350506820935495
  • Wajcman J (2007) From women and technology to gendered technoscience. Information, Community and Society https://doi.org/10.1080/13691180701409770
  • Wapner J (2018) Cancer scientists have ignored African DNA in the search for cures. Newsweek. https://www.newsweek.com/2018/07/27/cancer-cure-genome-cancer-treatment-africa-genetic-charles-rotimi-dna-human-1024630.html
  • Weber J (2005) Helpless machines and true loving care givers: A feminist critique of recent trends in human-robot interaction. Journal of Information, Communication and Ethics in Society https://doi.org/10.1108/14779960580000274
  • West M et al (2019) I'd blush if I could: Closing gender divides in digital skills through education. UNESCO. https://unesdoc.unesco.org/ark:/48223/pf0000367416.page=1
  • Wheeler A P, Steenbeek W (2021) Mapping the risk terrain for crime using machine learning. Journal of Quantitative Criminology https://doi.org/10.1007/s10940-020-09457-7
  • Whisnant C J (2012) Male homosexuality in West Germany. Palgrave Macmillan, London
  • Whittaker M et al (2019) Disability, bias, and AI. AI Now Institute. https://wecount.inclusivedesign.ca/uploads/Disability-bias-AI.pdf
  • Willson M (2017) Algorithms (and the) everyday. Information, Communication & Society https://doi.org/10.1080/1369118X.2016.1200645
  • Wisskirchen G et al (2017) Artificial intelligence and robotics and their impact on the workplace. IBA Global Employment Institute
  • World Health Organization (2015) Sexual health, human rights and the law report. https://apps.who.int/iris/bitstream/handle/10665/175556/9789241564984_eng.pdf
  • Yu K H, Beam A L, Kohane I S (2018) Artificial intelligence in healthcare. Nature Biomedical Engineering 2(10):719–731
  • Zara G et al (2021) Sexbots as synthetic companions: Comparing attitudes of official sex offenders and non-offenders. International Journal of Social Robotics https://doi.org/10.1007/s12369-021-00797-3
  • Zhao J et al (2017) Men also like shopping: Reducing gender bias amplification using corpus-level constraints. In: Palmer M et al (eds) Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing. Association for Computational Linguistics, Stroudsburg, pp 2979–2989
  • Zhou P et al (2019) Examining gender bias in languages with grammatical gender. In: Padó S, Huang R (eds) Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP). Association for Computational Linguistics, Stroudsburg, pp 5279–5287

Author information

Correspondence to Eduard Fosch-Villaronga.

Copyright information

© 2022 T.M.C. Asser Press and the authors

About this chapter


Cite this chapter

Fosch-Villaronga, E., Poulsen, A. (2022). Diversity and Inclusion in Artificial Intelligence. In: Custers, B., Fosch-Villaronga, E. (eds) Law and Artificial Intelligence. Information Technology and Law Series, vol 35. T.M.C. Asser Press, The Hague. https://doi.org/10.1007/978-94-6265-523-2_6

  • DOI: https://doi.org/10.1007/978-94-6265-523-2_6

  • Publisher Name: T.M.C. Asser Press, The Hague

  • Print ISBN: 978-94-6265-522-5

  • Online ISBN: 978-94-6265-523-2

  • eBook Packages: Law and Criminology (R0)
