Eva Gengler, Ilse Hagerer, and Alina Gales published their chapter “Diversity bias in artificial intelligence” in The Digital and AI Coaches' Handbook: The Complete Guide to the Use of Online, AI, and Technology in Coaching, edited by Jonathan Passmore, Sandra J. Diller, Sam Isaacson, and Maximilian Brantl (Routledge, Taylor & Francis).
Abstract: The ever-advancing digital transformation is causing profound changes in many areas of life, especially through emerging disruptive technologies such as artificial intelligence (AI). AI tools like ChatGPT generate text, prompt discussions on copyright, ethics, and human uniqueness, and can power many applications in business and private contexts, including coachbots. Because the algorithms within AI tools can discriminate, emerging coachbots powered by AI can be discriminatory as well. This chapter first explains diversity as a concept and its interconnectedness with AI. It then presents numerous examples of biased AI across different business sectors. In addition to proposing solutions to AI bias, it lays out the consequences of discriminatory AI for people in everyday life. The goal of this chapter is to illuminate the problems that have emerged due to a lack of diversity in AI and to show what solutions exist to address them. When coachbots replace human coaches, as examples from other sectors demonstrate, the effects can be tremendous. The questions raised, the answers given, and the objectives pursued today will shape the decisions of the coming decades.