A Matter of Perspective: Discrimination, Bias, and Inequality in AI

Katie Miller
DOI: 10.4018/978-1-7998-3130-3.ch010

Abstract

This chapter addresses the challenge presented in an age when some decisions are made by humans, some are made by AI, and some are made by a combination of AI and humans. For the person refused housing, a phone service, or employment, the experience is the same, but the ability to understand what has happened and obtain a remedy may be very different if the discrimination is attributable to, or contributed to by, an AI system. If we are to preserve the policy intentions of our discrimination, equal opportunity, and human rights laws, we need to understand how discrimination arises in AI systems; how design in AI systems can mitigate such discrimination; and whether our existing laws are adequate to address discrimination in AI. This chapter endeavours to provide this understanding. In doing so, it focuses on narrow but advanced forms of artificial intelligence, such as natural language processing, facial recognition, and cognitive neural networks.

Introduction

An Aboriginal woman is refused housing.

A man with a disability is refused a contract for a phone service.

A woman is denied employment as a pilot.

Each of these examples involves discrimination. But discrimination by whom? Can you distinguish the case where the discrimination is caused by a human from the case where it is caused by artificial intelligence (AI), meaning computers doing tasks that, when humans do them, require thinking (Walsh, 2019)?

This is the challenge presented in an age when some decisions are made by humans, some are made by AI, and some are made by a combination of AI and humans. For the person refused housing, a phone service or employment, the experience is the same – but the ability to understand what has happened and obtain a remedy may be very different if the discrimination is attributable to, or contributed to by, an AI system.

The questions posed above are, of course, trick questions. Each of the examples given has resulted from discrimination by humans in the past (see, for example, Australian Human Rights Commission, 2002; Australian Human Rights Commission, 2009; Solonec, 2000). Each could result, or already has resulted, from the use of AI systems. Each can result from direct discrimination on the basis of race, gender or parental responsibilities, or indirectly through discrimination on the basis of criminal record, employment status or mental health status. One of the challenges presented by AI systems is that we increasingly do not know why decisions are made or how traditionally protected attributes factor into AI decisions, recommendations and advice.
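To make this opacity concrete, consider a minimal sketch in Python. The data, feature names and decision rule below are entirely hypothetical and are not drawn from the chapter; they simply illustrate how a decision rule that never sees a protected attribute can still produce unequal outcomes when an ostensibly neutral feature acts as a proxy for it:

```python
# Hypothetical sketch: a 'group-blind' rule that discriminates via a proxy.
import random

random.seed(42)

# Synthetic applicants: 'group' is the protected attribute; 'postcode'
# is an ostensibly neutral feature that happens to correlate with it.
def make_applicant():
    group = random.choice(["A", "B"])
    # In this invented population, group B applicants are far more
    # likely to live in postcode 2000.
    postcode = "2000" if (group == "B" and random.random() < 0.8) else "3000"
    return {"group": group, "postcode": postcode}

applicants = [make_applicant() for _ in range(10_000)]

# The decision rule uses only the postcode -- it is 'blind' to group.
def approve(applicant):
    return applicant["postcode"] != "2000"

# Yet approval rates differ sharply by group, because postcode is a proxy.
for g in ("A", "B"):
    members = [a for a in applicants if a["group"] == g]
    rate = sum(approve(a) for a in members) / len(members)
    print(f"Group {g}: approval rate {rate:.0%}")
```

Real AI systems are vastly more complex, but the mechanism is the same: a protected attribute can factor into a decision through its correlates even when it is formally excluded from the inputs.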

If we are to preserve the policy intentions of our discrimination, equal opportunity and human rights laws, we need to understand how discrimination arises in AI systems; how design in AI systems can mitigate such discrimination; and whether our existing laws are adequate to address discrimination in AI. This chapter endeavours to provide this understanding. In doing so, it focuses on narrow, but advanced, forms of artificial intelligence, such as natural language processing, facial recognition and cognitive neural networks.


Are We Speaking The Same Language?

The challenge of discrimination, bias and equality in AI involves the intersection of multiple domains of law, sociology and technology, each with their own experts and language. In order to have a shared understanding of the issues and possible solutions, we must first ensure that we are speaking the same language. In particular, we need to know what we mean by ‘discrimination’ and ‘bias’. While the same words may be used across domains, they can have different meanings and connotations within different domains.

In everyday speech, to ‘discriminate’ is to “note or observe a difference; distinguish” (Dorner, Blair & Bernard, 1998, Discriminate entry) and ‘discrimination’ is “the process of differentiating between persons or things possessing different properties” (Street v Queensland Bar Association, 1989, p. 570). Understood in this sense, an AI system is a discriminating machine. The ability to discriminate, quickly and over large data sets, is one of AI’s greatest strengths and a large part of the reason for its adoption and incorporation into so much of our daily lives. For example, AI assistants such as Cortana, Siri and Alexa rely on natural language processing, speech recognition and deep learning algorithms that can differentiate between words (or the sounds we use to represent words) and the contexts in which they are used (Krywko, 2017).
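As a loose illustration of this ordinary, unobjectionable sense of discrimination, consider the toy sketch below. It is not how the assistants named above work (they rely on deep learning models, not keyword overlap), and the word lists are invented for the example; it simply shows a machine differentiating between two uses of the same word by context:

```python
# Toy sketch: differentiating two senses of 'bank' by context overlap.
# Hypothetical word lists; real assistants use deep learning instead.

SENSES = {
    "bank (finance)": {"money", "deposit", "loan", "account", "cash"},
    "bank (river)":   {"river", "water", "fishing", "shore", "mud"},
}

def disambiguate(sentence: str) -> str:
    """Pick the sense whose context words overlap most with the sentence."""
    words = set(sentence.lower().split())
    return max(SENSES, key=lambda sense: len(SENSES[sense] & words))

print(disambiguate("She opened an account and made a deposit at the bank"))
# -> bank (finance)
print(disambiguate("They went fishing on the bank of the river"))
# -> bank (river)
```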

In a legal sense, ‘discrimination’ involves treating, or proposing to treat, someone unfavourably because of a personal characteristic protected by law (Rees, Rice & Allen, 2018). For example, refusing a person a job because of their gender, racial background, disability or sexual orientation constitutes an unlawful form of discrimination.

Key Terms in this Chapter

Discrimination: Involves treating, or proposing to treat, someone unfavourably because of a personal characteristic protected by law (Rees, Rice & Allen, 2018).

Artificial Intelligence: Involves the use of computers or software programs designed to perform tasks generally performed by humans.

Bias: Refers to a predisposition, prejudgment or distortion. In law, this often refers to a prejudice, inclination or prejudgment of a question (Aronson & Groves, 2013).

Direct Discrimination: Occurs when a person is treated less favourably because of an attribute that is protected by law, such as race, gender, religious belief or (dis)ability.

Indirect Discrimination: Requires consideration of how an ostensibly neutral action affects people with one or more protected attributes. For example, preferring to employ people who can attend work at 8am may indirectly discriminate against parents who have child care responsibilities for young or school-aged children.
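The 8am example can be made concrete with a minimal sketch. The numbers are hypothetical, and the 'four-fifths' threshold applied at the end is one heuristic used in some jurisdictions to flag possible disparate impact, not a universal legal test:

```python
# Hypothetical sketch: measuring the effect of an ostensibly neutral
# '8am availability' hiring requirement on carers of young children.

hired = {"carers": 12, "non_carers": 45}
applied = {"carers": 60, "non_carers": 100}

# Selection rate per group.
rates = {g: hired[g] / applied[g] for g in hired}
for g, r in rates.items():
    print(f"{g}: selected {r:.0%} of applicants")

# Ratio of the disadvantaged group's rate to the advantaged group's rate.
ratio = rates["carers"] / rates["non_carers"]
print(f"Selection ratio: {ratio:.2f}")
if ratio < 0.8:
    print("Below 0.8: the requirement may indirectly discriminate.")
```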
