Algorithms and the Propagation of Gendered Cultural Norms

Forthcoming for publication in French in “IA, Culture et Médias” (2022). Edited by: Véronique Guèvremont and Colette Brin. Presses de l’université de Laval.

13 Pages Posted: 15 Dec 2021

Eleonore Fournier-Tombs

University of Ottawa Faculty of Law Civil Law Section

Céline Castets-Renard

Civil Law Faculty; University of Toulouse 1; ANITI (Artificial and Natural Intelligence Toulouse Institute); Institut Universitaire de France; University of Ottawa

Date Written: December 7, 2021

Abstract

Artificial intelligence is increasingly used to create technological interfaces: chatbots, personal assistants, and robots whose function is to interact with humans. These systems offer services, answer questions, and even undertake domestic tasks, such as buying groceries or controlling the temperature in the home.

In a study of personal assistants with female voices, such as Amazon's Alexa and Apple's Siri, the United Nations Educational, Scientific and Cultural Organization (UNESCO) argued that these technologies could have significant negative effects on gender equality. In addition to the fact that these artificial intelligence (AI) systems are trained on gendered language models, these female-voiced assistants all feature stereotypical female attributes. The problem is compounded by the fact that these systems were probably created primarily by male developers. As they become increasingly ubiquitous in our daily lives, these gendered assistants pose a threat through the biased representation of women they generate. It has been predicted that by the end of 2021 there would be more voice assistants on the planet than human beings.

Given the increasing use of voice assistants trained on biased language models, the potential impact on gender norms is a concern. As isolation increased significantly during COVID-19, there is a risk that some people's main 'female' interaction is with these voice assistants. If we are not careful, sexist representations of women, entirely out of step with real women, will intrude into the privacy of the home and onto our smartphones, anywhere and at any time. Moreover, because the underlying models are essentially the same, they reproduce a single 'standard' and a cultural smoothing of human-machine interaction, denying the diversity of the users of these products around the world.

While some have argued that learning algorithms may be less biased than humans, who are often influenced by discriminatory cultural norms of which they are unaware, this overlooks the fact that artificial intelligence (AI) is necessarily created by human beings and incorporates their ways of thinking. Indeed, it is easy to underestimate the importance of cultural norms in human decision-making. Artificial intelligence mimics the social biases of the data it is given unless it is explicitly designed with different principles. It is therefore not surprising that AI developed without built-in values merely reflects already biased social norms.

This chapter explores the ambiguous impact of learning algorithms, which threaten to propagate biased gender norms even while promising to eliminate them. Two main questions are posed here: How can we better understand and document gender biases that may be associated with other forms of domination? And how can technology be better used to mitigate their negative effects?

The objectives of the research are twofold. On the one hand, it analyses the cultural biases embedded in machine learning models, such as sexist word associations and stereotypical female representations, from different cultural and geographical perspectives (1). On the other, it considers the ways in which machine learning could reverse gender-biased norms by providing predictions and decisions free of systemic discrimination, or by minimising discriminatory outcomes (2).

Suggested Citation

Fournier-Tombs, Eleonore and Castets-Renard, Céline, Algorithms and the Propagation of Gendered Cultural Norms (December 7, 2021). Forthcoming for publication in French in “IA, Culture et Médias” (2022). Edited by: Véronique Guèvremont and Colette Brin. Presses de l’université de Laval., Available at SSRN: https://ssrn.com/abstract=3980113 or http://dx.doi.org/10.2139/ssrn.3980113

Eleonore Fournier-Tombs (Contact Author)

University of Ottawa Faculty of Law Civil Law Section ( email )

57 Louis Pasteur Dr
Ottawa
Canada

Céline Castets-Renard

Civil Law Faculty ( email )

57 Louis Pasteur Street
Ottawa, Ontario K1N 6N5
Canada

HOME PAGE: https://droitcivil.uottawa.ca/fr

University of Toulouse 1 ( email )

2 rue du doyen Gabriel Marty
Toulouse, 31000
France

ANITI (Artificial and Natural Intelligence Toulouse Institute) ( email )

41 Allées Jules Guesde - CS 61321
TOULOUSE
France

Institut Universitaire de France ( email )

103, bld Saint-Michel
75005 Paris
France

University of Ottawa ( email )

2292 Edwin Crescent
Ottawa, Ontario K2C 1H7
Canada
