Addressing Indirect Discrimination and Gender Stereotypes in AI Virtual Personal Assistants: The Role of International Human Rights Law

Annual Cambridge International Law Conference 2019, New Technologies: New Challenges for Democracy and International Law

21 Pages. Posted: 19 Jun 2019

Rachel Adams

Information Law and Policy Centre, Institute for Advanced Legal Studies; Research Use and Impact Assessment, Human Sciences Research Council

Nora Ni Loideain

University of London - Institute of Advanced Legal Studies; Faculty of Law, University of Cambridge; Faculty of Humanities; King's College London – The Dickson Poon School of Law

Date Written: May 22, 2019

Abstract

Virtual Personal Assistants are increasingly becoming a common aspect of everyday living. However, with female names, voices, and characters, these devices appear to reproduce harmful gender stereotypes about the role of women in society and the type of work women perform. Designed to “assist”, virtual personal assistants – such as Apple’s Siri and Amazon’s Alexa – reproduce and reify the idea that women are subordinate to men and exist to be “used” by men. Despite their ubiquity, these aspects of their design have received little critical attention in scholarship, and the potential legal responses to this issue have yet to be fully canvassed. Accordingly, this article critiques the reproduction of negative gender stereotypes in virtual personal assistants and explores the provisions and findings within international women’s rights law to assess how this design practice constitutes indirect discrimination and to identify possible remedies for redress. In this regard, the article examines the obligation to protect women from discrimination at the hands of private actors under the Convention on the Elimination of All Forms of Discrimination Against Women, the work of the Committee on the Elimination of Discrimination against Women on gender stereotyping, the role of the United Nations Guiding Principles on Business and Human Rights, and domestic enforcement mechanisms for international human rights norms and standards.

Keywords: gender stereotypes; indirect discrimination; AI; Virtual Personal Assistants; women’s rights; CEDAW; UN Guiding Principles; GDPR; data protection impact assessments

Suggested Citation

Adams, Rachel and Ni Loideain, Nora, Addressing Indirect Discrimination and Gender Stereotypes in AI Virtual Personal Assistants: The Role of International Human Rights Law (May 22, 2019). Annual Cambridge International Law Conference 2019, New Technologies: New Challenges for Democracy and International Law. Available at SSRN: https://ssrn.com/abstract=3392243 or http://dx.doi.org/10.2139/ssrn.3392243

Rachel Adams (Contact Author)

Information Law and Policy Centre, Institute for Advanced Legal Studies

Charles Clore House
17 Russell Square
London, WC1B 5DR
United Kingdom

Research Use and Impact Assessment, Human Sciences Research Council

Private Bag X41
134 Pretorius Street
Pretoria, 0001
South Africa

Nora Ni Loideain

University of London - Institute of Advanced Legal Studies

Charles Clore House
17 Russell Square
London, WC1B 5DR
United Kingdom

HOME PAGE: http://ials.sas.ac.uk/about/about-us/people/nóra-ni-loideain

Faculty of Law, University of Cambridge

Trinity Ln
Cambridge, CB2 1TN
United Kingdom

HOME PAGE: http://www.crassh.cam.ac.uk/people/profile/nora-ni-loideain

Faculty of Humanities

PO Box 524
Auckland Park
Johannesburg, Gauteng 2006
South Africa

King's College London – The Dickson Poon School of Law

Somerset House East Wing
Strand
London, WC2R 2LS
United Kingdom
