Addressing Indirect Discrimination and Gender Stereotypes in AI Virtual Personal Assistants: The Role of International Human Rights Law
Annual Cambridge International Law Conference 2019, New Technologies: New Challenges for Democracy and International Law
21 Pages | Posted: 19 Jun 2019
Date Written: May 22, 2019
Abstract
Virtual personal assistants are increasingly becoming a common feature of everyday life. Yet with female names, voices, and characters, these devices appear to reproduce harmful gender stereotypes about the role of women in society and the type of work women perform. Designed to “assist”, virtual personal assistants – such as Apple’s Siri and Amazon’s Alexa – reproduce and reify the idea that women are subordinate to men and exist to be “used” by them. Despite the ubiquity of these devices, these aspects of their design have received little critical attention in scholarship, and the potential legal responses to the issue have yet to be fully canvassed. Accordingly, this article critiques the reproduction of negative gender stereotypes in virtual personal assistants and draws on the provisions and findings of international women’s rights law to assess how this design practice constitutes indirect discrimination and what remedies may be available for redress. In this regard, the article examines the obligation under the Convention on the Elimination of All Forms of Discrimination against Women to protect women from discrimination at the hands of private actors, the work of the Committee on the Elimination of Discrimination against Women on gender stereotyping, the role of the United Nations Guiding Principles on Business and Human Rights, and domestic mechanisms for enforcing international human rights norms and standards.
Keywords: gender stereotypes; indirect discrimination; AI; virtual personal assistants; women’s rights; CEDAW; UN Guiding Principles; GDPR; data protection impact assessments