Female Servitude by Default and Social Harm: AI Virtual Personal Assistants, the FTC, and Unfair Commercial Practices

17 Pages Posted: 19 Jun 2019


Nora Ni Loideain

Information Law & Policy Centre; Faculty of Law, University of Cambridge; Faculty of Humanities

Rachel Adams

Information Law and Policy Centre, Institute for Advanced Legal Studies; Research Use and Impact Assessment, Human Sciences Research Council

Date Written: June 11, 2019

Abstract

The prevalence of artificial intelligence (AI)-driven virtual personal assistants (VPAs), both in the home and in businesses, is increasing. Yet the key VPAs on the market today – Siri (Apple), Alexa (Amazon) and Cortana (Microsoft) – appear to be gendered female. This gendering takes place not only through their designation with female names that coincide with mythical and stereotyped views of gender (Siri is a Nordic name meaning “the beautiful woman that leads you to victory”), but also through female voices that users find more comfortable to instruct and give orders to than male voices, and through the witty and flirtatious characters revealed in their programmed responses to even the most perverse questions. It is, therefore, a gendering which is problematic: it depicts the category “female” as an assistant – or secondary – to a male counterpart.

Drawing on the post-phenomenological arguments set out by Mireille Hildebrandt – that the technologies we use not only reflect and embed our presumptions and social biases, but also reproduce them in new ways that have material effects on us – we explore how the gendering of AI-driven VPAs poses a critical social harm by, as Julie Cohen describes in relation to Hildebrandt’s contentions, ‘continually, imminently mediating and pre-empting our beliefs and choices’ about the role of women in society.

More critically, and in response to what we argue is a social harm caused by the gendering of AI-driven VPAs produced by the US-based companies Apple, Microsoft and Amazon, we explore the role and mandate of the Federal Trade Commission (FTC) as the broadly mandated regulatory body for consumer protection. In particular, we analyse two distinct functions of the FTC. The first is its role in the protection of data privacy, and the extent to which data privacy impact assessments – which necessitate an investigation of the social impact of new technologies beyond the privacy paradigm – can be drawn on here as a potential solution. The second is the FTC’s role in investigating, protecting against and preventing ‘unfair or deceptive acts or practices in commerce’, and the extent to which the gendering of AI-driven VPAs can be said to constitute an unfair commercial practice.

Keywords: AI, Virtual Personal Assistants, Social Harm, Female Servitude, Unfair Commercial Practices, Federal Trade Commission

Suggested Citation

Ni Loideain, Nora and Adams, Rachel, Female Servitude by Default and Social Harm: AI Virtual Personal Assistants, the FTC, and Unfair Commercial Practices (June 11, 2019). Available at SSRN: https://ssrn.com/abstract=3402369 or http://dx.doi.org/10.2139/ssrn.3402369

Nora Ni Loideain

Information Law & Policy Centre ( email )

Charles Clore House
17 Russell Square
London, WC1B 5DR
United Kingdom

HOME PAGE: http://ials.sas.ac.uk/about/about-us/people/nóra-ni-loideain

Faculty of Law, University of Cambridge ( email )

Trinity Ln
Cambridge, CB2 1TN
United Kingdom

HOME PAGE: http://www.crassh.cam.ac.uk/people/profile/nora-ni-loideain

Faculty of Humanities ( email )

PO Box 524
Auckland Park
Johannesburg, Gauteng 2006
South Africa

Rachel Adams (Contact Author)

Information Law and Policy Centre, Institute for Advanced Legal Studies ( email )

Charles Clore House
17 Russell Square
London, WC1B 5DR
United Kingdom

Research Use and Impact Assessment, Human Sciences Research Council ( email )

Private Bag X41
134 Pretorius Street
Pretoria, 0001
South Africa

