Down by Algorithms? Siphoning Rents, Exploiting Biases and Shaping Preferences – The Dark Side of Personalized Transactions

30 Pages Posted: 19 Apr 2018

Gerhard Wagner

Humboldt University School of Law; University of Chicago Law School

Horst Eidenmueller

University of Oxford - Faculty of Law; European Corporate Governance Institute (ECGI)

Date Written: March 30, 2018

Abstract

In this article we seek to systematically explore and understand crucial aspects of a potential dark side of personalized transactions. Big data and artificial intelligence may enable businesses with access to the data and the required technology to personalize their interactions with consumers effectively in order to exploit informational asymmetries and/or consumer biases in novel ways and on an unprecedented scale. We identify three aspects of the dark side of personalized B2C transactions as particular areas of concern. First, businesses increasingly engage in first-degree price discrimination, siphoning rents from consumers. Second, firms systematically exploit well-known behavioral biases of consumers, such as their inability to correctly assess the long-term effects of complex transactions or their insufficient willpower. And third, businesses use microtargeted ads and recommendations to shape consumers’ preferences and steer them into a particular consumption pattern, effectively locking them into a lifestyle determined by their past choices and those of like-minded fellows.

At first sight, siphoning rents, exploiting biases and shaping preferences appear to be relatively distinct phenomena. On closer inspection, however, these phenomena share a common underlying theme: the potential exploitation of consumers, or at least an impoverishment of their lives, by firms that apply novel and sophisticated technological means to maximize profits. Hence, the dark side of personalized B2C transactions may be characterized as consumers being “brought down by algorithms,” losing transaction surplus, engaging in welfare-reducing transactions and increasingly being trapped in a narrower life.

It is unclear whether first-degree price discrimination creates an efficiency problem, but it surely raises concerns of distributive justice. We propose that it should be addressed by a clear and simple warning to the consumer that she is being offered a personalized price and, in addition, by a right to indicate that she does not want to participate in a personalized pricing scheme. Similarly, behavioral biases may or may not lead consumers to conclude inefficient transactions. But it appears that they should be given an opportunity to reflect on their choices if these have been induced by firms applying exploitative algorithmic sales techniques. Hence, we propose that consumers should have a right to withdraw from a contract concluded under such conditions. Indeed, in many jurisdictions they already have such a right today. Finally, shaping consumers’ preferences by microtargeted ads and recommendations prevents consumers from experimenting and from leading a multifaceted life. We should have a right to opt out of the technological steering mechanisms that firms create and deploy and that impoverish our lives.

Keywords: Big data, artificial intelligence, algorithms, personalized transactions, first-degree price discrimination, cognitive biases, preference shaping, withdrawal rights, right to anonymity

JEL Classification: D00, K00

Suggested Citation

Wagner, Gerhard and Eidenmueller, Horst G. M., Down by Algorithms? Siphoning Rents, Exploiting Biases and Shaping Preferences – The Dark Side of Personalized Transactions (March 30, 2018). University of Chicago Law Review, forthcoming; Oxford Legal Studies Research Paper No. 20/2018. Available at SSRN: https://ssrn.com/abstract=3160276 or http://dx.doi.org/10.2139/ssrn.3160276

Gerhard Wagner

Humboldt University School of Law

Unter den Linden 9
Berlin, 10099
Germany

University of Chicago Law School

1111 East 60th Street
Chicago, IL 60637
United States

Horst G. M. Eidenmueller (Contact Author)

University of Oxford - Faculty of Law

St Cross Building
St Cross Road
Oxford, OX1 3UL
United Kingdom

European Corporate Governance Institute (ECGI)

c/o the Royal Academies of Belgium
Rue Ducale 1 Hertogsstraat
1000 Brussels
Belgium
