Are Dark Patterns Anticompetitive?

45 Pages · Posted: 22 Oct 2019 · Last revised: 16 Jan 2021

Gregory Day

University of Georgia - C. Herman and Mary Virginia Terry College of Business

Abbey Stemler

Indiana University - Kelley School of Business - Department of Business Law; Harvard University - Berkman Klein Center for Internet & Society

Date Written: October 11, 2019

Abstract

Platform-based businesses (“platforms”) design websites, apps, and interfaces to addict and manipulate users. They intentionally stimulate the release of dopamine in users’ brains, creating an addiction akin to gambling. For example, reports indicate that Instagram withholds notifications of “likes” until later so as to increase users’ dopamine response, a technique known as a “variable reward schedule.” Twitter’s app opens with a blue screen and a pulsating bird: the interface, designed to look as if it is loading, builds a positive feedback loop based on anticipation. Perhaps the most addictive design is Snapchat’s “streak,” which has increased attention spent on the platform by 40%. By capturing and maintaining attention, companies increase the amount of time spent, and data created (the chief commodity of the digital economy), on their platforms.

Once attention is gained, platforms can exploit their users’ cognitive vulnerabilities through “dark patterns,” subtle design choices meant to guide users toward behaviors sought by the platform, and other forms of online manipulation. The brilliance of online manipulation is that it makes interactions on the platform, as well as the sharing of one’s photos, messages, geolocation, and contacts, appear to be exercises of free will. This threatens an aspect of privacy called “decisional privacy,” referring to one’s ability to make choices free of coercion.

This Article argues that consumer welfare diminishes when technology is designed to extract wealth from consumers in ways that erode decisional privacy. Given the lack of regulation on this point, we show that market power and exclusionary strategies enable platforms to adopt dark patterns and other manipulative techniques. The problem is that antitrust has typically viewed efforts to coax consumers as a form of competition, or even as procompetitive behavior. In our view, online manipulation erodes consumers’ ability to act rationally, empowering platforms to extract wealth and build market power without doing so on the merits. If digital markets were more competitive, market forces would foster competition over privacy as well as disseminate information about dark patterns, enhancing consumer welfare. We thus insist that courts must not only settle the debate about whether privacy accords with antitrust’s framework (it does) but also recognize the importance of decisional privacy.

Keywords: antitrust, platforms, privacy, big data, technology, competition, addiction, manipulation

Suggested Citation

Day, Gregory and Stemler, Abbey, Are Dark Patterns Anticompetitive? (October 11, 2019). Alabama Law Review, Forthcoming. Available at SSRN: https://ssrn.com/abstract=3468321 or http://dx.doi.org/10.2139/ssrn.3468321

Gregory Day (Contact Author)

University of Georgia - C. Herman and Mary Virginia Terry College of Business ( email )

Brooks Hall
Athens, GA 30602-6254
United States

Abbey Stemler

Indiana University - Kelley School of Business - Department of Business Law ( email )

Bloomington, IN 47405
United States

Harvard University - Berkman Klein Center for Internet & Society ( email )

Harvard Law School
23 Everett, 2nd Floor
Cambridge, MA 02138
United States

HOME PAGE: https://cyber.harvard.edu/people/abbey-stemler
