Discriminatory Designs on User Data

Knight First Amendment Institute at Columbia University

22 Pages · Posted: 26 Apr 2018 · Last revised: 13 Feb 2023

Olivier Sylvain

Fordham University School of Law

Date Written: April 6, 2018

Abstract

Section 230 of the Communications Decency Act protects intermediaries from liability for distributing third-party user content. Courts have read Section 230 broadly, creating an immunity for intermediaries who do all but “materially contribute” to the user content they distribute. That is, courts have read the statute’s protections to cover services that augment user content, but not services that demonstrably help to develop the alleged illegal expressive conduct. Many believe that the internet would not be as dynamic and beguiling today were it not for the protection that Section 230 has been construed to provide for online intermediaries. Members of Congress worried that intermediaries would be “chilled” by the fear that they could be held legally responsible for content posted by users.

Today, however, Section 230 doctrine has also had a perverse effect. By providing intermediaries with such broad legal protection, the courts’ construction of Section 230 effectively underwrites content that foreseeably targets children, women, racial minorities, and other predictable targets of harassment and discriminatory expressive conduct. More than this: intermediaries today elicit and then algorithmically sort and repurpose the user data that they collect. The most powerful services also leverage their market position to trade this information in ancillary or secondary markets. Moreover, they design their platforms in ways that shape the form and substance of their users’ content. Intermediaries and their defenders characterize these designs as substantively neutral technical necessities, but as I explain below, recent developments involving two of the most prominent beneficiaries of Section 230 immunity, Airbnb and Facebook, suggest otherwise. Airbnb and Facebook have enabled a range of harmful expressive acts, including violations of housing and employment laws, through the ways in which they structure their users’ interactions.

At a minimum, companies should not get a free pass for enabling unlawful discriminatory conduct, regardless of the social value their services may otherwise provide. I argue here that Section 230 doctrine requires a substantial reworking if the internet is to be the great engine of democratic engagement and creativity that it might be. Section 230 is no longer serving all the purposes it was meant to serve. The statute was intended at least in part to ensure the vitality and diversity, as well as the volume, of speech on new communications platforms. By allowing intermediaries to design their platforms without internalizing the costs of the illegal speech and conduct they facilitate, however, the statute is having the opposite effect.

Suggested Citation

Sylvain, Olivier, Discriminatory Designs on User Data (April 6, 2018). Knight First Amendment Institute at Columbia University, Available at SSRN: https://ssrn.com/abstract=3157975

Olivier Sylvain (Contact Author)

Fordham University School of Law

150 West 62nd Street
New York, NY 10023
United States
