Law In, Law Out: Legalistic Filter Bubbles and the Algorithmic Prevention of Nonconsensual Pornography

62 Pages
Posted: 2 Sep 2021

Date Written: August 31, 2021

Abstract

In 2019, Facebook announced that it had begun using machine learning algorithms to preemptively screen uploads for nonconsensual pornography. Although the use of screening algorithms has become commonplace, this Article argues that this seemingly minor move from reactive to preemptive, legal analysis–based prevention is part of a groundbreaking shift in the meaning and effect of algorithmic screening, one with potentially far-reaching implications for legal discourse and development.

To flesh out the meaning of this shift, the Article draws on filter bubble theory. Thus far, the phenomenon of filter bubbles has been synonymous with personalized filtering and the social polarization and radicalization it is prone to producing. Generalizing from this idea, the Article suggests that the control filtering algorithms exercise over the information brought before users can shape users’ worldviews in accordance with the algorithm’s measure of relevance. Algorithmic filtering produces this effect by enhancing users’ trust in the applicability of the measure of relevance and by “invisibly hiding” any information that conflicts with it. The Article argues that when a filtering algorithm uses a legal classification as its measure of relevance, the result is a legalistic filter bubble that can essentialize dominant legal paradigms and suppress information that challenges their usefulness and decency. These effects, the Article suggests, can significantly impede legal evolution by driving a wedge between adjudication and the greater normative universe it inhabits.

In the case of filtering algorithms that use the legal category of nonconsent as their measure of relevance, the emergence of a filter bubble will effectively cement nonconsent as the gravamen of violative distribution and insulate decision-makers from exposure to the harms of consensual distribution. Although the Article does not suggest banning the consensual but harmful distribution of sexual materials, it argues that the emergence of a filter bubble can hinder the development of a vibrant normative debate on the meaning of sexual autonomy.

Keywords: Artificial Intelligence, Machine Learning, Content Moderation, Algorithms, Filter Bubble, Consent, Pornography, Sexual Autonomy

Suggested Citation

Maggen, Daniel, Law In, Law Out: Legalistic Filter Bubbles and the Algorithmic Prevention of Nonconsensual Pornography (August 31, 2021). Cardozo Law Review, Vol. 43, 2022. Available at SSRN: https://ssrn.com/abstract=3915143

Daniel Maggen (Contact Author)

Yale University, Law School

127 Wall Street
New Haven, CT 06511
United States


Paper statistics

Downloads: 127
Abstract Views: 685
Rank: 397,596