Accountability in Algorithmic Copyright Enforcement

61 Pages. Posted: 20 May 2015. Last revised: 17 May 2017.

Maayan Perel (Filmar)

Netanya Academic College

Niva Elkin-Koren

Tel-Aviv University - Faculty of Law

Date Written: February 21, 2016

Abstract

Recent years have seen a growing use of algorithmic law enforcement by online intermediaries. Because they facilitate the distribution of online content, online intermediaries offer a natural point of control for monitoring access to illegitimate content, which makes them ideal partners for performing civil and criminal enforcement. Copyright law has been at the forefront of algorithmic law enforcement since the late 1990s, when the Digital Millennium Copyright Act (DMCA) conferred safe harbor protection on online intermediaries that remove allegedly infringing content upon notice. Over the past two decades, the Notice and Takedown (N&TD) regime has become ubiquitous and embedded in the system design of all major intermediaries: major copyright owners increasingly deploy robots to send immense volumes of takedown requests, and major online intermediaries, in response, use algorithms to filter, block, and disable access to allegedly infringing content automatically, with little or no human intervention.

Algorithmic enforcement by online intermediaries reflects a fundamental shift in our traditional system of governance. It effectively converges law enforcement and adjudication powers in the hands of a small number of mega-platforms, which are profit-maximizing, and possibly biased, private entities. Yet notwithstanding their critical role in shaping access to online content and facilitating public discourse, intermediaries are hardly held accountable for algorithmic enforcement. We simply do not know which allegedly infringing material triggers the algorithms, how decisions regarding content restrictions are made, who makes those decisions, and how targeted users might affect them.

Algorithmic copyright enforcement by online intermediaries thus offers a valuable case study for addressing these concerns. As we demonstrate, it lacks sufficient measures to ensure accountability, namely, the extent to which decision makers are expected to justify their choices, are answerable for their actions, and are held responsible for their failures and wrongdoings. This Article proposes a novel framework for analyzing accountability in algorithmic enforcement based on three factors: transparency, due process, and public oversight. It identifies the accountability deficiencies in algorithmic copyright enforcement and maps the barriers to enhancing accountability, including the technical barriers of non-transparency and machine learning, legal barriers that hinder the development of algorithmic literacy, and practical barriers. Finally, the Article explores current and possible strategies for enhancing accountability by increasing public scrutiny and promoting transparency in algorithmic copyright enforcement.

Keywords: copyright, DMCA, enforcement, algorithm, machine learning, accountability, due process, transparency, UGC, online intermediaries, public sphere, free speech, filtering, Content ID, YouTube, Google, Facebook

Suggested Citation

Perel (Filmar), Maayan and Elkin-Koren, Niva, Accountability in Algorithmic Copyright Enforcement (February 21, 2016), 19 Stan. Tech. L. Rev. 473 (2016). Available at SSRN: https://ssrn.com/abstract=2607910 or http://dx.doi.org/10.2139/ssrn.2607910

Maayan Perel (Filmar) (Contact Author)

Netanya Academic College ( email )

1 University Street
Netanya, 31905
Israel

Niva Elkin-Koren

Tel-Aviv University - Faculty of Law ( email )

Ramat Aviv
Tel Aviv, 6997801
Israel

Paper statistics

Downloads: 811
Abstract Views: 4,136
Rank: 53,215