Algorithmic Destruction
29 Pages · Posted: 27 Mar 2022 · Last revised: 10 May 2023
Date Written: 2022
Abstract
Contemporary privacy law does not go far enough to protect our privacy interests, particularly where artificial intelligence and machine learning are concerned. While many have written on problems of algorithmic bias or deletion, this article introduces the novel concept of the “algorithmic shadow,” the persistent imprint of data in a trained machine learning model, and uses the algorithmic shadow as a lens through which to view the failures of data deletion in dealing with the realities of machine learning. This article is also the first to substantively critique the novel privacy remedy of algorithmic disgorgement, also known as algorithmic destruction.
What is the algorithmic shadow? Simply put, when you train a machine learning model on a set of specific data, that data leaves an imprint on the resulting model. Even if you later delete data from the training data set, the already-trained model still contains a persistent “shadow” of the deleted data. The algorithmic shadow describes the persistent imprint of the data that has been fed into a machine learning model and used to refine that machine learning system.
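The mechanism can be seen in a minimal sketch, written here in Python with scikit-learn purely for illustration (the library, synthetic data, and record index are the author's hypotheticals, not drawn from the article): deleting a record from the stored training set leaves the already-trained model, and with it the deleted record's imprint, entirely intact.

    # Minimal sketch of the "algorithmic shadow": deleting a record from the
    # stored training data does not alter a model already trained on it.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))                     # hypothetical training data
    y = (X[:, 0] + rng.normal(size=100) > 0).astype(int)

    model = LogisticRegression().fit(X, y)            # model trained on all 100 records
    weights_before = model.coef_.copy()

    # "Delete" one individual's record from the stored data set.
    X_after_deletion = np.delete(X, 42, axis=0)
    y_after_deletion = np.delete(y, 42)

    # The data set no longer contains record 42, but the trained model is
    # untouched: its learned weights, and thus the imprint of the deleted
    # record, persist unless the model itself is retrained or destroyed.
    assert np.array_equal(model.coef_, weights_before)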
The failure of data deletion to resolve the privacy losses caused by algorithmic shadows highlights the ineffectiveness of data deletion as a right and a remedy. Algorithmic destruction (deletion of models or algorithms trained on misbegotten data) has emerged as an alternative, or perhaps supplement, to data deletion. While algorithmic destruction or disgorgement may resolve some of the failures of data deletion, this remedy and potential right is also not without its own drawbacks.
This article has three goals: First, the article introduces and defines the concept of the algorithmic shadow, a novel concept that has so far evaded significant legal scholarly discussion, despite its importance in future discussions of artificial intelligence and privacy law. Second, the article explains why the algorithmic shadow exposes and exacerbates existing problems with data deletion as a privacy right and remedy. Finally, the article examines algorithmic destruction as a potential algorithmic right and algorithmic remedy, comparing it with data deletion, particularly in light of algorithmic shadow harms.