Algorithmic Impact Assessments under the GDPR: Producing Multi-layered Explanations

International Data Privacy Law, 2020, forthcoming.

U of Colorado Law Legal Studies Research Paper No. 19-28

29 Pages · Posted: 6 Oct 2019 · Last revised: 12 Oct 2020


Margot E. Kaminski

University of Colorado Law School; Yale University - Yale Information Society Project; University of Colorado at Boulder - Silicon Flatirons Center for Law, Technology, and Entrepreneurship

Gianclaudio Malgieri

Universiteit Leiden, eLaw; Vrije Universiteit Brussel (VUB) - Faculty of Law

Date Written: September 18, 2019

Abstract

Policy-makers, scholars, and commentators are increasingly concerned with the risks of using profiling algorithms and automated decision-making. The EU’s General Data Protection Regulation (GDPR) has tried to address these concerns through an array of regulatory tools. As one of us has argued, the GDPR combines individual rights with systemic governance in pursuit of algorithmic accountability. The individual tools are largely geared towards individual “legibility”: making the decision-making system understandable to an individual invoking her rights. The systemic governance tools, by contrast, focus on bringing expertise and oversight into the system as a whole, relying on the tactics of “collaborative governance,” that is, the use of public-private partnerships towards these goals. How these two approaches to transparency and accountability interact remains a largely unexplored question, with much of the legal literature focusing instead on whether there is an individual right to explanation.

The GDPR contains an array of systemic accountability tools. Of these tools, impact assessments (Art. 35) have recently received particular attention on both sides of the Atlantic, as a means of implementing algorithmic accountability at early stages of design, development, and training. The aim of this paper is to address how a Data Protection Impact Assessment (DPIA) links the two faces of the GDPR’s approach to algorithmic accountability: individual rights and systemic collaborative governance. We address the relationship between DPIAs and individual transparency rights. We propose, too, that impact assessments link the GDPR’s two methods of governing algorithmic decision-making by both providing systemic governance and serving as an important “suitable safeguard” (Art. 22) of individual rights.

After noting the potential shortcomings of DPIAs, this paper closes with a call — and some suggestions — for a Model Algorithmic Impact Assessment in the context of the GDPR. Our examination of DPIAs suggests that the current focus on the right to explanation is too narrow. We call, instead, for data controllers to consciously use the required DPIA process to produce what we call “multi-layered explanations” of algorithmic systems. This concept of multi-layered explanations not only more accurately describes what the GDPR is attempting to do, but also normatively better fills potential gaps between the GDPR’s two approaches to algorithmic accountability.

Keywords: DPIA, Algorithmic Impact Assessment, GDPR, Automated Decision Making, ADM, AI, Right to Explanation

Suggested Citation

Kaminski, Margot E. and Malgieri, Gianclaudio, Algorithmic Impact Assessments under the GDPR: Producing Multi-layered Explanations (September 18, 2019). International Data Privacy Law, 2020, forthcoming; U of Colorado Law Legal Studies Research Paper No. 19-28. Available at SSRN: https://ssrn.com/abstract=3456224 or http://dx.doi.org/10.2139/ssrn.3456224

Margot E. Kaminski (Contact Author)

University of Colorado Law School

401 UCB
Boulder, CO 80309
United States

Yale University - Yale Information Society Project

127 Wall Street
New Haven, CT 06511
United States

University of Colorado at Boulder - Silicon Flatirons Center for Law, Technology, and Entrepreneurship

Wolf Law Building
2450 Kittredge Loop Road
Boulder, CO
United States

Gianclaudio Malgieri

Universiteit Leiden, eLaw

Steenschuur 25
Leiden, 2311
Netherlands

Vrije Universiteit Brussel (VUB) - Faculty of Law

Brussels
Belgium

HOME PAGE: http://www.vub.ac.be/LSTS/members/malgieri/


Paper statistics

Downloads: 2,555
Abstract Views: 14,437
Rank: 11,460