Slave to the Algorithm? Why a 'Right to an Explanation' Is Probably Not the Remedy You Are Looking For

67 Pages · Posted: 24 May 2017 · Last revised: 6 Dec 2017

Lilian Edwards

University of Newcastle - Law School

Michael Veale

University College London, Faculty of Laws

Date Written: May 23, 2017

Abstract

Algorithms, particularly machine learning (ML) algorithms, are increasingly important to individuals’ lives, but have caused a range of concerns revolving mainly around unfairness, discrimination and opacity. Transparency in the form of a “right to an explanation” has emerged as a compellingly attractive remedy since it intuitively promises to open the algorithmic “black box” to promote challenge, redress, and hopefully heightened accountability. Amidst the general furore over algorithmic bias we describe, any remedy in a storm has looked attractive.

However, we argue that a right to an explanation in the EU General Data Protection Regulation (GDPR) is unlikely to present a complete remedy to algorithmic harms, particularly in some of the core “algorithmic war stories” that have shaped recent attitudes in this domain. Firstly, the law is restrictive, unclear, or even paradoxical concerning when any explanation-related right can be triggered. Secondly, even navigating this, the legal conception of explanations as “meaningful information about the logic of processing” may not be provided by the kind of ML “explanations” computer scientists have developed, partially in response. ML explanations are restricted by the type of explanation sought, the dimensionality of the domain, and the type of user seeking an explanation. However, “subject-centric” explanations (SCEs), which focus on particular regions of a model around a query, show promise for interactive exploration, as do explanation systems based on learning a model from outside rather than taking it apart (pedagogical versus decompositional explanations), which sidestep developers’ worries about disclosure of intellectual property or trade secrets.

Based on our analysis, we fear that the search for a “right to an explanation” in the GDPR may be at best distracting, and at worst nurture a new kind of “transparency fallacy.” But all is not lost. We argue that other parts of the GDPR related (i) to the right to erasure ("right to be forgotten") and the right to data portability; and (ii) to privacy by design, Data Protection Impact Assessments and certification and privacy seals, may have the seeds we can use to make algorithms more responsible, explicable, and human-centered.

Keywords: algorithms, machine learning, GDPR, data protection, data mining, KDD, explanations, right to an explanation, computer science, IoT, transparency

Suggested Citation

Edwards, Lilian and Veale, Michael, Slave to the Algorithm? Why a 'Right to an Explanation' Is Probably Not the Remedy You Are Looking For (May 23, 2017). 16 Duke Law & Technology Review 18 (2017). Available at SSRN: https://ssrn.com/abstract=2972855 or http://dx.doi.org/10.2139/ssrn.2972855

Lilian Edwards (Contact Author)

University of Newcastle - Law School

Newcastle upon Tyne, NE1 7RU
United Kingdom

Michael Veale

University College London, Faculty of Laws

Bentham House
4-8 Endsleigh Gardens
London, WC1E 0EG
United Kingdom

Paper statistics

Downloads: 8,978
Abstract Views: 45,122
Rank: 1,232