Right to an Explanation Considered Harmful

10 Pages Posted: 31 May 2019

Andy Crabtree

University of Nottingham

Lachlan Urquhart

University of Edinburgh - School of Law; Horizon Digital Economy Research Institute

Jiahong Chen

University of Nottingham

Date Written: April 8, 2019

Abstract

Lay and professional reasoning has it that newly introduced data protection regulation in Europe – GDPR – mandates a ‘right to an explanation’. This has been read as requiring that the machine learning (ML) community build ‘explainable machines’ to enable legal compliance. In reviewing relevant accountability requirements of GDPR and measures developed within the ML community to enable human interpretation of ML models, we argue that this reading should be considered harmful as it creates unrealistic expectations for the ML community and society at large. GDPR does not require that machines provide explanations, but that data controllers – i.e., human beings – do. We consider the implications of this requirement for the ‘explainable machines’ agenda.

Suggested Citation

Crabtree, Andy and Urquhart, Lachlan and Chen, Jiahong, Right to an Explanation Considered Harmful (April 8, 2019). Edinburgh School of Law Research Paper Forthcoming. Available at SSRN: https://ssrn.com/abstract=3384790 or http://dx.doi.org/10.2139/ssrn.3384790

Andy Crabtree (Contact Author)

University of Nottingham

Jubilee Campus
Wollaton Road
Nottingham, NG8 1BB
United Kingdom

Lachlan Urquhart

University of Edinburgh - School of Law

Old College
South Bridge
Edinburgh, EH8 9YL
United Kingdom

Horizon Digital Economy Research Institute

University of Nottingham Innovation Park
Triumph Road
Nottingham, NG7 2TU
United Kingdom

Jiahong Chen

University of Nottingham

United Kingdom
