Impossible Explanations? Beyond explainable AI in the GDPR from a COVID-19 Use Case Scenario

Proceedings of ACM FAccT. ACM, New York, NY, USA, 2 pages. https://doi.org/10.1145/1234567890

11 Pages Posted: 1 Feb 2021

Ronan Hamon

European Commission Joint Research Center

Henrik Junklewitz

European Commission Joint Research Center

Gianclaudio Malgieri

Universiteit Leiden, eLaw; Vrije Universiteit Brussel (VUB) - Faculty of Law

Paul De Hert

Free University of Brussels (VUB)- LSTS; Tilburg University - Tilburg Institute for Law, Technology, and Society (TILT)

Laurent Beslay

European Commission Joint Research Center

Ignacio Sanchez

European Commission Joint Research Center

Date Written: January 21, 2021

Abstract

Can we achieve an adequate level of explanation for complex machine learning models in high-risk AI applications when applying the EU data protection framework? In this article, we address this question by analysing, from a multidisciplinary point of view, the connection between existing legal requirements for the explainability of AI systems and the current state of the art in the field of explainable AI.

We present a case study of a real-life scenario designed to illustrate the application of an AI-based automated decision-making process to the medical diagnosis of COVID-19 patients. The scenario exemplifies the trend towards increasingly complex machine learning algorithms with growing dimensionality of data and model parameters. Based on this setting, we analyse the challenges of providing human-legible explanations in practice and discuss their legal implications under the General Data Protection Regulation (GDPR).

Although it might appear that the GDPR contemplates a single form of explanation, we conclude that the context in which the decision-making system operates requires that several forms of explanation be considered. We therefore propose to design explanations in multiple forms, depending on: the moment of disclosure of the explanation (either ex ante or ex post); the audience of the explanation (an expert or a data controller versus the final data subject); the level of granularity (such as general, group-based or individual explanations); and the level of risk that the automated decision poses to fundamental rights and freedoms. Consequently, explanations should embrace this multifaceted environment.

Furthermore, we highlight how the current inability of complex machine learning models based on deep learning to establish clear causal links between input data and final decisions limits the provision of exact, human-legible reasons behind specific decisions. This makes the provision of satisfactory, fair and transparent explanations a serious challenge. There are therefore cases where the quality of the possible explanations might not qualify as an adequate safeguard for automated decision-making processes under Article 22(3) GDPR. Accordingly, we suggest that further research should focus on alternative tools in the GDPR (such as algorithmic impact assessments under Article 35 GDPR, or algorithmic lawfulness justifications) that might complement the explanations of automated decision-making.

Suggested Citation

Hamon, Ronan and Junklewitz, Henrik and Malgieri, Gianclaudio and De Hert, Paul and Beslay, Laurent and Sanchez, Ignacio, Impossible Explanations? Beyond explainable AI in the GDPR from a COVID-19 Use Case Scenario (January 21, 2021). Proceedings of ACM FAccT. ACM, New York, NY, USA, 2 pages. https://doi.org/10.1145/1234567890, Available at SSRN: https://ssrn.com/abstract=3774114

Ronan Hamon

European Commission Joint Research Center ( email )

Edificio Expo C
Inca Garcilaso, s/n
Seville, Seville E-41092
Spain

Henrik Junklewitz

European Commission Joint Research Center ( email )

Edificio Expo C
Inca Garcilaso, s/n
Seville, Seville E-41092
Spain

Gianclaudio Malgieri (Contact Author)

Universiteit Leiden, eLaw ( email )

Steenschuur 25
Leiden, 2311
Netherlands

Vrije Universiteit Brussel (VUB) - Faculty of Law ( email )

Brussels
Belgium

HOME PAGE: http://www.vub.ac.be/LSTS/members/malgieri/

Paul De Hert

Free University of Brussels (VUB)- LSTS ( email )

Pleinlaan 2
Brussels, Brabant 1050
Belgium

Tilburg University - Tilburg Institute for Law, Technology, and Society (TILT) ( email )

P.O.Box 90153
Prof. Cobbenhagenlaan 221
Tilburg, 5037
Netherlands

Laurent Beslay

European Commission Joint Research Center ( email )

Edificio Expo C
Inca Garcilaso, s/n
Seville, Seville E-41092
Spain

Ignacio Sanchez

European Commission Joint Research Center ( email )

Edificio Expo C
Inca Garcilaso, s/n
Seville, Seville E-41092
Spain
