Transparency of Automated Decisions in the GDPR: An Attempt for Systemisation

50 Pages Posted: 16 Jan 2018 Last revised: 29 Nov 2018

Emre Bayamlıoğlu

Tilburg University - Tilburg Institute for Law, Technology, and Society (TILT)

Date Written: January 7, 2018

Abstract

The study provides a conceptual framework of the transparency requirements arising from the opaque and biased nature of automated decisions, and further explores the compatibility of this framework with the affordances of the transparency-related provisions of the GDPR. In line with this, the section following this introduction starts with the question: what types of automated decisions fall within the scope of the GDPR? Accordingly, section 2 explores what amounts to an automated decision under the Regulation, and how the requirement of "solely automated processing" should be understood. In search of an answer, the section introduces a "regulatory perspective" to serve as the common denominator for systemising the legal and other similarly significant effects of automated decisions as expressed in Article 22/1. In sum, in order to address the issue at the necessary level of generality, automated data-driven systems are approached as decisional processes with a certain regulatory impact.

As the paper ultimately intends to analyse the adequacy of the transparency-related provisions of the GDPR, this logically entails, as an initial step, the definition of a measure, or benchmark, of transparency against which the affordances provided by the Regulation can be tested and compared. To this end, section 3 briefly provides the essential components of a technology-neutral and model-agnostic framework which aims to conceptualise and systemise the transparency requirements engendered by automated decisions, namely the transparency desiderata. Intended as a generic template, the transparency desiderata may be seen as a legal reading of data-driven technologies together with their capacities and affordances. Taking into consideration the legal, economic and technical/computational impediments, section 3 concludes with a theoretical outline of the possible modes of implementation for the transparency desiderata. Overall, the section seeks an answer to the question: what must be made transparent to render automated decisions reviewable, verifiable and justifiable as regulatory processes, either directly by human reason or through human-machine symbiotic mechanisms?

Next, with a view to finding out to what extent the GDPR accommodates the transparency desiderata, sections 4 and 5 analyse the transparency affordances of the provisions specific to automated decisions through a two-pronged methodology. Accordingly, the scope and the implications of these provisions are studied in a dialectical entanglement, though as two different sets of obligations. First, section 4 provides a normative analysis of the relevant provisions formulated in the form of notification and disclosure duties under the "access rights" (Arts. 13, 14 and 15). The second set of transparency-related legal remedies to be analysed under the GDPR is the right not to be subject to automated decisions and its derivatives as formulated in Article 22. Section 5 is devoted to the transparency implications of Article 22; as a novel approach, the rights to human intervention and contestation (Art. 22/3) are treated as a different type of obligation which is complementary to the "access rights" but distinct in nature. Based on the systemic view taken, section 5 elaborates on the possible content and the procedural aspects of the right to contest as provided in Article 22/3, namely the contestation scheme: an initial abstraction to be further developed and refined to accommodate different decision-making domains and methodologies.

Keywords: Algorithmic Transparency, GDPR

Suggested Citation

Bayamlıoğlu, Emre, Transparency of Automated Decisions in the GDPR: An Attempt for Systemisation (January 7, 2018). Available at SSRN: https://ssrn.com/abstract=3097653 or http://dx.doi.org/10.2139/ssrn.3097653

Emre Bayamlıoğlu (Contact Author)

Tilburg University - Tilburg Institute for Law, Technology, and Society (TILT) ( email )

P.O.Box 90153
Prof. Cobbenhagenlaan 221
Tilburg, 5037
Netherlands
