Automated Individual Decisions to Disclose Personal Data: Why GDPR Article 22 Should Not Apply

15 Pages • Posted: 10 Jul 2020

Mike Hintze

Hintze Law PLLC; University of Washington School of Law; Future of Privacy Forum

Date Written: June 18, 2020

Abstract

"The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her." This opening paragraph of Article 22 of the EU General Data Protection Regulation (GDPR) describes the types of automated decision-making that are subject to special restrictions and requirements under the GDPR. But despite incorporating principles that already existed in European data protection law, and despite guidance having been published by European data protection authorities, Article 22 continues to be a source of great uncertainty.

Organizations of all types are increasingly adopting the tools of machine learning and artificial intelligence in a variety of applications. Such organizations must determine when and how the Article 22 restrictions on automated decision-making apply. Whether Article 22 applies broadly or narrowly will have dramatic impacts on a wide range of organizations. An overly broad interpretation will likely result in less efficient and more costly operations and could lead to less accurate and reliable decision-making. A narrower interpretation, on the other hand, will apply the unique protections afforded by Article 22 to the type of decision-making for which those protections will be meaningful, while other types of decision-making continue to be subject to the full range of protections that other provisions of the GDPR afford.

This paper will provide an overview of Article 22 and will examine several considerations that are important for determining its scope. It will argue that the scope of automated decision-making regulated by Article 22 is quite narrow, limited to those solely automated decisions where a legal or similarly significant effect is an inherent and direct result of the decision and where human intervention could be helpful and meaningful in protecting individual rights.

It will then use those considerations and conclusions to discuss one type of automated decision-making and determine whether it should be regulated by Article 22. Specifically, it will ask whether an automated decision to disclose personal data to a third party is subject to Article 22. This paper will demonstrate that in most cases, if not virtually every case, automated decisions to disclose personal data are not of the type contemplated by, or appropriate for, the application of Article 22. While acknowledging that any decision to disclose personal data raises significant privacy and data protection concerns, this paper will argue that other protections afforded by the GDPR are better suited to address those concerns and safeguard individual rights. Thus, this paper will conclude that, as a general matter, Article 22 should be interpreted to exclude such automated decisions.

Keywords: privacy, data protection, GDPR, automated decision-making, profiling, machine learning, artificial intelligence, AI

Suggested Citation

Hintze, Michael, Automated Individual Decisions to Disclose Personal Data: Why GDPR Article 22 Should Not Apply (June 18, 2020). Available at SSRN: https://ssrn.com/abstract=3630026 or http://dx.doi.org/10.2139/ssrn.3630026

Michael Hintze (Contact Author)

Hintze Law PLLC

505 Broadway E #151
Seattle, WA 98102
United States

University of Washington School of Law

William H. Gates Hall
Box 353020
Seattle, WA 98105-3020
United States

Future of Privacy Forum

United States

Paper statistics

Downloads: 252
Abstract Views: 906
Rank: 155,213