The Data Protection Impact Assessment as a Tool to Enforce Non-Discriminatory AI

Forthcoming in Springer Proceedings of the Annual Privacy Forum (Lisbon, 2020)

20 Pages | Posted: 21 May 2020


Yordanka Ivanova

Vrije Universiteit Brussel; Sofia University 'St. Kliment Ohridski'

Date Written: February 1, 2020

Abstract

This paper argues that the novel tools introduced by the General Data Protection Regulation (GDPR) may provide an effective, legally binding mechanism for enforcing non-discriminatory AI systems. Building on the relevant guidelines and the broader literature on impact assessments and algorithmic fairness, it proposes a specialized methodological framework for carrying out a Data Protection Impact Assessment (DPIA) that enables controllers to assess and prevent ex ante the risk to the right to non-discrimination, one of the key fundamental rights the GDPR aims to safeguard.

Keywords: EU fundamental rights, Non-discrimination, Data protection, GDPR, DPIA, Algorithmic impact assessment, Algorithmic bias, AI fairness

Suggested Citation

Ivanova, Yordanka, The Data Protection Impact Assessment as a Tool to Enforce Non-Discriminatory AI (February 1, 2020). Forthcoming in Springer Proceedings of the Annual Privacy Forum (Lisbon, 2020), Available at SSRN: https://ssrn.com/abstract=3584219 or http://dx.doi.org/10.2139/ssrn.3584219

Yordanka Ivanova (Contact Author)

Vrije Universiteit Brussel
Brussels, Belgium

Sofia University 'St. Kliment Ohridski'
Bulgaria

