Affinity Profiling and Discrimination by Association in Online Behavioural Advertising

38 Pages, Posted: 13 Jun 2019

Sandra Wachter

University of Oxford - Oxford Internet Institute

Date Written: May 15, 2019

Abstract

Affinity profiling - grouping people according to their assumed interests rather than solely their personal traits - has become commonplace in the online advertising industry. Online platform providers use online behavioural advertising (OBA) and can infer very sensitive information (e.g. ethnicity, gender, sexual orientation, religious beliefs) about individuals in order to target or exclude certain groups from products and services, or to offer them different prices.
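As a purely illustrative sketch (not drawn from the paper or from any real advertising platform; all names, interest categories, scores, and thresholds below are assumptions), the following Python snippet shows how exclusion from an ad audience can hinge on an inferred affinity score rather than on any trait a user has declared or actually holds:

```python
# Hypothetical sketch of affinity-based ad targeting. All names, interest
# categories, and thresholds are illustrative assumptions, not taken from the
# paper or from any real advertising platform.
from dataclasses import dataclass, field


@dataclass
class UserProfile:
    user_id: str
    # Inferred interest scores (0.0-1.0) derived from browsing behaviour;
    # none of these are traits declared by the user.
    inferred_interests: dict[str, float] = field(default_factory=dict)


def has_affinity(profile: UserProfile, interest: str, threshold: float = 0.7) -> bool:
    """Treat a user as having an 'affinity' once the inferred score crosses a threshold."""
    return profile.inferred_interests.get(interest, 0.0) >= threshold


def build_audience(profiles: list[UserProfile], exclude_affinity: str) -> list[str]:
    """Return user ids eligible for an ad, excluding anyone with the given assumed affinity.

    Exclusion keys purely on inferred interests, so users who do not actually
    belong to the associated group can still be filtered out.
    """
    return [p.user_id for p in profiles if not has_affinity(p, exclude_affinity)]


if __name__ == "__main__":
    users = [
        UserProfile("u1", {"interest_in_lgbtq_media": 0.9}),
        UserProfile("u2", {"interest_in_lgbtq_media": 0.2}),
        UserProfile("u3", {}),  # no behavioural signals at all
    ]
    # The advertiser never sees actual sexual orientation; u1 is excluded
    # solely because of an assumed affinity.
    print(build_audience(users, exclude_affinity="interest_in_lgbtq_media"))
    # -> ['u2', 'u3']
```

In this toy example, u1 is filtered out solely on the basis of an assumed affinity, regardless of whether the inference is accurate, which is the kind of adverse treatment the abstract describes and which the concept of discrimination by association (introduced below) is meant to capture.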

OBA and affinity profiling raise at least three distinct legal issues: privacy, non-discrimination, and group-level protection. Current regulatory frameworks may be ill-equipped to protect sufficiently against all three harms. I first examine several shortfalls of the General Data Protection Regulation (GDPR) concerning the governance of sensitive inferences and profiling. I then show the gaps in EU non-discrimination law in relation to affinity profiling, both in terms of its areas of application (i.e. employment, welfare, goods and services) and the types of attributes and people it protects.

To close some of these potential gaps, I propose the concept of “discrimination by association”. This concept challenges the distinction between assumed interests and personal traits that could otherwise render regulation inapplicable. Discrimination by association protects individuals who experience adverse treatment (e.g. not being shown an advertisement based on assumed gender) without requiring them to be members of the protected group (e.g. ‘women’). Rather, protection is granted solely on the basis of an individual’s association (e.g. assumed affinity) with people who share protected attributes. Both wrongly and accurately classified people who suffer adverse treatment because of their assumed affinities and interests could bring a claim without having to prove group membership or, if they prefer, to “out” themselves (e.g. regarding sexual orientation or religion).

Even if these gaps are closed, challenges remain. Inferential analytics and AI expand the circle of potential victims of undesirable treatment in this context by grouping people according to inferred or correlated similarities and characteristics. These new groups are not accounted for in data protection and non-discrimination law.

I close with policy recommendations to address each of these legal challenges for OBA and affinity profiling.


Keywords: advertising, algorithmic bias, artificial intelligence, data protection, discrimination, discrimination by association, equality, European Union, fairness, General Data Protection Regulation, group privacy, machine learning, non-discrimination law, privacy, profiling

Suggested Citation

Wachter, Sandra, Affinity Profiling and Discrimination by Association in Online Behavioural Advertising (May 15, 2019). Available at SSRN: https://ssrn.com/abstract=3388639 or http://dx.doi.org/10.2139/ssrn.3388639

Sandra Wachter (Contact Author)

University of Oxford - Oxford Internet Institute

1 St. Giles
University of Oxford
Oxford, Oxfordshire OX1 3JS
United Kingdom


Paper statistics

Downloads: 312
Abstract Views: 1,724
Rank: 95,408