'It's Reducing a Human Being to a Percentage': Perceptions of Justice in Algorithmic Decisions

Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI'18), DOI: 10.1145/3173574.3173951

14 Pages. Posted: 16 Feb 2018. Last revised: 18 May 2018


Reuben Binns

University of Oxford

Max Van Kleek

University of Oxford - Computing Laboratory

Michael Veale

Alan Turing Institute; University of Birmingham - Birmingham Law School

Ulrik Lyngs

University of Oxford - Computing Laboratory

Jun Zhao

University of Oxford - Computing Laboratory

Nigel Shadbolt

University of Oxford - Computing Laboratory

Date Written: January 31, 2018

Abstract

Data-driven decision-making consequential to individuals raises important questions of accountability and justice. Indeed, European law provides individuals limited rights to 'meaningful information about the logic' behind significant, autonomous decisions such as loan approvals, insurance quotes, and CV filtering. We undertake three experimental studies examining people's perceptions of justice in algorithmic decision-making under different scenarios and explanation styles. Dimensions of justice previously observed in response to human decision-making appear similarly engaged in response to algorithmic decisions. Qualitative analysis identified several concerns and heuristics involved in justice perceptions, including arbitrariness, generalisation, and (in)dignity. Quantitative analysis indicates that explanation styles primarily matter to justice perceptions only when subjects are exposed to multiple different styles; under repeated exposure to one style, scenario effects obscure any explanation effects. Our results suggest there may be no 'best' approach to explaining algorithmic decisions, and that reflection on their automated nature both implicates and mitigates justice dimensions.

Keywords: GDPR, article 22, automated decision-making, algorithmic accountability, right to an explanation, right to explanation, explanation facilities, procedural justice

Suggested Citation

Binns, Reuben and Van Kleek, Max and Veale, Michael and Lyngs, Ulrik and Zhao, Jun and Shadbolt, Nigel, 'It's Reducing a Human Being to a Percentage': Perceptions of Justice in Algorithmic Decisions (January 31, 2018). Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI'18), DOI: 10.1145/3173574.3173951. Available at SSRN: https://ssrn.com/abstract=3114133

Reuben Binns

University of Oxford

Mansfield Road
Oxford, Oxfordshire OX1 4AU
United Kingdom

Max Van Kleek

University of Oxford - Computing Laboratory

Wolfson Building
Parks Road
Oxford, OX1 3QD
United Kingdom

Michael Veale (Contact Author)

Alan Turing Institute

96 Euston Road
London, NW1 2DB
United Kingdom

University of Birmingham - Birmingham Law School

Edgbaston
Birmingham, B15 2TT
United Kingdom

Ulrik Lyngs

University of Oxford - Computing Laboratory

Wolfson Building
Parks Road
Oxford, OX1 3QD
United Kingdom

Jun Zhao

University of Oxford - Computing Laboratory

Wolfson Building
Parks Road
Oxford, OX1 3QD
United Kingdom

Nigel Shadbolt

University of Oxford - Computing Laboratory

Wolfson Building
Parks Road
Oxford, OX1 3QD
United Kingdom
