The Biased Algorithm: Evidence of Disparate Impact on Hispanics
25 Pages. Posted: 29 Oct 2018. Last revised: 15 Jul 2019.
Date Written: October 5, 2018
Abstract
Algorithmic risk assessment holds the promise of reducing mass incarceration while remaining attentive to public safety. Yet the presumption that such algorithms are transparent and fair may be unwarranted. Critics warn that algorithmic risk assessment may exacerbate inequalities in the criminal justice system’s treatment of minorities, and calls for third-party auditing contend that independent studies may reveal disparities in how risk assessment tools classify minority defendants. A recent audit found that a popular risk tool overpredicted recidivism for Black defendants.
Hispanics are an equally important minority group deserving of study. The study reported herein examines the risk outcomes of a widely used algorithmic risk tool using a large dataset with a two-year follow-up period. Results reveal cumulative evidence of (a) differential validity and prediction between Hispanics and non-Hispanics and (b) algorithmic unfairness and disparate impact in the overestimation of the general and violent recidivism of Hispanics.
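As a purely illustrative sketch rather than the authors' method, differential validity and disparate impact of the kind described above are commonly probed by comparing discrimination (e.g., AUC) and error rates across groups. The hypothetical Python snippet below assumes a scored dataset with columns `risk_score`, `recidivated`, and `hispanic` (all names and the cutoff are assumptions for illustration, not taken from the paper).

```python
# Illustrative sketch only: column names, cutoff, and data are hypothetical,
# not drawn from the study described in the abstract.
import pandas as pd
from sklearn.metrics import roc_auc_score

def group_fairness_summary(df, score="risk_score", outcome="recidivated",
                           group="hispanic", high_risk_cutoff=7):
    """Compare predictive validity and error rates across groups."""
    rows = []
    for label, g in df.groupby(group):
        high = g[score] >= high_risk_cutoff  # cases classified as high risk
        non_recidivists = (g[outcome] == 0)
        # False positive rate: non-recidivists labeled high risk (disparate impact signal)
        fpr = (high & non_recidivists).sum() / max(non_recidivists.sum(), 1)
        rows.append({
            group: label,
            "n": len(g),
            "auc": roc_auc_score(g[outcome], g[score]),  # differential validity
            "false_positive_rate": fpr,
        })
    return pd.DataFrame(rows)

# Example usage with a hypothetical scored data frame:
# print(group_fairness_summary(scored))
```

A gap in AUC between groups would indicate differential validity, while a higher false positive rate for one group at the same cutoff is one common operationalization of overestimation and disparate impact; calibration comparisons would additionally require scores on a probability scale.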
Keywords: Risk Assessment, Big Data, Algorithmic Bias, Pretrial Release, Minorities, Evidence-Based Practices, Algorithmic Fairness, COMPAS, Criminal Justice, Recidivism
JEL Classification: K14, K42, C10, C20, Z18