The Biased Algorithm: Evidence of Disparate Impact on Hispanics

25 Pages. Posted: 29 Oct 2018. Last revised: 15 Jul 2019.

Melissa Hamilton

University of Surrey School of Law

Date Written: October 5, 2018


Algorithmic risk assessment holds the promise of reducing mass incarceration while remaining conscious of public safety. Yet presumptions of transparent and fair algorithms may be unwarranted. Critics warn that algorithmic risk assessment may exacerbate inequalities in the criminal justice system's treatment of minorities. Further, advocates of third-party auditing contend that such studies may reveal disparities in how risk assessment tools classify minorities. A recent audit found that a popular risk tool overpredicted recidivism for Blacks.

An equally important minority group deserving of study is Hispanics. The study reported herein examines the risk outcomes of a widely used algorithmic risk tool using a large dataset with a two-year follow-up period. Results reveal cumulative evidence of (a) differential validity and prediction between Hispanics and non-Hispanics and (b) algorithmic unfairness and disparate impact in overestimating the general and violent recidivism of Hispanics.

Keywords: Risk Assessment, Big Data, Algorithmic Bias, Pretrial Release, Minorities, Evidence-Based Practices, Algorithmic Fairness, COMPAS, Criminal Justice, Recidivism

JEL Classification: K14, K42, C10, C20, Z18

Suggested Citation

Hamilton, Melissa, The Biased Algorithm: Evidence of Disparate Impact on Hispanics (October 5, 2018), 56 AM. CRIM. L. REV. 1553 (2019). Available at SSRN:

Melissa Hamilton (Contact Author)

University of Surrey School of Law

United Kingdom