The Sexist Algorithm

38 Behavioral Sciences & the Law 145 (2019)

Posted: 30 May 2019

Melissa Hamilton

University of Surrey School of Law

Date Written: April 19, 2019


Algorithmic risk assessment tools are informed by scientific research concerning which factors are predictive of recidivism and thus support the evidence‐based practice movement in criminal justice. Automated assessments of individualized risk (low, medium, high) permit officials to make more effective management decisions. Computer-generated algorithms appear to be objective and neutral. But are these algorithms actually fair? The focus herein is on gender equity. Studies confirm that women typically have far lower recidivism rates than men. This differential raises the question of how well algorithmic outcomes fare in terms of predictive parity by gender.
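The predictive-parity question posed above can be stated formally. The following is a sketch in standard algorithmic-fairness notation, not the essay's own formalism: a risk score satisfies predictive parity by gender when, at each risk level, the observed probability of recidivism is the same for men and women.

```latex
% Predictive parity by gender (illustrative notation):
% Y = recidivism outcome, S = assigned risk level, G = gender.
\Pr(Y = 1 \mid S = s, G = \text{male})
  \;=\;
\Pr(Y = 1 \mid S = s, G = \text{female})
\qquad \text{for every risk level } s.
```

When this equality fails, members of one gender assigned a given label reoffend at a different rate than the other, meaning the label does not carry the same meaning across groups.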

This essay reports original research using a large dataset of offenders who were scored on the popular risk assessment tool COMPAS. Findings indicate that COMPAS performs reasonably well at discriminating between recidivists and non‐recidivists for men and women. Nonetheless, COMPAS algorithmic outcomes systemically overclassify women in higher-risk groupings. Multiple measures of algorithmic equity and predictive accuracy are provided to support the conclusion that this algorithm is sexist.
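The kind of overclassification check the abstract describes can be sketched in code. This is a minimal illustration on hypothetical data, not the essay's COMPAS dataset or its actual methodology: it compares observed recidivism rates within the same risk band across genders, which is one common measure of predictive parity.

```python
# Illustrative sketch only: hypothetical records, not the COMPAS data
# analyzed in the essay. Computes the observed recidivism rate within
# each (gender, risk band) cell; unequal rates within a band indicate
# a predictive-parity violation (e.g., overclassification of one group).

from collections import defaultdict

def recidivism_rate_by_band(records):
    """Return {(gender, band): observed recidivism rate}."""
    counts = defaultdict(lambda: [0, 0])  # (gender, band) -> [recidivists, total]
    for gender, band, recidivated in records:
        counts[(gender, band)][0] += int(recidivated)
        counts[(gender, band)][1] += 1
    return {key: recid / total for key, (recid, total) in counts.items()}

# Hypothetical records: (gender, risk band, recidivated within follow-up)
records = [
    ("male", "high", True), ("male", "high", True), ("male", "high", False),
    ("female", "high", True), ("female", "high", False), ("female", "high", False),
]

rates = recidivism_rate_by_band(records)
# If women labelled "high" reoffend less often than men with the same
# label, the tool is overclassifying women relative to their actual risk.
print(rates[("male", "high")])    # 2/3
print(rates[("female", "high")])  # 1/3
```

In this toy example the "high" label corresponds to a two-in-three reoffense rate for men but only one-in-three for women, the pattern the essay reports as systemic overclassification of women.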

Keywords: risk assessment, algorithmic justice, algorithmic fairness, criminal justice, gender, pretrial

JEL Classification: K14, K42, I18, C52

Suggested Citation

Hamilton, Melissa, The Sexist Algorithm (April 19, 2019). 38 Behavioral Sciences & the Law 145 (2019). Available at SSRN.

Melissa Hamilton (Contact Author)

University of Surrey School of Law

United Kingdom
