Justice by Algorithm: Are Artificial Intelligence Risk Assessment Tools Biased Against Minorities?
Michael Conklin & Jun Wu, Justice by Algorithm: Are Artificial Intelligence Risk Assessment Tools Biased Against Minorities?, ___ S. J. OF POL’Y & JUST. ___ (2021 Forthcoming)
10 Pages | Posted: 8 Jul 2021 | Last revised: 27 Dec 2021
Date Written: June 30, 2021
Abstract
This is a review of Katherine B. Forrest’s new book When Machines Can Be Judge, Jury, and Executioner. The book does an excellent job discussing issues of fairness and racial disparities arising from the use of artificial intelligence risk assessment tools (hereinafter “AI”) for decisions such as pretrial release and likelihood of recidivism. This is a timely topic, as the technology is currently at a tipping point: while Europe has begun to implement protections for defendants regarding AI, the U.S. is increasing its reliance on AI without such safeguards. This review discusses how AI predictions compare to those of human judges, fairness and racial outcomes, how recidivism is frequently misunderstood and why that matters, how human decisions are inextricably intertwined with AI, and the proper understanding of an AI’s “error rate.”
Keywords: artificial intelligence, AI, sentencing, bail, pre-trial release, criminal law, criminal justice, criminal procedure