The Solution to the Pervasive Bias and Discrimination in the Criminal Justice: Transparent Artificial Intelligence

57 Pages Posted: 29 Mar 2021

Mirko Bagaric

Director of the Evidence-Based Sentencing and Criminal Justice Project, Swinburne University Law School

Jennifer Svilar

affiliation not provided to SSRN

Melissa Bull

Queensland University of Technology

Dan Hunter

Queensland University of Technology

Nigel Stobbs

Queensland University of Technology - Faculty of Law

Date Written: March 2, 2021

Abstract

Algorithms are increasingly used in the criminal justice system for a range of important decisions, including the sentences that should be imposed on offenders, whether offenders should be released early from prison, and the locations where police should patrol. Their use in this domain has been severely criticized on a number of grounds, including that they are inaccurate and that they discriminate against minority groups. Yet algorithms are used widely in many other social endeavors, including flying planes and assessing eligibility for loans and insurance. In fact, most people use algorithms regularly in their day-to-day lives: Google Maps is an algorithm, as are Siri, weather forecasts, and automatic pilots. The criminal justice system is one of the few human activities that has not embraced their use. This Article explains why the criticisms that have been leveled against the use of algorithms in the criminal justice domain are flawed. The manner in which algorithms operate is generally misunderstood. Algorithms are not autonomous machine applications or processes; they are always designed by humans, and hence their capability and efficacy are, like those of all human processes, contingent upon the quality and accuracy of the design process and the manner in which they are implemented. Algorithms can replicate high-level human reasoning, but with the advantage that they process vast amounts of information far more quickly than humans can. Thus, well-designed algorithms overcome all of the criticisms leveled against them. Moreover, because algorithms do not have feelings, their decision-making is far more objective, transparent, and predictable than that of humans. They are the best means of overcoming the pervasive bias and discrimination that exist in all parts of the deeply flawed criminal justice system.

Keywords: Sentencing, artificial intelligence, eliminating discrimination

Suggested Citation

Bagaric, Mirko and Svilar, Jennifer and Bull, Melissa and Hunter, Dan and Stobbs, Nigel, The Solution to the Pervasive Bias and Discrimination in the Criminal Justice: Transparent Artificial Intelligence (March 2, 2021). American Criminal Law Review, Vol. 59, No. 1, Forthcoming , Available at SSRN: https://ssrn.com/abstract=3795911

Mirko Bagaric (Contact Author)

Director of the Evidence-Based Sentencing and Criminal Justice Project, Swinburne University Law School ( email )

Hawthorn
Burwood, Victoria 3000
Australia

Jennifer Svilar

affiliation not provided to SSRN

Melissa Bull

Queensland University of Technology

2 George Street
Brisbane, Queensland 4000
Australia

Dan Hunter

Queensland University of Technology

2 George Street
Brisbane, Queensland 4000
Australia

Nigel Stobbs

Queensland University of Technology - Faculty of Law ( email )

Level 4, C Block Gardens Point
2 George St
Brisbane, QLD 4000
Australia

Paper statistics

Downloads: 91
Abstract Views: 289
Download Rank: 339,013