
Disparate Impact in Big Data Policing

87 Pages · Posted: 1 Oct 2016 · Last revised: 20 Feb 2018

Andrew D. Selbst

Data & Society Research Institute; Yale Information Society Project

Date Written: February 25, 2017

Abstract

Data-driven decision systems are taking over. No institution in society seems immune from the enthusiasm that automated decision-making generates, including—and perhaps especially—the police. Police departments are increasingly deploying data mining techniques to predict, prevent, and investigate crime. But all data mining systems have the potential for adverse impacts on vulnerable communities, and predictive policing is no different. Determining individuals’ threat levels by reference to commercial and social data can improperly link dark skin to higher threat levels or to greater suspicion of having committed a particular crime. Crime mapping based on historical data can lead to more arrests for nuisance crimes in neighborhoods primarily populated by people of color. These effects are an artifact of the technology itself, and will likely occur even assuming good faith on the part of the police departments using it. Meanwhile, predictive policing is sold in part as a “neutral” method to counteract unconscious biases when it is not simply sold to cash-strapped departments as a more cost-efficient way to do policing.

The degree to which predictive policing systems have these discriminatory results is unclear to the public and to the police themselves, largely because there is no incentive in place for a department focused solely on “crime control” to spend resources asking the question. This is a problem for which existing law does not provide a solution. Finding that neither the typical constitutional modes of police regulation nor a hypothetical anti-discrimination law would provide a solution, this Article turns toward a new regulatory proposal centered on “algorithmic impact statements.”

Modeled on the environmental impact statements of the National Environmental Policy Act, algorithmic impact statements would require police departments to evaluate the efficacy and potential discriminatory effects of all available choices for predictive policing technologies. The regulation would also allow the public to weigh in through a notice-and-comment process. Such a regulation would fill the knowledge gap that makes future policy discussions about the costs and benefits of predictive policing all but impossible. Being primarily procedural, it would not necessarily curtail a department determined to discriminate, but by forcing departments to consider the question and allowing society to understand the scope of the problem, it is a first step towards solving the problem and determining whether further intervention is required.

Keywords: Civil Rights, Disparate Impact, Discrimination, Big Data, Data Mining, Algorithms, Machine Learning, Policing, Predictive Policing, Fourth Amendment, Criminal Procedure, Administrative Law

Suggested Citation

Selbst, Andrew D., Disparate Impact in Big Data Policing (February 25, 2017). 52 Georgia Law Review 109 (2018). Available at SSRN: https://ssrn.com/abstract=2819182 or http://dx.doi.org/10.2139/ssrn.2819182

Andrew D. Selbst (Contact Author)

Data & Society Research Institute

36 West 20th Street
11th Floor
New York, NY 10011
United States

Yale Information Society Project

127 Wall Street
New Haven, CT 06511
United States
