The UK Algorithmic Transparency Standard: A Qualitative Analysis of Police Perspectives

30 Pages Posted: 27 Jul 2022

Marion Oswald

University of Northumbria at Newcastle; The Alan Turing Institute

Luke Chambers

affiliation not provided to SSRN

Ellen P. Goodman

Rutgers Law

Pam Ugwudike

University of Southampton

Miri Zilka

University of Cambridge

Date Written: July 7, 2022

Abstract

1. The UK Government’s draft ‘Algorithmic Transparency Standard’ is intended to provide a standardised way for public bodies and government departments to provide information about how algorithmic tools are being used to support decisions. The research discussed in this report was conducted in parallel to the piloting of the Standard by the Cabinet Office and the Centre for Data Ethics and Innovation.
2. We conducted semi-structured interviews with respondents from across UK policing and from commercial bodies involved in policing technologies. Our aim was to explore the implications for police forces of participation in the Standard; to identify rewards, risks and challenges for the police, and areas where the Standard could be improved; and thereby to contribute to the exploration of policy options for expanding participation in the Standard.
3. Algorithmic transparency is both achievable for policing and could bring significant rewards. A key reward of police participation in the Standard is that it provides the opportunity to demonstrate proficient implementation of technology-driven policing, thus enhancing earned trust. Research participants highlighted the public good that could result from the considered use of algorithms.
4. Participants noted, however, a risk that the dangers of policing technology could be misperceived, especially if the use of algorithmic tools were not appropriately compared with the status quo and current methods.
5. Participation in the Standard provides an opportunity to develop increased sharing among police forces of best practices (and things to avoid), and increased thoughtfulness among police force personnel in building and implementing new tools. Research participants were keen for compliance with the Standard to become an integral part of a holistic system to drive reflective practice across policing around the development and deployment of algorithmic technology. This could enable police to learn from each other, facilitate good policy choices and decrease wasted costs. Otherwise, the Standard may come to be regarded as an administrative burden rather than a benefit for policing.
6. Several key areas for amendment and improvement from the perspective of policing were identified in the research. These could improve the Standard for the benefit of all participants. These include a need for clarification of the scope of the Standard, and the stage of project development at which the Standard should apply. It is recommended that consideration be given to a ‘Standard-Lite’ for projects at the pilot or early stages of the development process in order to gain public understanding of new tools and applications. Furthermore, the Standard would benefit from a more substantial glossary (to include relevant policing terms) and additional guidance on the level of detail required in each section and how accuracy rates should be described, justified and explained in order to ensure consistency.
7. The research does not suggest any overriding reason why the Standard should not be applied in policing. Suitable exemptions for sensitive contexts and tradecraft would be required, however, and consideration given to ensuring that forces have the resources to comply with the Standard and to respond to the increased public interest that could ensue. Limiting the scope initially to tools on a defined list (to include the most high-risk tools, such as those that produce individualised risk/predictive scores) could assist in mitigating concerns over sensitive policing capabilities and resourcing. A non-public version of the Standard for sensitive applications and tools could also be considered, which would be available to bodies with an independent oversight function.
8. To support police compliance with the Standard, supplier responsibilities – including appropriate disclosure of algorithmic functionality, data inputs and performance – should be covered in procurement contracts and addressed up front as a mandatory requirement of doing business with the police.
9. As well as contributing to the piloting of the Standard, it is recommended that the findings of this report be considered at NPCC level, by the College of Policing and by the office of the Chief Scientific Advisor for Policing, as new sector-led guidance, best practice and policy are developed.

Keywords: Algorithms, Machine Learning, Police, Transparency, Standards, Law, Regulation, Ethics

JEL Classification: K10

Suggested Citation

Oswald, Marion and Chambers, Luke and Goodman, Ellen P. and Ugwudike, Pam and Zilka, Miri, The UK Algorithmic Transparency Standard: A Qualitative Analysis of Police Perspectives (July 7, 2022). Available at SSRN: https://ssrn.com/abstract=4155549 or http://dx.doi.org/10.2139/ssrn.4155549

Marion Oswald (Contact Author)

University of Northumbria at Newcastle ( email )

Pandon Building
208, City Campus East-1
Newcastle-Upon-Tyne NE1 8ST
United Kingdom

The Alan Turing Institute ( email )

British Library
96 Euston Road
London, NW1 2DB
United Kingdom

Luke Chambers

affiliation not provided to SSRN

Ellen P. Goodman

Rutgers Law ( email )

217 N. 5th Street
Camden, NJ 08102
United States
856-225-6393 (Phone)
856-225-6516 (Fax)

Pam Ugwudike

University of Southampton ( email )

University Rd.
Southampton, Hampshire SO17 1BJ
United Kingdom

Miri Zilka

University of Cambridge

Trinity Ln
Cambridge, CB2 1TN
United Kingdom
