When Speed Kills: Autonomous Weapon Systems, Deterrence, and Stability

27 Pages Posted: 23 May 2019

Michael C. Horowitz

University of Pennsylvania - Department of Political Science

Date Written: May 2, 2019


While the military applications of artificial intelligence (AI) are broad and extend beyond the battlefield, battlefield autonomy, in the form of lethal autonomous weapon systems (LAWS), represents one possible military use of narrow AI. Research and development on LAWS by major powers, middle powers, and non-state actors makes exploring the consequences for the security environment a crucial task. This paper draws on classic research in security studies and examples from military history to assess how LAWS could influence two outcome areas: the development and deployment of systems, including arms races, and the stability of deterrence, including strategic stability, the risk of crisis instability, and wartime escalation. It approaches these questions through the lens of two characteristics of LAWS: the potential for increased operational speed and the potential for decreased human control over battlefield choices. It also examines how these issues interact with the large degree of uncertainty currently surrounding potential AI-based military capabilities, both in the range of what is possible and in the opacity of their programming.

Keywords: robotics, arms races, lethal autonomous weapon systems, deterrence, crisis stability, artificial intelligence

Suggested Citation

Horowitz, Michael C., When Speed Kills: Autonomous Weapon Systems, Deterrence, and Stability (May 2, 2019). Available at SSRN: https://ssrn.com/abstract=3348356 or http://dx.doi.org/10.2139/ssrn.3348356

Michael C. Horowitz (Contact Author)

University of Pennsylvania - Department of Political Science ( email )

Stiteler Hall
Philadelphia, PA 19104
United States
