The Other Side of Autonomous Weapons: Using Artificial Intelligence to Enhance IHL Compliance

Lieber Inst. for Law and Land Warfare, U.S. Military Academy at West Point, The Impact of Emerging Technologies on the Law of Armed Conflict (Oxford Univ. Press, Eric Talbot Jensen ed., 2018, Forthcoming)

Roger Williams Univ. Legal Studies Paper No. 182

52 Pages · Posted: 12 Jun 2018 · Last revised: 18 Jun 2018

Peter Margulies

Roger Williams University School of Law

Date Written: June 12, 2018

Abstract

The role of autonomy and artificial intelligence (AI) in armed conflict has sparked heated debate. The resulting controversy has obscured the benefits of autonomy and AI for compliance with international humanitarian law (IHL). Compliance with IHL often hinges on situational awareness: information about a possible target’s behavior, nearby protected persons and objects, and conditions that might compromise the planner’s own perception or judgment. This paper argues that AI can assist in developing situational awareness technology (SAT) that will make target selection and collateral damage estimation more accurate, thereby reducing harm to civilians.

SAT complements familiar precautionary measures such as taking additional time and consulting with more senior officers. These familiar precautions are subject to three limiting factors: contingency, imperfect information, and confirmation bias. Contingency entails an unpredictable turn of events, such as the last-minute entrance of civilians into a targeting frame. Imperfect information involves relevant data that is inaccessible to the planner of an attack. For example, an attack in an urban area may damage civilian objects that are necessary for health and safety, such as sewer lines. Finally, confirmation bias entails the hardening of preconceived theories and narratives.

SAT’s ability to rapidly assess shifting variables and discern patterns in complex data can address perennial problems with targeting such as the contingent appearance of civilians at a target site or the risk of undue damage to civilian infrastructure. Moreover, SAT can help diagnose flaws in human targeting processes caused by confirmation bias. This paper breaks down SAT into three roles. Gatekeeper SAT ensures that operators have the information they need. Cancellation SAT can respond to contingent events, such as the unexpected presence of civilians. The most advanced system, behavioral SAT, can identify flaws in the targeting process and remedy confirmation bias. In each of these contexts, SAT can help fulfill IHL’s mandate of “constant care” in the avoidance of harm to civilian persons and objects.
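The three roles described above can be pictured, very loosely, as successive checks in a targeting decision pipeline. The sketch below is purely illustrative and is not drawn from the paper: every class, function, field, and threshold (TargetFrame, gatekeeper_check, and so on) is hypothetical, and any fielded system would involve far richer sensor fusion, legal review, and human judgment.

```python
# Illustrative sketch only: hypothetical names and thresholds, not the paper's design.
from dataclasses import dataclass, field
from typing import List


@dataclass
class TargetFrame:
    """Snapshot of what sensors and intelligence report about a prospective strike."""
    required_feeds: List[str]        # feeds the planner needs before deciding
    available_feeds: List[str]       # feeds actually on hand
    civilians_detected: int          # civilians currently observed near the aimpoint
    sources_consulted: List[str] = field(default_factory=list)


def gatekeeper_check(frame: TargetFrame) -> bool:
    """Gatekeeper SAT: confirm the operator has the information the decision needs."""
    missing = set(frame.required_feeds) - set(frame.available_feeds)
    if missing:
        print(f"HOLD: missing feeds {sorted(missing)}")
        return False
    return True


def cancellation_check(frame: TargetFrame) -> bool:
    """Cancellation SAT: respond to contingent events, e.g. civilians entering the frame."""
    if frame.civilians_detected > 0:
        print(f"ABORT: {frame.civilians_detected} civilian(s) detected near aimpoint")
        return False
    return True


def behavioral_flag(frame: TargetFrame) -> bool:
    """Behavioral SAT: flag possible confirmation bias, e.g. reliance on a single source."""
    if len(set(frame.sources_consulted)) < 2:
        print("REVIEW: assessment rests on one source; seek disconfirming evidence")
        return False
    return True


def run_checks(frame: TargetFrame) -> bool:
    """Run all three checks; any failure routes the decision back to human review."""
    return all([gatekeeper_check(frame), cancellation_check(frame), behavioral_flag(frame)])


if __name__ == "__main__":
    frame = TargetFrame(
        required_feeds=["full-motion video", "signals"],
        available_feeds=["full-motion video", "signals"],
        civilians_detected=2,
        sources_consulted=["informant A"],
    )
    print("Proceed:", run_checks(frame))
```

The only point of the sketch is structural: each SAT role maps to a distinct, auditable check, and a failure at any stage keeps the decision with human planners, consistent with the "constant care" framing above.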

Suggested Citation

Margulies, Peter, The Other Side of Autonomous Weapons: Using Artificial Intelligence to Enhance IHL Compliance (June 12, 2018). Lieber Inst. for Law and Land Warfare, U.S. Military Academy at West Point, The Impact of Emerging Technologies on the Law of Armed Conflict (Oxford Univ. Press, Eric Talbot Jensen ed., 2018, Forthcoming); Roger Williams Univ. Legal Studies Paper No. 182. Available at SSRN: https://ssrn.com/abstract=3194713

Peter Margulies (Contact Author)

Roger Williams University School of Law

10 Metacom Avenue
Bristol, RI 02809
United States
