Autonomous Weapon Systems and the Limits of Analogy
The Ethics of Autonomous Weapon Systems (Claire Finkelstein, Duncan MacIntosh & Jens David Ohlin), 2017, Forthcoming
27 Pages. Posted: 13 Aug 2016. Last revised: 4 Oct 2016.
Date Written: August 2, 2016
Most people imagine autonomous weapon systems either as more independent versions of weapons already in use or as some kind of robotic soldier. In many ways, these analogies are useful. Analogies and allusions to popular culture make this new kind of weaponry accessible, identify potential dangers, and support desired narratives. Most importantly from a legal perspective, analogical reasoning helps stretch existing law to cover developing technologies, thereby avoiding law-free zones.
But neither these nor other analogies based on unconventional entities that participate in armed conflict are apt for autonomous weapon systems. Every analogy is false in some way: it will either fail to capture a characteristic or imply the existence of a trait that isn't actually there. All potential analogies — weapon, combatant, child soldier, animal — misrepresent crucial traits of autonomous weapon systems and limit our understanding of the technology, thereby impeding our ability to regulate it properly.
The majority of embodied autonomous weapon systems in use today are appropriately analogized to other weapons and can be regulated accordingly. But as other kinds of autonomous weapon systems are developed and deployed, there is no appropriate analogy and therefore no appropriate legal regime. Instead, as is often the case when law by analogy fails, what is needed is new law — at the very least, new regulations for autonomous weapon systems, but perhaps also a new legal regime for all unconventional warfighters.
Keywords: autonomous weapon systems, international humanitarian law, law of armed conflict, analogy, weapons, combatants, child soldiers, animals, killer robot