The (Erroneous) Requirement for Human Judgment (and Error) in the Law of Armed Conflict

96 Int'l L. Stud. 26 (2020)

BYU Law Research Paper No. 20-09

34 Pages

Posted: 6 Mar 2020

Eric Talbot Jensen

Brigham Young University School of Law

Date Written: March 3, 2020

Abstract

One of the most intriguing and important discussions in international law today concerns the potential impact of emerging technologies on the law of armed conflict (LOAC), including weapons that incorporate machine learning and/or artificial intelligence. Because one of the likely characteristics of these advanced weapons is the ability to make life-and-death decisions on the battlefield, these discussions have highlighted a fundamental question concerning the LOAC: does the law regulating armed conflict require human input in selecting and engaging targets, or can that decision be made without human input? This article analyzes views expressed by scholars and NGOs, but focuses on views expressed by States, many of which have been publicized as part of the discussions of States Parties to the Certain Conventional Weapons Convention. This analysis makes clear that States have not yet reached a consensus on the legal role of human decision making in LOAC compliance. Given that lack of consensus, one can only conclude that the law does not currently require a human decision for the selection and engagement of targets to be lawful. Though the international community may come to such a decision, it has not yet done so. Therefore, States should continue to research and develop weapons that incorporate machine learning and artificial intelligence, because such weapons offer the promise not only of greater compliance with existing norms and processes, but also of increased opportunities to provide protections in new and creative ways in the future.

Keywords: international law, law of armed conflict, law of war, international humanitarian law, lethal autonomous weapons systems, machine learning, artificial intelligence, certain conventional weapons convention, armed conflict

JEL Classification: K33

Suggested Citation

Jensen, Eric Talbot, The (Erroneous) Requirement for Human Judgment (and Error) in the Law of Armed Conflict (March 3, 2020). 96 Int'l L. Stud. 26 (2020), BYU Law Research Paper No. 20-09, available at SSRN: https://ssrn.com/abstract=3548314 or http://dx.doi.org/10.2139/ssrn.3548314

Eric Talbot Jensen (Contact Author)

Brigham Young University School of Law

504 JRCB
Provo, UT 84602
United States
