Autonomous Weapons Systems: Taking the Human Out of International Humanitarian Law

41 Pages · Posted: 10 Jul 2013 · Last revised: 21 Apr 2014

James G. Foy

Dalhousie University - Schulich School of Law

Date Written: April 20, 2013

Abstract

Once confined to science fiction, killer robots will soon be a reality. Both the USA and the UK are currently developing weapons systems that could be capable of autonomously targeting and killing enemy combatants. These capabilities may be only 25 years away. Under Additional Protocol I to the Geneva Conventions and customary international law, weapons systems must be capable of operating within the principles of International Humanitarian Law (IHL). This paper demonstrates that, without significant restrictions on the use of Autonomous Weapons Systems (AWS) or the creation of a new legal framework, the use of AWS is problematic. First, there are legitimate concerns that AWS are, by their nature, incapable of adhering to IHL principles. Second, there is a more fundamental problem: the principles of IHL are insufficient to address the unique concerns raised by AWS. Finally, the solutions proposed by proponents of AWS do not adequately address these concerns. A legal solution beyond the general principles of IHL must be developed.

Keywords: International Humanitarian Law, Autonomous Weapons Systems, drones, robot soldiers, robot weapon, robotics, lethal autonomous robots

Suggested Citation

Foy, James G., Autonomous Weapons Systems: Taking the Human Out of International Humanitarian Law (April 20, 2013). Available at SSRN: https://ssrn.com/abstract=2290995 or http://dx.doi.org/10.2139/ssrn.2290995

James G. Foy (Contact Author)

Dalhousie University - Schulich School of Law ( email )

6061 University Avenue
Halifax, Nova Scotia B3H 4H9
Canada

Paper statistics

Downloads: 177
Rank: 135,919
Abstract Views: 573