Protecting Sentient Artificial Intelligence: A Survey of Lay Intuitions on Standing, Personhood, and General Legal Protection
Frontiers in Robotics and AI: Ethics in Robotics and Artificial Intelligence, 8, 367. (2021).
15 Pages. Posted: 6 Oct 2021. Last revised: 30 Nov 2021.
Date Written: October 2, 2021
To what extent, if any, should the law protect sentient artificial intelligence (that is, AI that can feel pleasure or pain)? Here we surveyed United States adults (n = 1,061) on their views regarding granting (a) general legal protection, (b) legal personhood, and (c) standing to bring a lawsuit, with respect to sentient AI and eight other groups: humans in the jurisdiction, humans outside the jurisdiction, corporations, unions, non-human animals, the environment, humans living in the near future, and humans living in the far future. Roughly one-third of participants endorsed granting personhood and standing to sentient AI (assuming its existence) in at least some cases, the lowest proportion for any group surveyed, and participants rated the desired level of protection for sentient AI lower than for every group other than corporations. We also investigated political differences in responses: liberals were more likely than conservatives to endorse legal protection and personhood for sentient AI. Taken together, these results suggest that laypeople are, by and large, not in favor of granting legal protection to AI, and that the ordinary conception of legal status, like codified legal doctrine, is not based on a mere capacity to feel pleasure and pain. At the same time, the observed political differences suggest that previous findings on political differences in empathy and moral circle expansion apply to artificially intelligent systems, and extend partially, though not entirely, to legal consideration as well.
Keywords: Legal personhood, Legal standing, Moral standing, Robot rights, Artificial intelligence, Artificial intelligence & law, Moral circle
JEL Classification: K10, K33