An Education Theory of Fault For Autonomous Systems

2 Notre Dame Journal on Emerging Technologies 33 (2021)

24 Pages
Posted: 28 May 2021
Last revised: 12 Jul 2021

William D Smart

Oregon State University

Cindy Grimm

Oregon State University

Woodrow Hartzog

Northeastern University School of Law and Khoury College of Computer Sciences; Center for Law, Information and Creativity (CLIC); Stanford Law School Center for Internet and Society

Date Written: May 27, 2021

Abstract

Automated systems like self-driving cars and “smart” thermostats are a challenge for fault-based legal regimes like negligence because they have the potential to behave in unpredictable ways. How can people who build and deploy complex automated systems be said to be at fault when they could not have reasonably anticipated the behavior (and thus risk) of their tools?

Part of the problem is that the legal system has yet to settle on the language for identifying culpable behavior in the design and deployment of automated systems. In this article we offer an education theory of fault for autonomous systems: a new way to think about fault for all the relevant stakeholders who create and deploy “smart” technologies. We argue that the most important failures that lead autonomous systems to cause unpredictable harm stem from a lack of communication, clarity, and education among the procurers, developers, and users of these technologies.

In other words, while it is hard to exert meaningful control over automated systems to get them to act predictably, developers and procurers have great control over how much they test these tools and how clearly they articulate their limits to all the other relevant parties. This makes testing and education one of the most legally relevant points of failure when automated systems harm people. When stakeholders recognize a responsibility to test and educate each other, they can reduce foreseeable errors, set more accurate expectations, and make autonomous systems more predictable and safer.

Keywords: artificial intelligence, torts, fault, duty, ai, automation, automated systems, driverless cars, robotics

Suggested Citation

Smart, William D and Grimm, Cindy and Hartzog, Woodrow, An Education Theory of Fault For Autonomous Systems (May 27, 2021). 2 Notre Dame Journal on Emerging Technologies 33 (2021), Available at SSRN: https://ssrn.com/abstract=3854927

William D Smart

Oregon State University ( email )

204 Rogers Hall
Corvallis, OR 97331
United States

HOME PAGE: http://oregonstate.edu/~smartw

Cindy Grimm

Oregon State University ( email )

Bexell Hall 200
Corvallis, OR 97331
United States

Woodrow Hartzog (Contact Author)

Northeastern University School of Law and Khoury College of Computer Sciences ( email )

416 Huntington Avenue
Boston, MA 02115
United States

HOME PAGE: https://www.northeastern.edu/law/faculty/directory/hartzog.html

Center for Law, Information and Creativity (CLIC) ( email )

416 Huntington Avenue
Boston, MA 02115
United States

Stanford Law School Center for Internet and Society ( email )

Palo Alto, CA
United States

HOME PAGE: http://cyberlaw.stanford.edu/profile/woodrow-hartzog
