A Model of Pathways to Artificial Superintelligence Catastrophe for Risk and Decision Analysis

Journal of Experimental & Theoretical Artificial Intelligence, vol. 29, no. 2 (2017), pages 397-414

21 pages. Posted 11 Jul 2016; last revised 11 Oct 2017.

Anthony Barrett

Global Catastrophic Risk Institute

Seth Baum (Contact Author)

Global Catastrophic Risk Institute

Date Written: February 20, 2017

Abstract

An artificial superintelligence (ASI) is an artificial intelligence that is significantly more intelligent than humans in all respects. While ASI does not currently exist, some scholars propose that it could be created sometime in the future, and furthermore that its creation could cause a severe global catastrophe, possibly even resulting in human extinction. Given the high stakes, it is important to analyze ASI risk and factor the risk into decisions related to ASI research and development. This paper presents a graphical model of major pathways to ASI catastrophe, focusing on ASI created via recursive self-improvement. The model uses the established risk and decision analysis modeling paradigms of fault trees and influence diagrams to depict combinations of events and conditions that could lead to ASI catastrophe, as well as intervention options that could decrease risks. The events and conditions include select aspects of the ASI itself as well as the human process of ASI research, development, and management. Model structure is derived from published literature on ASI risk. The model offers a foundation for rigorous quantitative evaluation and decision making on the long-term risk of ASI catastrophe.
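To illustrate the fault-tree paradigm the abstract refers to, the sketch below quantifies a toy tree in Python. All event names and probabilities are hypothetical placeholders chosen for illustration; they are not the actual model structure or numbers from Barrett and Baum (2017), whose full tree and parameters appear in the paper itself.

```python
# Minimal sketch of fault-tree quantification, assuming independent basic
# events. The events and probabilities below are hypothetical illustrations,
# NOT the published model of Barrett & Baum (2017).

def and_gate(*probs):
    """Probability that all independent input events occur."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    """Probability that at least one independent input event occurs."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical basic-event probabilities (placeholders only).
p_seed_ai_built = 0.5        # a self-improving "seed" AI is created
p_takeoff_succeeds = 0.3     # recursive self-improvement reaches ASI
p_goals_unsafe = 0.4         # the ASI's goals are not human-compatible
p_safeguards_fail = 0.6      # containment/deterrence measures fail

# Top event: catastrophe requires an unsafe ASI AND failed safeguards,
# where "unsafe ASI" requires creation AND takeoff AND unsafe goals.
p_unsafe_asi = and_gate(p_seed_ai_built, p_takeoff_succeeds, p_goals_unsafe)
p_catastrophe = and_gate(p_unsafe_asi, p_safeguards_fail)

print(f"P(unsafe ASI)  = {p_unsafe_asi:.3f}")
print(f"P(catastrophe) = {p_catastrophe:.3f}")
```

In a full analysis of this kind, each leaf probability would itself be the output of a subtree or an expert elicitation, and intervention options would be evaluated by how they change the leaf values and, in turn, the top-event probability.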

Keywords: risk analysis, decision analysis, artificial intelligence

Suggested Citation

Barrett, Anthony and Baum, Seth, A Model of Pathways to Artificial Superintelligence Catastrophe for Risk and Decision Analysis (February 20, 2017). Journal of Experimental & Theoretical Artificial Intelligence, vol. 29, no. 2 (2017), pages 397-414, Available at SSRN: https://ssrn.com/abstract=2807361

Paper statistics: 102 downloads; 807 abstract views; rank 472,796.