How Soon is Now? Predicting the Expected Arrival Date of AGI (Artificial General Intelligence)
26 Pages
Posted: 5 Jul 2023
Last revised: 30 Jan 2024
Date Written: June 30, 2023
Abstract
This paper uses economic modelling techniques to predict the expected arrival date of AGI (Artificial General Intelligence). The average predicted date from this analysis is 2041, with a likely range of 2032 to 2048 and an estimated earliest possible arrival date of 2028, i.e. just 5 years away.
The average predicted date for getting “reasonably close” to AGI is earlier: 2034, with a likely range of 2027 to 2041 and an estimated earliest date of 2026, i.e. just 3 years away.
(N.B. Significant global-scale conflict in the interim could delay this process, but probably not by more than 10 years.)
Together, these estimates imply that the period from now to the mid-21st century is likely to see the arrival of AGI, with all the challenges and risks this can bring to humanity, including existential and catastrophic change, which current and not just future generations will have to face and manage.
This study therefore supports the qualitative existential-risk warnings issued in 2023 by key godfathers of AI technology, including Hinton, Hassabis and Bengio. Conversely, it does not support those philosophers who dismiss AI existential-risk claims as overblown or diversionary, and who find insuperable sticking points in abstract questions such as AI consciousness and AI qualia.
Because the estimated arrival of AGI is, according to this analysis, close at hand, falling within the next quarter century to the mid-21st century, it will almost certainly occur before any planetary global-warming emergency takes effect. And because AGI could pose an existential risk to humans, the conclusion and policy implication is that AGI is probably the premier and most imminent potential existential risk to humanity, and therefore the one that needs to be addressed most urgently, and globally, now.
Keywords: AI, AGI, Artificial General Intelligence, prediction models, LLMs, existential risk, precautionary principle, regulation, HAL 9000, 2001: A Space Odyssey, Terminator
JEL Classification: O21, O31, O32, O33, O34, O35, O38, O44