Algorithmic Black Swans

68 Pages · Posted: 1 Mar 2023 · Last revised: 17 Oct 2023

Date Written: October 14, 2023


From biased lending algorithms to chatbots that spew violent hate speech, AI systems already pose many risks to society. While policymakers have a responsibility to tackle pressing issues of algorithmic fairness, privacy, and accountability, they also have a responsibility to consider broader, longer-term risks from AI technologies. In public health, climate science, and financial markets, anticipating and addressing societal-scale risks is crucial. As the COVID-19 pandemic demonstrated, overlooking catastrophic tail events — or “black swans” — is costly. The prospect of automated systems manipulating our information environment, distorting societal values, and destabilizing political institutions is increasingly palpable. At present, it appears unlikely that market forces will address this class of risks. Organizations building AI systems do not bear the costs of diffuse societal harms and have limited incentive to install adequate safeguards. Meanwhile, regulatory proposals such as the White House AI Bill of Rights and the European Union AI Act primarily target the immediate risks from AI, rather than broader, longer-term risks. To fill this governance gap, this Article offers a roadmap for “algorithmic preparedness” — a set of five forward-looking principles to guide the development of regulations that confront the prospect of algorithmic black swans and mitigate the harms they pose to society.

Keywords: Artificial Intelligence, Governance, Regulation, General Purpose AI Systems, ChatGPT, Misuse, Systemic Risk, Black Swans, EU AI Act, NIST AI Risk Management Framework, White House AI Bill of Rights

JEL Classification: D18, K24, L86, O31, O33, O38

Suggested Citation

Kolt, Noam, Algorithmic Black Swans (October 14, 2023). Washington University Law Review, Vol. 101, Forthcoming. Available at SSRN.

Noam Kolt (Contact Author)

University of Toronto

105 St George Street
Toronto, Ontario M5S 3G8

