Existential Risks & Global Governance Issues Around AI & Robotics
31 Pages. Posted: 15 Aug 2022; last revised: 13 Jun 2023
Date Written: June 15, 2023
There are growing concerns that lethal autonomous weapons systems, artificial general intelligence (or “superintelligence”) and “killer robots” might give rise to new global existential risks. Continuous communication and coordination among countries, developers, professional bodies and other stakeholders is the most important strategy for addressing such risks.
Although global agreements and accords can help address some malicious uses of artificial intelligence (AI) or robotics, proposals calling for control through a global regulatory authority are both unwise and unlikely to work. Calls for bans or “pauses” on AI development are likewise futile because many nations would never agree to forgo developing algorithmic capabilities while adversaries are advancing their own. Therefore, the U.S. government should continue working with other nations to address threatening uses of algorithmic or robotic technologies while simultaneously taking steps to ensure that it possesses the same technological capabilities as adversaries or rogue nonstate actors.
Many nongovernmental international bodies and multinational actors can play an important role as coordinators of national policies and conveners of ongoing deliberation about AI risks and concerns. Soft law (i.e., informal rules, norms and agreements) will also play an important role in addressing AI risks. Professional institutions and nongovernmental bodies have developed important ethical norms and expectations about acceptable uses of algorithmic technologies, and these groups also play an essential role in highlighting algorithmic risks and in ongoing efforts to communicate and coordinate global steps to address them.
Keywords: artificial intelligence, AI, ML, machine learning, robot, robotics, data, innovation, regulation, policy, governance
JEL Classification: O1, O38, L86, L88, L5, K13, K00, K39, O3, O31, O33, M38