The Indiana Journal of Global Legal Studies, Forthcoming
31 Pages
Posted: 14 Mar 2023
Date Written: March 10, 2023
The increasing encroachment of artificial intelligence (AI) on social life raises various risks to society, most prominently in the info-spheres created and controlled by Google, Facebook, Apple, and Amazon. We examine these risks through an in-depth discussion of the Facebook content moderation regime, which is already partially controlled by algorithms. We argue that the idea of ethical engineering, developed in the literature as a solution to the challenge of governing AI, is inadequate for this task. In this article, we develop a different approach to coping with the risks of AI governance, which we term "algorithmic constitutionalism." Our approach rests on three pillars: (a) a layered architecture consisting of two levels of code, (i) an operative or object level and (ii) a meta level, whose purpose is to shield the core principles of the system from algorithm-initiated changes; (b) algorithmic meta-reasoning, which allows the system to operate simultaneously at the two levels, so that it can (self-)monitor, verify, and, where necessary, correct object-level operations in real time if they depart from the principles protected by the meta-level code; and (c) correction by deliberation. We elaborate the idea of algorithmic constitutionalism and demonstrate how it can be applied to the Facebook content moderation regime. As part of this elaboration, we also consider the tension between societal and algorithmic constitutionalism. Paradoxically, the attempt to subject the AI algorithm to external deliberative control also opens the door for the AI agent to intervene in that process, potentially undermining its very purpose. We conclude by exploring the implications of our argument for the new European Digital Services Act, adopted in October 2022.
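To make the two-level architecture described in pillars (a) and (b) more concrete, the following minimal Python sketch illustrates one possible reading: an object-level moderation rule whose decisions are verified in real time by a meta level that guards a set of protected principles and escalates violations to deliberation, as in pillar (c). The identifiers used here (CorePrinciple, MetaLayer, moderate_post) are illustrative assumptions, not an implementation drawn from the article.

```python
# Illustrative sketch only: one way to picture "algorithmic constitutionalism".
# All names and logic are hypothetical placeholders, not the paper's design.

from dataclasses import dataclass
from typing import Callable


@dataclass(frozen=True)  # frozen: the meta level shields principles from change
class CorePrinciple:
    name: str
    check: Callable[[dict], bool]  # True if a decision respects the principle


class MetaLayer:
    """Meta level: monitors object-level decisions and guards core principles."""

    def __init__(self, principles: list[CorePrinciple]):
        self._principles = tuple(principles)  # immutable from the object level's view

    def review(self, decision: dict) -> dict:
        """Meta-reasoning: verify an object-level decision in real time."""
        violated = [p.name for p in self._principles if not p.check(decision)]
        if violated:
            # Correction by deliberation: suspend the decision and escalate it
            return {**decision, "action": "suspend", "escalate_to_deliberation": violated}
        return decision


def moderate_post(post: dict) -> dict:
    """Object level: an operative content-moderation rule (placeholder logic)."""
    action = "remove" if "banned-word" in post["text"] else "keep"
    return {"post_id": post["id"], "action": action, "reason": "keyword filter"}


# Example: the meta level vetoes any removal that lacks a stated reason.
meta = MetaLayer([
    CorePrinciple(
        name="reasoned-decision",
        check=lambda d: d["action"] != "remove" or bool(d.get("reason")),
    )
])

decision = meta.review(moderate_post({"id": 1, "text": "hello banned-word"}))
print(decision)
```

In this toy setup the object level can change its own moderation rules, but the principles held by the meta level sit outside its reach, and any decision that violates them is suspended and routed to an external deliberative process rather than silently corrected.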
Keywords: AI, Algorithmic Governance, Digital Services Act, Societal Constitutionalism
JEL Classification: K10