Moderating Monopolies
54 Pages · Posted: 14 Sep 2023 · Last revised: 26 Oct 2023
Date Written: September 8, 2023
Abstract
Industrial organization predetermines content moderation online. At the core of today’s dysfunctions in the digital public sphere lies a market power problem. Meta, Google, Apple, and a few other digital platforms control the infrastructure of the digital public sphere. A tiny group of corporations governs online speech, causing systemic problems for public discourse and individual harm to stakeholders. Current approaches to content moderation build on a deeply flawed market structure, addressing symptoms of systemic failures at best and cementing ailments at worst.
Market concentration creates monocultures of communication that are susceptible to systemic failures and raises the stakes of individual content moderation decisions, such as takedowns of posts or bans of individual users. These decisions are inherently prone to error, and their errors are magnified by the platforms’ scale and market power. Platform monopolies also harm individual stakeholders: persistent monopolies lead to higher prices, lower quality, or less innovation. Because platforms’ services include content moderation, degraded services may increase the error rate of takedown decisions and over-expose users to toxic content, misinformation, or harassment. Platform monopolies can also get away with discriminatory and exclusionary conduct more easily because users lack voice and exit opportunities.
Stricter antitrust enforcement is imperative, but contemporary antitrust doctrine alone cannot hope to provide sufficient relief to the digital public sphere. First, a narrowly understood consumer welfare standard overemphasizes easily quantifiable, short-term price effects. Second, the levels of concentration necessary to trigger antitrust scrutiny far exceed those of a market conducive to pluralistic discourse. Third, the requirement of specific anticompetitive conduct, the focal point of current antitrust doctrine, ignores the structural dysfunction that powerful bottlenecks create in public discourse, irrespective of the origins of their power or even its benevolent exercise.
In this Article, I suggest three types of remedies to address the market power problem behind the dysfunctions of the digital public sphere. First, mandating active interoperability between platforms would drastically reduce lock-in effects. Second, scaling back quasi-property exclusivity online would spur follow-on innovation. Third, no-fault liability and broader objectives in antitrust doctrine would establish more effective counterweights to concentrating tendencies in the digital public sphere. While these pro-competitive measures are no panacea for all online woes, they would lower the stakes of inevitable content moderation decisions, incentivize investment in better decision-making processes, and contribute to a healthier, more pluralistic discourse.
Keywords: antitrust, content moderation, platforms