Section 230 and the Fediverse: The 'Instances' of Mastodon's Immunity and Liability
20 Pages · Posted: 2 May 2023 · Last revised: 29 Jan 2024
Date Written: April 17, 2023
Abstract
“Code is law,” as scholar Lawrence Lessig famously put it. Technological architecture defines how speech is expressed, and different social media platforms shape discourse differently by virtue of their designs. Whether by letting users send ‘Friend requests’ on Facebook, capping the character count of Tweets on Twitter, or prescribing the maximum duration of a TikTok video, such platforms shape speech, its limits, its transgressions, and its parlance. Because this space is ever-evolving, platforms with newer technological architectures continue to disrupt how speech is expressed. The newest kid on the block is Mastodon, a decentralized platform on the Fediverse that allows users to run and moderate their own servers (called ‘instances’) on which third-party content can be hosted. By allowing virtually anyone to operate their own equivalent of Facebook, Twitter, or YouTube, Mastodon empowers its users in new ways. Yet with great power comes great responsibility. Such an innovation raises a whole set of questions about Mastodon’s immunity and liability at various levels, particularly under Section 230 of the Communications Decency Act and laws such as FOSTA/SESTA. In this paper, I discuss potential answers to these questions.
Keywords: Section 230, Mastodon, Fediverse, law, policy, immunity, liability, content moderation, platform governance, Communications Decency Act, FOSTA/SESTA, decentralized, social media