The Implications of Section 230 for Black Communities
66 Wm. & Mary L. Rev. __ (forthcoming Oct. 2024)
61 Pages · Posted: 6 Jun 2024
Date Written: June 03, 2024
Abstract
Section 230 of the Communications Decency Act generally immunizes online platforms such as Facebook, YouTube, Amazon, and Uber from liability for third-party user content (e.g., posts, comments, videos) and for moderation of that content. This article addresses an important issue overlooked by both defenders and critics of Section 230: the implications of the law and proposed reforms for Black communities. By relieving tech platforms of most legal liability for third-party content, Section 230 helps facilitate Black social activism, entrepreneurship, and artistic creativity. Section 230 also relieves platforms of most legal liability for content moderation, which bolsters platforms’ freedom to remove or downrank unlawful activity, as well as an array of “lawful but awful” content that government is constitutionally unable to restrict—such as hate speech, white supremacist organizing, medical disinformation, and political disinformation. Unfortunately, however, platforms’ overly broad interpretations of Section 230 also give platforms incentives to allow unlawful activity directed at Black communities, such as harassment, white supremacist violence, voter intimidation, and housing and employment discrimination, and foreclose legal recourse when platforms erroneously downrank Black content. These insights supply factors that can help policymakers assess whether proposed Section 230 reforms—such as notice-and-takedown, content moderation neutrality, and carve-outs from immunity for algorithmic recommendations and advertisements—will benefit or harm Black communities.
Keywords: Section 230, Communications Decency Act, social media, internet regulation, content moderation, platform liability, Section 230 immunity, free speech, First Amendment, misinformation, disinformation