A Consumer Protection Approach to Platform Content Moderation

B. Petkova and T. Ojanen (eds), Fundamental Rights Protection Online: The Future Regulation of Intermediaries, Edward Elgar, 2019, Forthcoming

22 Pages Posted: 26 Jun 2019 Last revised: 2 Jul 2019

Date Written: June 22, 2019


Congress should consider legislation to regulate the content moderation practices of platforms. Failure to act will leave platform users unprotected and will allow other countries, notably the European Union and China, to seize global leadership in yet another area of tech policy. But a law requiring content rules against the most salient kinds of harmful platform content, including hate speech, terrorist material, and disinformation campaigns, would not pass constitutional muster under the First Amendment. In contrast, a consumer protection approach to content moderation might be effective and pass First Amendment scrutiny. The Federal Trade Commission, on its own or with authorization from Congress, could treat the failure to establish and maintain a procedurally adequate content moderation program as an unfair practice. This would effectively require platforms to have a content moderation program in place that contains content rules, enforcement procedures, and due process protections, including disclosure, mechanisms to request reinstatement, and an internal appeals process, but it would not mandate the substance of the platform’s content rules. It could withstand strict First Amendment scrutiny as a narrowly crafted requirement that burdens speech no more than necessary to achieve the compelling government purpose of preventing an unfair trade practice. In addition, or alternatively, the FTC might be authorized to use its deception authority to require platforms to say what they do and do what they say in connection with content moderation programs. The FTC would treat the failure to disclose key elements of a content moderation program as a material omission, and the failure to act in accordance with that program as a deceptive or misleading practice. Its First Amendment defense would rest on the compelling government interest in preventing consumer deception. The unfairness version would be more effective but less likely to survive a constitutional challenge. The pure disclosure version would be less effective but more likely to be found consistent with current First Amendment jurisprudence. One additional advantage of this consumer protection approach is that it does not require controversial modification of Section 230 immunities for platforms.

Suggested Citation

MacCarthy, Mark, A Consumer Protection Approach to Platform Content Moderation (June 22, 2019), in B. Petkova and T. Ojanen (eds), Fundamental Rights Protection Online: The Future Regulation of Intermediaries, Edward Elgar, 2019, Forthcoming. Available at SSRN: https://ssrn.com/abstract=3408459 or http://dx.doi.org/10.2139/ssrn.3408459

Mark MacCarthy (Contact Author)

Georgetown University ( email )

3520 Prospect St NW
Suite 311
Washington, DC 20057
United States
