Deplatforming and the Control of Misinformation: Evidence from Parler

30 Pages · Posted: 10 Oct 2022 · Last revised: 2 Mar 2023

Saharsh Agarwal

Indian School of Business

Uttara M Ananthakrishnan

University of Washington, Michael G. Foster School of Business

Catherine E. Tucker

Massachusetts Institute of Technology (MIT) - Management Science (MS)

Date Written: September 29, 2022

Abstract

Moderation of problematic individuals and communities, particularly those rife with misinformation, has become a significant policy challenge in recent times. Social media platforms wield significant editorial control over their users' content, but do not always have the incentives or technological capabilities to moderate harmful content. Yet these platforms rely on third-party technology infrastructure providers, who are increasingly intervening in content moderation decisions by withdrawing support from platforms that fail to self-regulate. Despite the rise in such 'deplatforming' initiatives, little is known about their spillover effects outside the focal platform. In January 2021, Parler, a social media platform, was deplatformed by its web and app hosting services for its purported role in fomenting the Capitol riots. Using detailed user-level mobile usage data, we demonstrate that following the deplatforming, a section of Parler users migrated to Telegram, an alternative 'free-speech' platform. Many other fringe platforms saw momentary surges that could not be sustained over time, and a large fraction of Parler users did not migrate to alternative platforms. The migration was driven predominantly by highly engaged Parler users with high partisan media consumption prior to January 2021. We also demonstrate the unintended consequences of the Parler deplatforming: when Parler users moved to Telegram, their exposure to misinformation, partisan content, and unregulated financial activities increased. Our results also serve as the first general equilibrium evidence of cross-category platform substitution, which can be critical in guiding legislation related to market analysis of social media platforms.

Keywords: Deplatforming, Content Moderation, Echo Chamber, Polarization, Misinformation, Fake News, Censorship, Radicalization, Free Speech, Parler

JEL Classification: L86, M37, M38

Suggested Citation

Agarwal, Saharsh and Ananthakrishnan, Uttara M and Tucker, Catherine E., Deplatforming and the Control of Misinformation: Evidence from Parler (September 29, 2022). Available at SSRN: https://ssrn.com/abstract=4232871 or http://dx.doi.org/10.2139/ssrn.4232871

Saharsh Agarwal (Contact Author)

Indian School of Business

Hyderabad, Gachibowli 500 019
India

HOME PAGE: https://sites.google.com/view/saharshagarwal/

Uttara M Ananthakrishnan

University of Washington, Michael G. Foster School of Business

Box 353200
Seattle, WA 98195-3200
United States

Catherine E. Tucker

Massachusetts Institute of Technology (MIT) - Management Science (MS)

100 Main St
E62-536
Cambridge, MA 02142
United States

HOME PAGE: http://cetucker.scripts.mit.edu

Paper statistics

Downloads
321
Abstract Views
1,174
Rank
151,632