Deplatforming and the Control of Misinformation: Evidence from Parler
32 Pages
Posted: 10 Oct 2022
Date Written: September 29, 2022
Regulation of controversial individuals and communities, particularly those rife with misinformation and conspiracy theories, has become a significant challenge for technology platforms. Little is known about the spillover effects of content moderation policies beyond the focal platform, despite the ease with which censored individuals and communities can regroup in alternative spaces online. In January 2021, Parler, a right-wing social media platform, was deplatformed by its web and app hosting services for its purported role in fomenting the Capitol riots. Using detailed user-level mobile usage data, we demonstrate that following the deplatforming, Parler users migrated to Telegram, an alternative ‘free-speech’ platform that is more resistant to surveillance. We also find that a large fraction of users did not migrate, suggesting that the intervention was partially effective. The migration was driven predominantly by highly engaged Parler users with high right-wing media consumption prior to January 2021. We also document an unintended consequence of the Parler deplatforming: when Parler users moved to Telegram, their exposure to misinformation, partisan content, and unregulated financial activities increased. Finally, we find that following the influx of Parler users, some existing right-leaning Telegram users with no prior links to Parler also saw an increase in exposure to misinformation.
Keywords: Deplatforming, Content Moderation, Echo Chamber, Polarization, Misinformation, Fake News, Censorship, Radicalization, Free Speech, Parler
JEL Classification: L86, M37, M38