The Effects of Machine-powered Platform Governance: An Empirical Study of Content Moderation

39 Pages · Posted: 10 Mar 2021 · Last revised: 9 Feb 2024

Qinglai He

University of Wisconsin - Madison - Department of Operations and Information Management

Yili Hong

University of Miami Herbert Business School

T. S. Raghu

Arizona State University - W. P. Carey School of Business

Date Written: January 17, 2021

Abstract

With increasing volumes of participation in social media and online communities, content moderation has become an integral component of platform governance. Volunteer (human) moderators have thus far been the essential workforce for content moderation. Because volunteer-based content moderation faces challenges in achieving scalable, desirable, and sustainable moderation, many online platforms have recently begun adopting algorithm-based content moderation tools (bots). When bots are introduced into platform governance, it is unclear how volunteer moderators react in terms of their community-policing and community-nurturing efforts. To understand the impacts of these increasingly popular bot moderators, we conduct an empirical study with data collected from 156 communities (subreddits) on Reddit. Based on a series of econometric analyses, we find that bots augment volunteer moderators by stimulating them to moderate a larger quantity of posts, and these effects are more pronounced in larger communities. Specifically, volunteer moderators perform 20.9% more community policing, particularly over subjective rules. Moreover, in larger communities, volunteer moderators also exert greater effort in offering explanations and suggestions after their community adopts bots. Notably, these increases in activity are primarily driven by the increased need for nurturing efforts to accompany growth in subjective policing. Introducing bots to content moderation also improves the retention of volunteer moderators. Overall, we show that introducing algorithm-based content moderation into platform governance is beneficial for sustaining digital communities.

Keywords: Platform governance, human-machine interaction, content moderation, human moderators, bot moderators

Suggested Citation

He, Qinglai and Hong, Yili and Raghu, T. S., The Effects of Machine-powered Platform Governance: An Empirical Study of Content Moderation (January 17, 2021). Available at SSRN: https://ssrn.com/abstract=3767680 or http://dx.doi.org/10.2139/ssrn.3767680

Qinglai He

University of Wisconsin - Madison - Department of Operations and Information Management ( email )

Madison, WI
United States

Yili Hong (Contact Author)

University of Miami Herbert Business School ( email )

P.O. Box 248126
Coral Gables, FL 33124
United States

T. S. Raghu

Arizona State University - W. P. Carey School of Business ( email )

Farmer Building 440G PO Box 872011
Tempe, AZ 85287
United States

HOME PAGE: https://wpcarey.asu.edu/people/profile/192381
