The Effects of Machine-powered Platform Governance: An Empirical Study of Content Moderation

48 Pages Posted: 10 Mar 2021

Qinglai He

University of Wisconsin - Madison - Department of Operations and Information Management

Yili Hong

University of Houston - C.T. Bauer College of Business

T. S. Raghu

Arizona State University - W. P. Carey School of Business

Date Written: January 17, 2021

Abstract

With increasing participation in social media and online communities, content moderation has become an important part of the online experience. Since the beginning of these digital platforms, volunteer moderators have been the essential workforce for platform governance. However, human moderation has limited capacity to handle massive volumes of undesirable content. Recently, many online platforms and communities have started to adopt algorithm-based moderation tools (bot moderators) to enable machine-powered platform governance to cope with the increasing need for content moderation. As platforms move toward this technical, automated mode of governance, there is a growing concern over de-humanization and whether machines would lead volunteer moderators to reduce their contributions. To understand the role of these increasingly popular bot moderators, we conduct an empirical study to examine the impact of machine-powered regulation on volunteer moderators’ behaviors. With data collected from 156 subreddits on Reddit, a large global online community, we found that delegating moderation to machines augments volunteer moderators’ role as community managers. Human moderators engage in more moderation-related activities, including 20.2% more corrective and 14.9% more supportive activities with their community members. Importantly, the effect manifests primarily among communities with large user bases and detailed guidelines, suggesting that community needs for moderation are the key factors driving more voluntary contributions in the presence of bot moderators.

Keywords: Platform governance, human-machine interaction, content moderation, human moderators, bot moderators

Suggested Citation

He, Qinglai and Hong, Yili and Raghu, T. S., The Effects of Machine-powered Platform Governance: An Empirical Study of Content Moderation (January 17, 2021). Available at SSRN: https://ssrn.com/abstract=3767680 or http://dx.doi.org/10.2139/ssrn.3767680

Qinglai He

University of Wisconsin - Madison - Department of Operations and Information Management ( email )

Madison, WI
United States

Yili Hong (Contact Author)

University of Houston - C.T. Bauer College of Business ( email )

Houston, TX 77204-6021
United States

T. S. Raghu

Arizona State University - W. P. Carey School of Business ( email )

Farmer Building 440G PO Box 872011
Tempe, AZ 85287
United States

HOME PAGE: https://wpcarey.asu.edu/people/profile/192381
