The Effects of Machine-powered Platform Governance: An Empirical Study of Content Moderation
48 Pages Posted: 10 Mar 2021
Date Written: January 17, 2021
With increasing participation in social media and online communities, content moderation has become an important part of the online experience. Since the beginning of these digital platforms, volunteer moderators have been the essential workforce for platform governance. However, human moderation has limited capacity to handle the massive volume of undesirable content. Recently, many online platforms and communities have started to adopt algorithm-based moderation tools (bot moderators), enabling machine-powered platform governance to cope with the increasing need for content moderation. As platforms move toward this technical, automated mode of governance, there is growing concern over de-humanization and over whether machines would lead volunteer moderators to reduce their contributions. To understand the role of these increasingly popular bot moderators, we conduct an empirical study examining the impact of machine-powered regulation on volunteer moderators’ behaviors. With data collected from 156 subreddits on Reddit, a large global online community, we find that delegating moderation to machines augments volunteer moderators’ role as community managers. Human moderators engage in more moderation-related activities, including 20.2% more corrective and 14.9% more supportive activities with their community members. Importantly, the effect manifests primarily in communities with large user bases and detailed guidelines, suggesting that community needs for moderation are the key factors driving greater voluntary contributions in the presence of bot moderators.
Keywords: Platform governance, human-machine interaction, content moderation, human moderators, bot moderators