Report and Repeat: Investigating Facebook's Hate Speech Removal Process

Posted: 27 Aug 2018

Caitlin Carlson

Seattle University - Department of Communication

Hayley Rousselle

Seattle University

Date Written: April 1, 2018

Abstract

Despite the enormous power Facebook has to regulate the expression of its 2.2 billion users, little is known about its efforts to remove content that violates the company’s own Community Standards. This study sought to better understand this process by reporting posts, images, and comments containing hate speech and recording Facebook’s responses (n=144). Approximately 45 percent of the vitriolic content reported was removed. The results indicated inconsistencies in Facebook’s removal of misogynistic hate speech and in its accounting for context in removal decisions. The paper concludes with specific policy recommendations for Facebook to minimize hateful content on its platform.

Suggested Citation

Carlson, Caitlin and Rousselle, Hayley, Report and Repeat: Investigating Facebook's Hate Speech Removal Process (April 1, 2018). Available at SSRN: https://ssrn.com/abstract=3232325 or http://dx.doi.org/10.2139/ssrn.3232325

Caitlin Carlson (Contact Author)

Seattle University - Department of Communication

900 Broadway
Seattle, WA 98122
United States
206-220-8531 (Phone)

Hayley Rousselle

Seattle University

900 Broadway
Seattle, WA 98122
United States
