Report and Repeat: Investigating Facebook's Hate Speech Removal Process
Posted: 27 Aug 2018
Date Written: April 1, 2018
Despite the enormous power Facebook holds to regulate the expression of its 2.2 billion users, little is known about its efforts to remove content that violates the company's own Community Standards. This study sought to better understand that process by reporting posts, images, and comments containing hate speech and recording Facebook's response (n=144). Approximately 45 percent of the vitriolic content reported was removed. The results indicated that Facebook was inconsistent both in removing misogynistic hate speech and in accounting for context in its removal decisions. The paper concludes with specific policy recommendations for Facebook to minimize hateful content on its platform.