Facebook: Managing User-Generated Content

1 page · Posted: 13 Sep 2020

Tami Kim

Harvard University - Business School (HBS)

Gerry Yemen

University of Virginia - Darden School of Business

Abstract

This public-sourced case describes the evolution of Facebook's approach to monitoring user activity: deciding what potentially offensive or disturbing material could stay posted and what should be removed. Following intense media and regulatory (e.g., congressional) scrutiny, the company revised its policies and reorganized its content-review teams. The case offers an opportunity to debate a key question the company faced: What should platform governance look like? As the number of user-generated posts on Facebook grew exponentially each day, so did the volume of harmful, inappropriate content, and Facebook needed to devise a set of guidelines that was not only applicable across many different types of posts and situations but also acceptable to Facebook users. Putting themselves in CEO Mark Zuckerberg's shoes to redesign Facebook's platform-governance policies, students can explore the tradeoffs a platform business makes among user characteristics, intent, outcome, and norms. Students will also have a chance to examine how platform-governance policies can shape Facebook's future strategy: for instance, the more Facebook curated content for its users (thus serving as arbiter), the closer it moved from being a tech business to being a news publisher, subjecting the company to a different set of regulations.

Excerpt

UVA-M-0976

Rev. Aug. 17, 2020

Facebook: Managing User-Generated Content

…Social media companies exploit the social environment.

—George Soros, financier

The posts never ended. It was often difficult to tell which ones crossed the line. Yet that was Tom Kellogg's job as a process executive for a firm that contracted with Facebook's safety and security unit. He reviewed material that members of the Facebook community had flagged as violating Facebook's community standards. If Kellogg agreed, the offending content was removed and the person who posted it was issued a warning; several citations resulted in suspension of the user's Facebook account.

. . .

Keywords: Facebook, social media, safety and security, internet, Mark Zuckerberg, content moderation, free speech, censorship, media, media distribution, positioning, risk, value proposition, consumer-firm dynamics, digital, platform governance, community standards, trust, bad actors, marketing strategy

Suggested Citation

Kim, Tami and Yemen, Gerry, Facebook: Managing User-Generated Content. Darden Case No. UVA-M-0976, Available at SSRN: https://ssrn.com/abstract=3682607 or http://dx.doi.org/10.2139/ssrn.3682607

Tami Kim (Contact Author)

Harvard University - Business School (HBS)

Soldiers Field Road
Morgan 270C
Boston, MA 02163
United States

Gerry Yemen

University of Virginia - Darden School of Business

P.O. Box 6550
Charlottesville, VA 22906-6550
United States
