Differential Privacy as a Response to the Reidentification Threat: The Facebook Advertiser Case Study

40 Pages Posted: 18 May 2012

Andrew Chin

University of North Carolina School of Law

Anne Klinefelter

University of North Carolina School of Law

Date Written: May 8, 2012

Abstract

Recent computer science research on the reidentification of individuals from anonymized data has given some observers in the legal community the impression that the use of data is incompatible with strong privacy guarantees, leaving few options for balancing privacy and utility in data-intensive settings. This bleak assessment is incomplete and somewhat misleading, however, because it fails to recognize the promise of technologies that support anonymity under a standard that computer scientists call differential privacy. This standard is met by a database system that behaves similarly whether or not any particular individual is represented in the database, effectively producing anonymity. Although a number of computer scientists agree that these technologies can offer privacy-protecting advantages over traditional approaches such as redaction of personally identifiable information from shared data, the legal community’s critique has focused on the burden that these technologies place on the utility of the data. Empirical evidence, however, suggests that at least one highly successful business, Facebook, has implemented such privacy-preserving technologies in support of anonymity promises while also meeting commercial demands for utility of certain shared data.
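To make the standard concrete: a common way to satisfy differential privacy for a counting query (such as an advertising audience-reach estimate) is to add Laplace noise calibrated to the query's sensitivity. The sketch below is illustrative only and is not drawn from the Article or from Facebook's actual implementation; the function names, the audience figure, and the choice of epsilon are hypothetical.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from a Laplace(0, scale) distribution
    via inverse transform sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count: int, epsilon: float) -> int:
    """Report a count satisfying epsilon-differential privacy.

    A counting query has sensitivity 1: adding or removing any one
    person changes the true count by at most 1. Laplace noise with
    scale 1/epsilon therefore suffices, so the reported number is
    nearly the same whether or not any particular individual is
    in the database.
    """
    noisy = true_count + laplace_noise(1.0 / epsilon)
    return max(0, round(noisy))

# Hypothetical audience-reach query: an advertiser sees a useful
# estimate, while no single user's presence is revealed.
random.seed(0)  # fixed seed so the sketch is reproducible
print(private_count(120_000, epsilon=0.1))
```

Smaller values of epsilon add more noise (stronger privacy, less utility); the Article's point is that for large audience-reach counts, noise of this magnitude barely affects the number's commercial usefulness.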

This Article uses a reverse-engineering approach to infer that Facebook appears to be using differential privacy-supporting technologies in its interactive query system to report audience reach data to prospective users of its targeted advertising system, without apparent loss of utility. This case study provides an opportunity to consider criteria for identifying contexts where privacy laws might draw benefits from the adoption of a differential privacy standard similar to that apparently met by Facebook’s advertising audience reach database. United States privacy law is a collection of many different sectoral statutes and regulations, torts, and constitutional law, and some areas are more amenable to incorporation of the differential privacy standard than others. This Article highlights some opportunities for recognition of the differential privacy standard as a best practice or a presumption of compliance for privacy, while acknowledging certain limitations on the transferability of the Facebook example.

Keywords: Facebook, privacy, databases, differential privacy, social networks, computer science

JEL Classification: K13, K23

Suggested Citation

Chin, Andrew and Klinefelter, Anne, Differential Privacy as a Response to the Reidentification Threat: The Facebook Advertiser Case Study (May 8, 2012). North Carolina Law Review, Vol. 90, No. 5, 2012, UNC Legal Studies Research Paper No. 2062447, Available at SSRN: https://ssrn.com/abstract=2062447

Andrew Chin (Contact Author)

University of North Carolina School of Law

Van Hecke-Wettach Hall
100 Ridge Road
Chapel Hill, NC 27599-3380
United States
919-962-4116 (Phone)

Anne Klinefelter

University of North Carolina School of Law

Van Hecke-Wettach Hall, 160 Ridge Road
CB #3380
Chapel Hill, NC 27599-3380
United States
