Big Data in Small Hands

Posted: 5 Sep 2013; Last revised: 12 Aug 2017

Woodrow Hartzog

Boston University School of Law; Stanford Law School Center for Internet and Society

Evan Selinger

Rochester Institute of Technology - Department of Philosophy

Date Written: September 3, 2013


"Big data" can be defined as a problem-solving philosophy that leverages massive datasets and algorithmic analysis to extract "hidden information and surprising correlations." Not only does big data pose a threat to traditional notions of privacy, but it also compromises socially shared information. This point remains underappreciated because our so-called public disclosures are not nearly as public as courts and policymakers have argued — at least, not yet. That is subject to change once big data becomes user-friendly.

Most social disclosures and details of our everyday lives are meant to be known only to a select group of people. Until now, technological constraints have favored that norm, limiting the circle of communication by imposing transaction costs — which can range from effort to money — onto prying eyes. Unfortunately, big data threatens to erode these structural protections, and the common law, which is the traditional legal regime for helping individuals seek redress for privacy harms, has some catching up to do.

To make our case that the legal community is under-theorizing the effect big data will have on an individual’s socialization and day-to-day activities, we will proceed in four steps. First, we explain why big data presents a bigger threat to social relationships than privacy advocates acknowledge, and construct a vivid hypothetical case that illustrates how democratized big data can turn seemingly harmless disclosures into potent privacy problems. Second, we argue that the harm democratized big data can inflict is exacerbated by decreasing privacy protections of a special kind — ever-diminishing "obscurity." Third, we show how central common law concepts might be threatened by eroding obscurity and the resulting difficulty individuals have gauging whether social disclosures in a big data context will sow the seeds of forthcoming injury. Finally, we suggest that one way to stop big data from causing big, un-redressed privacy problems is to update the common law with obscurity-sensitive considerations.

Keywords: privacy, big data, obscurity

Suggested Citation

Hartzog, Woodrow and Selinger, Evan, Big Data in Small Hands (September 3, 2013). 66 Stanford Law Review Online 81 (2013). Available at SSRN.

Woodrow Hartzog (Contact Author)

Boston University School of Law ( email )

765 Commonwealth Avenue
Boston, MA 02215
United States

Stanford Law School Center for Internet and Society ( email )

Palo Alto, CA
United States


Evan Selinger

Rochester Institute of Technology - Department of Philosophy ( email )

92 Lomb Memorial Drive
Rochester, NY 14623-5670
United States
(585) 475-2531 (Phone)
