Big Data in Small Hands

Posted: 5 Sep 2013 Last revised: 12 Aug 2017

Woodrow Hartzog

Northeastern University School of Law and College of Computer and Information Science; Stanford Law School Center for Internet and Society

Evan Selinger

Rochester Institute of Technology - Department of Philosophy

Date Written: September 3, 2013

Abstract

"Big data" can be defined as a problem-solving philosophy that leverages massive data-sets and algorithmic analysis to extract "hidden information and surprising correlations." Not only does big data pose a threat to traditional notions of privacy, but it also compromises socially shared information. This point remains under appreciated because our so-called public disclosures are not nearly as public as courts and policymakers have argued — at least, not yet. That is subject to change once big data becomes user friendly.

Most social disclosures and details of our everyday lives are meant to be known only to a select group of people. Until now, technological constraints have favored that norm, limiting the circle of communication by imposing transaction costs — which can range from effort to money — onto prying eyes. Unfortunately, big data threatens to erode these structural protections, and the common law, which is the traditional legal regime for helping individuals seek redress for privacy harms, has some catching up to do.

To make our case that the legal community is under-theorizing the effect big data will have on an individual’s socialization and day-to-day activities, we will proceed in four steps. First, we explain why big data presents a bigger threat to social relationships than privacy advocates acknowledge, and construct a vivid hypothetical case that illustrates how democratized big data can turn seemingly harmless disclosures into potent privacy problems. Second, we argue that the harm democratized big data can inflict is exacerbated by decreasing privacy protections of a special kind — ever-diminishing "obscurity." Third, we show how central common law concepts might be threatened by eroding obscurity and the resulting difficulty individuals have gauging whether social disclosures in a big data context will sow the seeds of forthcoming injury. Finally, we suggest that one way to stop big data from causing big, un-redressed privacy problems is to update the common law with obscurity-sensitive considerations.

Keywords: privacy, big data, obscurity

Suggested Citation

Hartzog, Woodrow and Selinger, Evan, Big Data in Small Hands (September 3, 2013). 66 Stanford Law Review Online 81 (2013). Available at SSRN: https://ssrn.com/abstract=2320739

Woodrow Hartzog (Contact Author)

Northeastern University School of Law and College of Computer and Information Science ( email )

416 Huntington Avenue
Boston, MA 02115
United States

HOME PAGE: https://www.northeastern.edu/law/faculty/directory/hartzog.html

Stanford Law School Center for Internet and Society ( email )

Palo Alto, CA
United States

HOME PAGE: http://cyberlaw.stanford.edu/profile/woodrow-hartzog

Evan Selinger

Rochester Institute of Technology - Department of Philosophy ( email )

92 Lomb Memorial Drive
Rochester, NY 14623-5670
United States
(585) 475-2531 (Phone)