Big Data in Small Hands
Evan Selinger
September 4, 2013

Most social disclosures and details of our everyday lives are meant to be known only to a select group of people.[2] Until now, technological constraints have favored that norm, limiting the circle of communication by imposing transaction costs—which can range from effort to money—onto prying eyes. Unfortunately, big data threatens to erode these structural protections, and the common law, which is the traditional legal regime for helping individuals seek redress for privacy harms, has some catching up to do.[3]

To make our case that the legal community is under-theorizing the effect big data will have on an individual’s socialization and day-to-day activities, we will proceed in four steps.[4] A recent report on reworking information privacy law (http://law.duke.edu/sites/default/files/images/centers/judicialstudies/Reworking_Info_Privacy_Law.pdf) observes:

People also expect “privacy by obscurity,” that is, the ability to blend into a crowd or find other ways to be anonymous by default. This condition is rapidly disappearing, however, with new technologies that can capture images and audio nearly everywhere. As an example, facial recognition technology is constantly improving. Already, Facebook and Apple use technologies that permit the automatic tagging of photographs. One day devices, such as Google Glasses, may permit the identification of passing pedestrians on the street. In short, if the privacy torts are to be rethought, more guidance must be provided as to the underlying concept of privacy.

Id. at 11 (citations omitted).

First, we explain why big data presents a bigger threat to social relationships than privacy advocates acknowledge, and construct a vivid hypothetical case that illustrates how democratized big data can turn seemingly harmless disclosures into potent privacy problems. Second, we argue that the harm democratized big data can inflict is exacerbated by decreasing privacy protections of a special kind—ever-diminishing “obscurity.” Third, we show how central common law concepts might be threatened by eroding obscurity and the resulting difficulty individuals have gauging whether social disclosures in a big data context will sow the seeds of forthcoming injury. Finally, we suggest that one way to stop big data from causing big, unredressed privacy problems is to update the common law with obscurity-sensitive considerations.



I. Big, Social Data

For good reason, the threat big data poses to social interaction has not been given its due. Privacy debates have primarily focused on the scale of big data and concentrations of power—what big corporations and big governments can do with large amounts of finely analyzed information. There are legitimate and pressing concerns here, which is why scholars and policymakers focus on Fair Information Practice Principles (FIPPs), deidentification techniques, sectoral legislation protecting particular datasets, and regulatory efforts to improve data security and safe international data transfers.[5]

This trajectory fails to address the full scope of big data as a disruptive force in nearly every sector of the patchwork approach to privacy protection in the United States. Individuals eventually will be able to harness big datasets, tools, and techniques to expand dramatically the number and magnitude of privacy harms to themselves and others, perhaps without even realizing it.[6] This is problematic in an age when so many aspects of our social relationships with others are turned into data.

This article was co-authored with Woodrow Hartzog
