Obscurity is a protective state that can further a number of goals, such as autonomy, self-fulfillment, socialization, and relative freedom from the abuse of power.
Facebook's announcement of its new Graph Search tool on Tuesday set off yet another round of rapid-fire analysis about whether Facebook is properly handling its users' privacy. Unfortunately, most of that analysis hasn't framed the story properly. Yes, Zuckerberg appears to be respecting our current privacy settings. And, yes, there may well be more stalking ahead. Neither framing, however, is adequate. If we rely too heavily on them, we'll miss the core problem: the more accessible our Facebook information becomes, the less obscurity protects our interests.
While many debates over technology and privacy concern obscurity, the term rarely gets used. This is unfortunate, as "privacy" is an over-extended concept. It grabs our attention easily, but is hard to pin down. Sometimes people talk about privacy when they are worried about confidentiality. Other times they invoke privacy to discuss issues associated with corporate access to personal information. Fortunately, obscurity has a narrower purview.
Obscurity is the idea that when information is hard to obtain or understand, it is, to some degree, safe. Safety, here, doesn't mean the information is inaccessible. Competent and determined data hunters armed with the right tools can always find a way to get it. Less committed folks, however, experience great effort as a deterrent.
Evan Selinger is Associate Professor of Philosophy and MAGIC Center Head of Research Communications, Community & Ethics, both at Rochester Institute of Technology. Evan publishes extensively in the areas of philosophy of technology, privacy, and ethics/policy of science and technology. To enhance public debate about ethics, Evan regularly supplements his peer-reviewed scholarship with outreach articles in places like The Atlantic, Wired, Slate, Forbes, The Wall Street Journal, and The Nation.