Google’s Anti-Facial Recognition Policy for Glass is Deadly
B. J. Murphy
2013-12-04 00:00:00

According to the Amber Alert webpage, "A child goes missing in the United States every 40 seconds," and "More than 700,000 children go missing annually." Those are absolutely frightening statistics! Much more frightening than the prospect that some Glass user might know my name.

How far are we willing to go to ensure that privacy isn't diminished in the slightest? When does the right to privacy begin interfering with the right to safety? Can the two come together in harmony, or are they destined to be in conflict until society finally decides in favor of one or the other?

I understand the desire for privacy, but as I've argued in the past, as we as a society become more public and technologically open-source, the idea of privacy slowly fades away. That isn't to say that some forms of privacy can't be maintained. Surely we should have the right to say 'yes' or 'no' to whether our private data is shared publicly. That level of freedom and choice could easily maintain a sense of privacy for each individual.

But then, when it comes to missing children, or even missing adults, shouldn't we be willing to sacrifice a portion of our privacy to ensure the safety of those who've gone missing? It doesn't even have to be a large peek into each person's private life - simply a facial recognition map, a name, and whether or not they've been reported missing, or possibly even wanted.

Picture this with me: It's 2014 and only a few months have passed since the commercial launch of Google Glass. Hundreds of thousands of people across the United States have already acquired the device. A mandatory app comes included with Glass, connected to the Amber Alert system. The app has Glass quietly scanning each face you cross paths with, but it doesn't reveal any names, nor does it alert you that it's scanning. For all you know, it's a normal day like any other.

Now, as you're walking down a street, you pass an adult male with a pre-teen girl. You don't pay much attention to them - just two more people walking by, as far as you're concerned. Glass, on the other hand, knows something you don't: the little girl has been reported missing.

As a result, without alerting you, the app quietly takes a snapshot of the girl and the unknown male captor, contacts a 911 operator program, and delivers the GPS coordinates of where the photo was taken along with the direction the girl was walking. The police show up, arrest the captor, and contact the parents of the missing child to inform them that she's been found safe.

This was possible because a parent - or family member, guardian, etc. - had allowed the missing child's name and facial recognition map to be archived in an Amber Alert system program, which connects to Glass via the app. Was the child's "privacy" diminished? Yes. But she's also alive because of it, and a kidnapper has been taken off the streets, unable to harm anyone else again.
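For concreteness, here is a minimal sketch in Python of what such a matching flow might look like. Everything in it is hypothetical: no such Glass or Amber Alert API exists, and the watchlist structure, embedding comparison, similarity threshold, and reporting function are illustrative assumptions only, not anything Google has built or proposed.

from __future__ import annotations

import math
import time
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class WatchlistEntry:
    """Hypothetical record a guardian opts a missing child into."""
    name: str                  # shared by the parent/guardian when filing the report
    embedding: List[float]     # illustrative stand-in for a facial recognition map


def cosine_similarity(a: List[float], b: List[float]) -> float:
    """Compare two face embeddings; values near 1.0 indicate a likely match."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


def check_face(embedding: List[float],
               watchlist: List[WatchlistEntry],
               threshold: float = 0.92) -> Optional[WatchlistEntry]:
    """Quietly compare a face seen by the camera against the missing-persons list."""
    for entry in watchlist:
        if cosine_similarity(embedding, entry.embedding) >= threshold:
            return entry
    return None


def report_sighting(entry: WatchlistEntry, snapshot: bytes,
                    lat: float, lon: float, heading: str) -> None:
    """Placeholder for contacting a 911 operator program.

    In the scenario this would transmit the snapshot, GPS coordinates,
    and direction of travel; here it only prints, since no such service exists.
    """
    print(f"ALERT: possible sighting of {entry.name} "
          f"at ({lat:.5f}, {lon:.5f}), heading {heading}, "
          f"snapshot of {len(snapshot)} bytes, reported {time.ctime()}")


if __name__ == "__main__":
    # Hypothetical watchlist entry created when a guardian opts the child in.
    watchlist = [WatchlistEntry(name="Jane Doe", embedding=[0.1, 0.8, 0.3, 0.5])]

    # A face the wearer happens to pass on the street (simulated embedding).
    seen_face = [0.11, 0.79, 0.31, 0.49]

    match = check_face(seen_face, watchlist)
    if match is not None:
        report_sighting(match, snapshot=b"\x00" * 1024,
                        lat=35.2271, lon=-80.8431, heading="north")

Note that in this sketch the matching happens entirely against the opted-in watchlist and nothing is ever shown to the wearer, mirroring the scenario above: the wearer's Glass never reveals a name, and only a confirmed match triggers a report to authorities.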

Isn't this very real prospect of technologically enhanced safety worth sacrificing a bit of our own privacy? While I'm not a parent, if anyone in my family were to go missing, their privacy would be the last thing I'd be concerned about. And if I'd gone missing, I'd want everyone to do all they could to find me, even if it meant sacrificing my own privacy.

Google Glass is coming next year. And with Google's determination to ban facial recognition on Glass, we must ask ourselves: at what price?