Normally, if you asked me to free associate on words like “productivity app” and “life hack,” you’d be treated to an all-out vent session: a mix of skepticism and cynicism directed at overhyped products, excessive esteem for efficiency, and overblown attempts to delegate responsibility and willpower. But then I read a gushing review of Full, an app for tracking and measuring “what’s important to you.” I actually think it’s a good product, and an excellent prompt for thinking about why goal-tracking apps are so existentially provocative.
When the Partially Examined Life discussion of human enhancement (Episode 91) turned to the topic of digital technology, the philosophical oxygen was sucked out of the room. Sure, folks conceded that philosopher of mind Andy Clark (not mentioned by name, but implicitly referenced) has interesting things to say about how technology upgrades our cognitive abilities and extends the boundaries of where our minds are located. But everything else was more or less dismissed as concerning not terribly deep uses of “appliances.”
While I am far from a Luddite who fetishizes a life without tech, we need to consider the consequences of this latest batch of apps and tools that remind us to contact significant others, boost our willpower, provide us with moral guidance, and encourage us to be civil. Taken together, these tools mark the emergence of tech that doesn’t just augment our intellect and lives, but is beginning to automate and outsource our humanity.
Why might an app that reminds you to text your partner not be the best idea? If you’re looking to add a digital spark to your relationship this Valentine’s Day, you can download the new app Romantimatic, which sends you scheduled reminders to contact your significant other and supplies pre-set messages to fire off. The pre-set messages include simple, straightforward classics like “I love you” and “I miss you.”
The technology world was abuzz last week when Google announced it spent nearly half a billion dollars to acquire DeepMind, a UK-based artificial intelligence (AI) lab. With few details available, commentators speculated on the underlying motivation.
Without “noise makers”—folks spreading rumors, false information, hoaxes, and hearsay—markets and the blogosphere might grind to a halt. But as Vincent Hendricks argues in “When Twitter Storms Cause Financial Panic,” information bubbles can be immensely destructive. They can hurt the economy and damage society.
A well-intentioned grandmother accidentally hurt her grandkids’ feelings. She took screenshots of their delightful Instagram photos and proudly uploaded them to Facebook for all of her social network friends to see. If the younger generation didn’t set their accounts to private, could Grandma possibly have committed a faux pas? All she did was lovingly pass along publicly available information!
It would be nice to believe that the road to civility could be paved by following simple formulae, like Frank Bruni’s New Year’s exhortation, “Tweet less, read more”. Unfortunately, uncomplicated Op-Ed advice doesn’t translate into effective results in the messy real world.
Apple's latest television ad, “Misunderstood,” is leaving viewers with impassioned and conflicting interpretations. Giving Talmudic treatment to a short commercial might seem like overkill, especially given the Christmas theme. But I think we’re lucky the narrative has become a Rorschach test for discussing the social and ethical impact of technology.
Who has time anymore to manage their social media feeds? All the status updating, replying, and posting of smart takes on the day’s news is exhausting. Well, Google wants to help you out with that: the company recently submitted a patent application for software that learns how users respond to social media posts and then automatically recommends updates and replies they can make for future ones. Consider it outsourcing for your social life: an amped-up, next-gen blend of automated birthday reminders and computer-generated, personalized remarks (more successful Turing test than random word salad).
Big data generates big myths. To help society set realistic expectations, the right kind of skepticism is needed. Kate Crawford, Principal Researcher at Microsoft Research and Visiting Professor at MIT’s Center for Civic Media, does a fantastic job of explaining why folks are too optimistic about the promise of what big data can offer. She rightly argues that too much faith in it inclines us to misunderstand what data reflects, overestimate the political efficacy of information, and become insensitive to civil rights concerns.
As technology expands our communicative reach, new opportunities to be rude inevitably arise. Some people overreact to this incivility by turning to uniform and mechanical etiquette rules, hoping to make things better by constraining choices and limiting situational judgment. But for societies that value diversity and autonomy, general mandates—like expecting everyone to turn off their cell phones in theaters—only work in exceptional cases.
While privacy advocates have expressed concern about the phenomenon of massive data collection and analytics colloquially known as “big data,” most people are more familiar with social media anxiety, like inappropriate Facebook posts leading to embarrassing and reputation-ruining incidents. This situation is likely to change, and in the near future society will have to confront a profound question.
Time recently ran a cover story titled, “Can Google Solve Death?” The wording was a bit much, as the subject of the piece, Google’s new firm Calico, has more modest ambitions, like using “tools like big data to determine what really extends lives.” But even if there won’t be an app for immortality any time soon, we’re increasingly going to have to make difficult decisions about when human limits should be pushed and how to ensure ethics keeps pace with innovation.
Is it OK to use a smartphone in class, email an instructor, record a lecture? A professor offers lessons. There’s a widely shared image on the Internet of a teacher’s note that says: “Dear students, I know when you’re texting in class. Seriously, no one just looks down at their crotch and smiles.”
“Big data” can be defined as a problem-solving philosophy that leverages massive datasets and algorithmic analysis to extract “hidden information and surprising correlations.” Not only does big data pose a threat to traditional notions of privacy, but it also compromises socially shared information. This point remains underappreciated because our so-called public disclosures are not nearly as public as courts and policymakers have argued—at least, not yet. That is subject to change once big data becomes user friendly.
The new ads for Facebook Home are propaganda clips. Transforming vice into virtue, they’re social engineering spectacles that use aesthetic tricks to disguise the profound ethical issues at stake. This isn’t an academic concern: Zuckerberg’s vision (as portrayed by the ads) is being widely embraced — if the very recent milestone of half a million installations is anything to go by.
Let’s face it: Technology and etiquette have been colliding for some time now, and things have finally boiled over if the recent spate of media criticisms is anything to go by. There’s the voicemail, not to be left unless you’re “dying.” There’s the e-mail signoff that we need to “kill.” And then there’s the observation that what was once normal — like asking someone for directions — is now considered “uncivilized.”
For the past few weeks, my six-year-old daughter has been obsessed with Selena Gomez reprising her role as Alex Russo on the Disney show Wizards of Waverly Place. Like many of her friends, Rory has seen every episode of Wizards and religiously listens to Selena's music.
Yet for all the efficiencies these “do engines” may provide, they may also carry a significant risk. Evan Selinger, a fellow at the Institute for Ethics and Emerging Technologies, argues that less friction in our lives may “render us more vulnerable to being automatic,” and eliminate crucial opportunities for moral deliberation. “The digital servant becomes the digital overlord, and we don’t even recognize it.”
My grandfather died on Halloween. Thanks to Hurricane Sandy, none of the New York family members could attend the funeral in Massachusetts. Fortunately, another option became available: The ceremony was streamed online, and so my wife, daughter and I gathered around a laptop in our living room to watch the live webcast.