Robot Servants Are Going to Make Your Life Easy. Then They’ll Ruin It
Evan Selinger
2014-09-05 00:00:00

But should we let robot servants into our lives?

Jibo is almost too adorable to resist. Sleekly designed with a curvy, clean-looking white enclosure and a dark round face, this teensy-weensy gadget looks downright charming when doing what it does best: taking family pictures, reading stories to our kids, ordering our pizza, and just hanging out, being polite and sociable. While some might find Jibo overpriced or functionally limited, there seems little else to object to. Right? Not so fast.



Jibo poses a fundamental existential problem: Is a life lived with a robot servant the kind of life we should want to live?




Will Robot Servants Make Us Worse People?



Robot servants promise to make things better by freeing up our time and eliminating our grunt work, yet, ironically, they could end up diminishing our quality of life and character by doing our bidding.

This problem doesn’t arise when all the robot servant is doing is unrewarding grunt work we all despise. Take familiar devices, like washing machines and vacuum cleaners. Most of us would declare victory if a fully automated robotic cleaner, like the new Dyson, could reliably get tricky jobs done while removing the human element entirely. Similarly, we don’t worry about passing the buck on activities we should be doing ourselves when, say, asking Siri what the weather is or dictating messages for her to compose and send. In fact, we like to extract maximum labor from Siri and even pose ridiculous questions when we’re bored.

Things begin to get complicated when robots go beyond basic manual, bureaucratic, and cognitive labor and become tools to which we outsource intimate experiences and functions. Part of Jibo’s appeal is that it will let you stop thinking. That is a disconcerting change, one which, over time, can profoundly impact who we are. The issue concerns predictive technology, a feature that’s come to be an essential ingredient in the design of all kinds of digital assistants.

In the promotional video, Jibo isn’t just depicted as an educator and entertainer; Jibo is a mind reader. Coming home after what’s presumably been a long and grueling day, “Eric,” a businessman, turns to his robotic helper and says: “Can you order some takeout for me?” Jibo replies: “Sure thing. Chinese as usual?” Mouthing a line that could be an advertisement for any number of highly hyped “anticipatory computing” products, Eric responds: “You know me so well.”



What will happen to our inclination to develop virtues associated with willpower when technology increasingly does our thinking for us and preemptively satisfies our desires?


Now, computationally determining past dining patterns and inferring likely future choices might not seem like a big deal. “Eric,” or any other user, can always reject a recommendation. “Sorry, Jibo,” he might say, “but I prefer pizza.” But things become more complicated once we look past disconnected examples and examine the import of our decisions in light of pervasive patterns. Certainly, Jibo won’t be the only forecasting helper peddling prognostics, especially if the vision of smart homes associated with the Internet of Things comes to fruition.
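For readers who want a concrete picture of the mechanism at issue, here is a minimal sketch, in Python, of the kind of frequency-based guesswork an anticipatory assistant can rely on. The data and function names are hypothetical, not Jibo’s actual software; the point is only that “Chinese as usual?” can fall out of a few lines that tally past choices and propose the most common one.

    # Hypothetical sketch of a frequency-based "anticipatory" suggestion:
    # tally a user's past orders and propose the most common one.
    from collections import Counter

    def suggest_order(order_history):
        """Return the most frequently ordered cuisine, or None if there is no history."""
        if not order_history:
            return None
        # most_common(1) returns a list like [("Chinese", 4)]
        return Counter(order_history).most_common(1)[0][0]

    # Invented order history for a user like "Eric"
    history = ["Chinese", "Chinese", "pizza", "Chinese", "Thai", "Chinese"]

    guess = suggest_order(history)
    if guess:
        print(f"Sure thing. {guess} as usual?")  # prints: Sure thing. Chinese as usual?

Of course, a real product would weigh far more signals than a tally of takeout orders, but the underlying move, predicting what you want before you ask, is the same.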
