Awareness Is Everything
Jamais Cascio
2009-09-08 00:00:00

The real shift comes when we move away from direct interaction and input, towards a world of ambient interaction and awareness.

Our laptops, mobile phones, and sometimes desktop computers increasingly come with built-in microphones, cameras, accelerometers, and even GPS. For the most part, these sensory technologies only come into play when we call upon them directly by launching a related application (to take a picture, or find something on a map, etc.). The rest of the time, these senses are turned off. Battery life probably plays a role in keeping the senses off, but I suspect a bigger reason is that we're simply not accustomed to thinking about our tools as always "paying attention."

One notable exception is worth calling out, because it's indicative of the kinds of possibilities in front of us: many mid- and top-end laptops come with built-in accelerometers that park the hard drive heads if the laptop takes a fall. Apple brands theirs the "Sudden Motion Sensor," but Lenovo, Acer, and HP all have similar systems. Think about this in the abstract: in the laptop I'm typing on right now, there's an environmental sensor paying constant attention, ready to act if certain conditions are met. Now imagine that same concept holding true for other kinds of sensors.

Imagine a desktop with a camera that knows to shut down the screen and eventually go to sleep when you walk away (but stays awake while you're sitting there reading or thinking), and wakes up when you sit back down in front of it (no mouse-jiggling required).
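The rule behind that idea could be as simple as tracking how long the camera has gone without seeing anyone. Here's a minimal Python sketch of that logic; the thresholds, function name, and the hand-fed presence readings (which a real system would get from a camera-based person detector) are all invented for illustration:

```python
# Toy presence-aware display logic. In a real system, `samples` would come
# from a camera-based person detector; here they are hand-fed tuples.
DIM_AFTER = 30     # seconds of absence before dimming (illustrative)
SLEEP_AFTER = 120  # seconds of absence before sleeping (illustrative)

def screen_state(samples):
    """samples: chronological list of (timestamp_seconds, person_present).
    Returns "awake", "dim", or "asleep"."""
    if not samples:
        return "asleep"
    now, present_now = samples[-1]
    if present_now:
        return "awake"
    # How long since the person was last seen?
    last_seen = None
    for ts, present in samples:
        if present:
            last_seen = ts
    away = now - last_seen if last_seen is not None else float("inf")
    if away >= SLEEP_AFTER:
        return "asleep"
    if away >= DIM_AFTER:
        return "dim"
    return "awake"
```

The point is that the "intelligence" here is trivial; what's new is the sensor being on all the time.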

Or a system with a microphone that listens for a phone ringing (a sudden loud noise) followed by a nearby voice saying "hello" (or a similar greeting), and mutes the computer's audio automatically.
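That rule is just temporal pattern matching on audio events. A toy sketch, assuming a hypothetical classifier has already labeled the audio stream; the event format, greeting list, and time window are all made up:

```python
# Toy "phone rang, then someone said hello" mute rule. Events would come
# from a real audio classifier; here they are hand-labeled tuples.
GREETINGS = {"hello", "hi", "hey"}  # illustrative greeting list
WINDOW = 10.0  # seconds: the greeting must follow the ring this quickly

def should_mute(events):
    """events: chronological list of (timestamp, kind, detail) tuples,
    e.g. (3.0, "loud_noise", None) or (5.5, "speech", "hello").
    Returns True once a ring-like noise is followed by a greeting."""
    last_ring = None
    for ts, kind, detail in events:
        if kind == "loud_noise":
            last_ring = ts
        elif kind == "speech" and detail in GREETINGS:
            if last_ring is not None and ts - last_ring <= WINDOW:
                return True
    return False
```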

Perhaps a "sudden motion sensor" for phones, not to detect when the phone is dropped, but to detect when the phone has gone too quickly from freeway speed to zero (perhaps with the microphone picking up collision noises, or sounds of distress), auto-dialing a 911-like emergency service.
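The crash rule reduces to watching for a physically implausible deceleration in the speed readings. A hedged sketch, with thresholds invented for illustration:

```python
# Toy crash detector: flag a stop from highway speed that happens far
# faster than any normal braking. All thresholds are illustrative.
FREEWAY_MPH = 55
STOPPED_MPH = 5
MAX_INTERVAL = 3.0  # seconds

def detect_crash(samples):
    """samples: chronological (timestamp_seconds, speed_mph) pairs.
    Returns True if speed falls from freeway pace to near zero
    within MAX_INTERVAL seconds between consecutive readings."""
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        if v0 >= FREEWAY_MPH and v1 <= STOPPED_MPH and t1 - t0 <= MAX_INTERVAL:
            return True
    return False
```

In practice a system like this would cross-check the microphone, as the text suggests, before dialing anyone.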

These are just a few simple examples, relying on some fairly basic rules. But imagine if you combine the sensory awareness with a more complex Bayesian-style learning system. What if your digital device could learn your habits, and adjust accordingly?
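A "Bayesian-style" habit learner can be surprisingly small: count how often you've taken some action in a given context, and smooth the estimate so the device isn't overconfident on thin evidence. This toy model is purely illustrative; the class name and context labels are made up:

```python
from collections import defaultdict

class HabitModel:
    """Toy habit learner: estimates the probability that the user takes
    some action (say, muting the speakers) in a given context (say, the
    hour of day), with Laplace smoothing so sparse data stays near 50/50."""

    def __init__(self):
        self.yes = defaultdict(int)
        self.total = defaultdict(int)

    def observe(self, context, did_action):
        self.total[context] += 1
        if did_action:
            self.yes[context] += 1

    def probability(self, context):
        # With no evidence at all, this returns 1/2 rather than
        # jumping to certainty.
        return (self.yes[context] + 1) / (self.total[context] + 2)
```

With three "muted" observations and one "didn't mute" at 9am, the model estimates roughly a 67% chance you'll want things muted at 9am tomorrow, while a never-seen context stays at 50%.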

Read the rest here