Sensor-rich computing: the quiet revolution that started in your pocket

Suppose you're at your desk with a MacBook and an iPhone. You want to check screening times at your local multiplex and the weather forecast, so you'll know whether you'll need a jacket. Which device do you reach for?

If you choose the MacBook, you'll need to go to a cinema listings service and enter your zip code to find your multiplex, then repeat the process on a weather tracking site (unless your Dashboard already includes appropriate widgets for that data, in which case it's one keystroke away). On the iPhone, you just load a couple of apps, which know where you are and so can show you the local data automatically.

This exact scenario happened to me earlier, and I surprised myself by reaching for the iPhone without a second thought. Somewhere along the way it started to feel like the logical device to use for this sort of thing.

Now, this is a trivial example. Safari and Firefox on the MacBook can do location sensing via Wi-Fi positioning, for example (although few websites support it), and sites can also try to guess your location from your IP address (although I've found that to be quite inaccurate at times). Local information services like cinema listings will also typically offer to store your location for future use, so the search becomes a one-time thing. Stick with me, though; I'm going somewhere with this.

Think about the bigger picture. Go back a few years, and computers typically had just two input devices: a keyboard and a mouse. Some of them would also have a webcam and a scanner, but you'd only use those occasionally, for specific, well-defined tasks (mostly "Skype" and "scanning", respectively).

Now think about the input methods on an iPhone 4. The keyboard and mouse have been replaced by a touchscreen, of course, and the front and back cameras stand in for the webcam and scanner. But wait! There's more!

  • It has a GPS chip, of course, so it can tell where it is (bolstered by assisted GPS, which uses the cellular network for a faster, less battery-hungry fix.)

  • It has a magnetometer-based compass, so it can tell which way it's pointing (and sense any magnetic fields.)

  • It has a gyroscope and accelerometers, so it can tell when you move it -- and in which direction, how far, and how fast.

  • It has an ambient light sensor to adjust screen brightness according to your surroundings.

  • It has two microphones -- one for your voice, and one purely for background sounds for noise-cancellation purposes.

  • It has an orientation sensor so it knows which way up it is.

And that's just the iPhone itself, before we consider add-on hardware like Nike+. Compared with almost all the preceding technology in the thirty-something years since the Apple II kicked off the personal computer revolution, the iPhone has an extraordinary number of ways to perceive the world around it. That, in turn, opens up huge possibilities for apps that do a much better job of anticipating our needs based on our surroundings and, as a result, feel far more personal than the "personal computer" ever did.
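To make that list of senses concrete, here's a rough sketch of how an app can tap several of them at once through Apple's Core Motion and Core Location frameworks. It's written in Swift for brevity, and the `SensorWatcher` class is my own invention rather than any shipping app's code:

```swift
import CoreMotion
import CoreLocation

/// Rough sketch (not production code) of listening to several iPhone sensors
/// at once. The framework calls are Apple's real Core Motion / Core Location
/// APIs; the class itself is invented for illustration.
final class SensorWatcher: NSObject, CLLocationManagerDelegate {
    private let motion = CMMotionManager()
    private let location = CLLocationManager()

    func start() {
        // GPS / aGPS: where the device is.
        location.delegate = self
        location.requestWhenInUseAuthorization()
        location.startUpdatingLocation()

        // Magnetometer: which way the device is pointing.
        location.startUpdatingHeading()

        // Gyroscope + accelerometer, fused into one "device motion" stream.
        if motion.isDeviceMotionAvailable {
            motion.deviceMotionUpdateInterval = 1.0 / 30.0
            motion.startDeviceMotionUpdates(to: .main) { data, _ in
                guard let data = data else { return }
                print("Pitch: \(data.attitude.pitch), roll: \(data.attitude.roll)")
            }
        }
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        if let here = locations.last {
            print("Lat/long: \(here.coordinate.latitude), \(here.coordinate.longitude)")
        }
    }

    func locationManager(_ manager: CLLocationManager, didUpdateHeading newHeading: CLHeading) {
        print("Pointing \(newHeading.magneticHeading)° from magnetic north")
    }
}
```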

Perhaps the simplest examples are apps that use device rotation to switch the entire UI to something different. The iPod app, for example, switches between Cover Flow selection and the more traditional list UI. Calvetica shows day and month views in portrait and a week planner in landscape. Weightbot allows daily weight entry in portrait, then goes one stage further -- turn the device counter-clockwise for a summary view, clockwise for a graph of your weight loss (or, in my case, lack of it). Another simple example is Instapaper's option to switch to a muted white-on-black color scheme when your phone's local clock indicates it's night-time.
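At the API level, this kind of orientation-driven switch boils down to listening for rotation notifications and swapping views. A minimal sketch, assuming a hypothetical pair of `portraitView` and `landscapeView` subviews:

```swift
import UIKit

/// Minimal sketch: show one subview in portrait and another in landscape.
/// `portraitView` and `landscapeView` are placeholder views for illustration.
final class RotatingViewController: UIViewController {
    let portraitView = UIView()
    let landscapeView = UIView()
    private var observer: NSObjectProtocol?

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(portraitView)
        view.addSubview(landscapeView)
        updateForOrientation()

        // UIDevice posts a notification whenever the physical orientation changes.
        UIDevice.current.beginGeneratingDeviceOrientationNotifications()
        observer = NotificationCenter.default.addObserver(
            forName: UIDevice.orientationDidChangeNotification,
            object: nil, queue: .main) { [weak self] _ in
                self?.updateForOrientation()
        }
    }

    deinit {
        if let observer = observer {
            NotificationCenter.default.removeObserver(observer)
        }
    }

    private func updateForOrientation() {
        let landscape = UIDevice.current.orientation.isLandscape
        portraitView.isHidden = landscape
        landscapeView.isHidden = !landscape
    }
}
```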

More sophisticated is the use of the various motion-sensing circuitry for games like Rage, which allow players to aim their guns in the game by moving their devices around. Spin the same tech another way and you get 3D panorama photography apps like 360 Panorama. Here, you move and tilt your phone in a complete circle and the app uses the camera to build a complete image of your surroundings -- and then uses the same tilt sensing to let you view the photos, panning the picture as you pivot on the spot.
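The panning half of that trick can be approximated with the same device-motion feed: read the attitude's yaw and scroll the stitched image to match. Here's a rough sketch; the scroll-view wiring and the full-turn mapping are my own guesses, not 360 Panorama's actual code:

```swift
import CoreMotion
import UIKit

/// Rough sketch: pan a wide panorama image as the user pivots on the spot.
final class PanoramaViewController: UIViewController {
    /// Assumed to be configured elsewhere with the stitched panorama as its content.
    let panoramaScrollView = UIScrollView()
    private let motion = CMMotionManager()

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 60.0
        motion.startDeviceMotionUpdates(to: .main) { [weak self] data, _ in
            guard let self = self, let data = data else { return }
            // Yaw is the rotation around the vertical axis, in radians.
            // Map a full turn (2π) onto the full width of the panorama.
            let fraction = (data.attitude.yaw + .pi) / (2 * .pi)
            let maxOffset = self.panoramaScrollView.contentSize.width
                          - self.panoramaScrollView.bounds.width
            self.panoramaScrollView.contentOffset.x = CGFloat(fraction) * maxOffset
        }
    }
}
```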

Take this idea to the next level and you get augmented reality, which is probably the poster child for sensor-rich computing at the moment. Combine the motion tracking with the camera feed and, as the user waves their phone around, show them the live camera view with extra elements layered on top. Star Wars Trench Run (sadly no longer available in iTunes) superimposes TIE Fighters on whatever you are looking at and allows you to shoot them down. Less frivolously, the AR mode in the Yelp app is a very intuitive way to get your bearings when navigating to a restaurant.
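Under the hood, the basic AR placement problem is simple trigonometry: compare the compass heading with the bearing to a point of interest, then decide where (and whether) to draw a marker over the live camera feed. A back-of-the-envelope sketch, where the helper name and the assumed 60° field of view are mine rather than any app's real implementation:

```swift
import CoreLocation
import UIKit

/// Back-of-the-envelope AR math: where on screen should a marker for `poi`
/// appear, given the current location and compass heading? Returns nil when
/// the point of interest is outside the camera's (assumed) field of view.
func markerX(for poi: CLLocationCoordinate2D,
             from here: CLLocationCoordinate2D,
             heading: CLLocationDirection,
             screenWidth: CGFloat,
             fieldOfView: Double = 60) -> CGFloat? {
    // Bearing from the user to the point of interest, in degrees from north.
    let lat1 = here.latitude * .pi / 180, lat2 = poi.latitude * .pi / 180
    let dLon = (poi.longitude - here.longitude) * .pi / 180
    let y = sin(dLon) * cos(lat2)
    let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
    let bearing = (atan2(y, x) * 180 / .pi + 360).truncatingRemainder(dividingBy: 360)

    // How far off-centre is the POI relative to where the camera points?
    var offset = bearing - heading
    if offset > 180 { offset -= 360 }
    if offset < -180 { offset += 360 }

    // Only draw the marker if it falls inside the camera's field of view.
    guard abs(offset) <= fieldOfView / 2 else { return nil }
    return screenWidth / 2 + CGFloat(offset / (fieldOfView / 2)) * (screenWidth / 2)
}
```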

Yelp is far from alone. There are lots of augmented reality apps for the iPhone that do all sorts of things -- translate foreign languages, give metadata on live events like gigs, even create floorplans of your house.

This trend isn't limited to smartphones, either. Hasselblad's H4D-200MS medium-format digital camera has a clever new feature called True Focus. The photographer sets a focus point, perhaps a model's eye, and then software in the camera watches the scene. As the photographer moves the camera around to change the composition, motion sensors in the camera body feed back to the software, which adjusts the focus setting to compensate, keeping that point in perfect focus. By using the motion sensors, the camera can do a better job of staying on target than traditional autofocus subject tracking.
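The geometry behind this is the classic focus-and-recompose error: once the camera yaws away from the focused point, that point's distance to the new focal plane shrinks by roughly the cosine of the yaw angle. A toy illustration of the idea, which is emphatically not Hasselblad's actual algorithm:

```swift
import Foundation

/// Toy illustration of the problem True Focus addresses: after focusing on a
/// point `distance` metres away and then yawing the camera by `yawDegrees` to
/// recompose, the point now sits closer to the new focal plane.
func correctedFocusDistance(original distance: Double, yawDegrees: Double) -> Double {
    return distance * cos(yawDegrees * .pi / 180)
}

// Focus on an eye 3 m away, then swing the camera 20° to recompose:
// the eye is now about 2.82 m from the focal plane, so without a correction
// like True Focus it would drift slightly out of focus.
print(correctedFocusDistance(original: 3.0, yawDegrees: 20))  // ≈ 2.819
```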

But what about the future? I think we've barely scratched the surface of the possibilities. One key drawback, though, is that iOS's relatively limited multitasking means some of the wilder ideas from third-party devs simply aren't possible.

For example, consider the rather dizzying possibilities of Tasker, an Android app that can engage a child lock for all apps except a few games when you are at home and the time is before 9pm. Or sense when you are at your parents' house in the boondocks and disable 3G altogether, so your phone doesn't hammer its battery flat trying to hang on to a weak signal. Or mute all notification sounds between 11pm and 7am, except for calls from numbers in your phonebook. Or any one of a million other possibilities, all of them ways to leverage your phone's knowledge of its surroundings to make it adapt to you, rather than the other way around.
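The common thread is plain conditional logic driven by context the phone already has. Stripped of its GUI, a Tasker rule is essentially something like this conceptual sketch, where the `Context` type and its fields are invented for illustration:

```swift
/// Conceptual sketch of Tasker-style rules. The `Context` struct and its
/// fields are invented for illustration; Tasker's real configuration is a
/// GUI, not code.
struct Context {
    let placeName: String      // e.g. "home", "parents", "office"
    let hour: Int              // 0-23, from the phone's clock
    let callerInContacts: Bool // is the incoming call from the phonebook?
}

func shouldMuteNotification(_ ctx: Context) -> Bool {
    // "Mute all notification sounds between 11pm and 7am,
    //  except calls from numbers in my phonebook."
    let quietHours = ctx.hour >= 23 || ctx.hour < 7
    return quietHours && !ctx.callerInContacts
}

func shouldDisable3G(_ ctx: Context) -> Bool {
    // "At my parents' house, don't hammer the battery chasing a weak signal."
    return ctx.placeName == "parents"
}
```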

Another interesting idea was suggested by Dan Frakes, Macworld's senior editor: put an unlock code on your iPhone that is automatically disabled when you are at home. It turns out this is already possible for jailbroken iPhones with CleverPin, but of course that sort of deep system modification isn't possible without jailbreak tools.

We can only hope that Apple will do more to embrace the possibilities of sensor-rich computing in future iOS versions and give developers more flexibility to access these features. We've all seen how iOS 5's Reminders app offers location-aware alerts (e.g. "remind me to put my lunch in the fridge when I arrive at the office"), and we know Apple is patenting location-aware traffic warnings. It would be very nice for all of us if Apple opened this up to third-party developers, so an app could register a pre-arranged task, such as a notification alert, to run the next time the phone is in a certain location.
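To make the wish concrete, here's roughly the shape such a hook could take, sketched in Swift on top of Core Location's existing region monitoring. The office coordinates, the radius and the task itself are placeholders of mine, not anything Apple has shipped or promised:

```swift
import CoreLocation

/// Sketch of the kind of hook described above: "when the phone next enters
/// this region, run my pre-arranged task." Built on Core Location's region
/// monitoring; all specifics here are placeholders.
final class LocationTask: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private let task: () -> Void

    init(performWhenEntering region: CLCircularRegion, task: @escaping () -> Void) {
        self.task = task
        super.init()
        manager.delegate = self
        manager.requestAlwaysAuthorization()
        manager.startMonitoring(for: region)
    }

    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        task()   // e.g. show "put my lunch in the fridge"
    }
}

// Usage: fire the reminder the next time the phone arrives at the office.
let office = CLCircularRegion(
    center: CLLocationCoordinate2D(latitude: 37.33, longitude: -122.03),
    radius: 150, identifier: "office")
let lunchReminder = LocationTask(performWhenEntering: office) {
    print("Put your lunch in the fridge!")
}
```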

I hope to see something like this in future iOS versions, because I think there are still plenty of ideas that no one has had yet. What awesome uses of sensor-rich computing do you like in your apps, and what would you like to see in the future?