While patrolling the halls of the CHI 2013 conference on Human Factors in Computing Systems in Paris, we spied a research project from MIT's Media Lab called "Smarter Objects" that turns Minority Report tech on its head. The researchers figured out a way to map software functionality onto tangible objects like a radio, light switch or door lock through an iPad interface and a simple processor / WiFi transceiver in the object. Researcher Valentin Huen explains that "graphical user interfaces are perfect for modifying systems," but operating them on a day-to-day basis is much easier with tangible objects.
To that end, the team developed an iPad app that uses motion tracking technology to "map" a user interface onto different parts of an object. The example we saw was a simple radio with a pair of dials and a speaker; when the iPad's camera was pointed at it, a circular interface along with a menu system popped up that cannily tracked the radio. From there, Huen mapped various songs onto different positions of the knob, allowing him to control his playlist by turning it -- a simple, manual interface for selecting music. He was even able to activate a second speaker by drawing a line to it, then "cutting" the line to shut it off. We're not sure when, or if, this kind of tech will ever make it into your house, but the demo we saw (see the pair of videos after the break) seemed impressively ready to go.
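To get a feel for the knob-to-playlist idea, here's a minimal sketch (not the Smarter Objects code, and all names are hypothetical): the knob's rotation range is divided into equal segments, one per song, and the current angle selects the track.

```python
# Illustrative sketch only: divide a knob's rotation range into
# equal segments and map each segment to a song, as in the radio demo.

def build_knob_map(songs, max_angle=270.0):
    """Split the knob's rotation range into one segment per song."""
    segment = max_angle / len(songs)
    return [(i * segment, (i + 1) * segment, song)
            for i, song in enumerate(songs)]

def song_for_angle(knob_map, angle):
    """Return the song whose segment contains the knob's current angle."""
    for start, end, song in knob_map:
        if start <= angle < end:
            return song
    return knob_map[-1][2]  # clamp angles at or past the end

playlist = ["Track A", "Track B", "Track C"]
knob = build_knob_map(playlist)
print(song_for_angle(knob, 100.0))  # 100° falls in the second segment
```

In the actual demo the angle would come from the iPad's vision-based tracking of the physical dial rather than from a hardcoded value.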