Microsoft is certainly no stranger to multi-touch interfaces, but it looks to really be pushing things with its new SideSight research project, which promises to do away with that pesky need to actually touch the screen. To do that, Microsoft proposes ringing a device with proximity sensors that can detect gestures up to ten centimeters away, with a quick motion toward the device registering as a click, for instance, and a twisting motion rotating an image. While Microsoft has built a prototype of sorts using an HTC Touch (seen above), the system still has quite a ways to go: the sensors on the prototype are actually connected to a PC over USB, which relays the results back to the phone via Bluetooth. Eventually, however, Microsoft says the approach could bring multi-touch to very small devices, possibly even using printed sensors that cover the entire casing.
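To get a feel for the idea, here's a minimal, purely illustrative sketch of how a "motion toward the device equals click" rule might work on a stream of proximity readings. The function name, thresholds, and ten-centimeter range cutoff are assumptions for the example, not anything from Microsoft's actual SideSight implementation.

```python
# Hypothetical sketch of SideSight-style gesture detection. The names
# and thresholds below are illustrative assumptions, not Microsoft's code.

def detect_click(distances_cm, max_range_cm=10.0, click_drop_cm=5.0):
    """Treat a quick motion toward the device (a sharp drop in measured
    distance, within the sensors' ~10 cm range) as a 'click'."""
    # Ignore samples beyond the sensors' working range.
    in_range = [d for d in distances_cm if d <= max_range_cm]
    if len(in_range) < 2:
        return False
    # A click is a large net decrease across the in-range samples.
    return in_range[0] - in_range[-1] >= click_drop_cm

# A finger sweeping in from 9 cm to 2 cm reads as a click:
print(detect_click([9.0, 7.5, 5.0, 3.0, 2.0]))  # True
# A finger merely hovering nearby does not:
print(detect_click([8.0, 7.8, 8.1, 7.9]))       # False
```

A real system would work per-sensor and over timestamps (so "quick" actually means fast), but the same thresholding idea applies.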

[Image courtesy CNET News, thanks Peter]


Microsoft SideSight project promises to take multi-touch beyond the screen