If you ever get tired of poking away at your smartphone's screen like a doorbell, you're not alone. The forward-looking folks over at Microsoft Research have been working away at a new touchscreen system designed to pick up on more natural, whole-hand movements, effectively allowing users to break free from the finger-based paradigm that governs most tactile devices. Developed in coordination with engineers at Microsoft Surface, the company's Rock and Rails interface can detect three basic hand gestures: a balled fist, which holds items on the screen; an extended hand, which can align objects (see the cell marked "d" on the right); and a curved paw, around which users can pivot images (see cell b). This taxonomy opens up new ways for users to crop, resize or generally play around with their UI elements, though it remains unclear whether the technology will trickle down to the consumer level anytime soon. For now, it appears to operate exclusively on the Surface, but more details should surface when the system's developers release a paper on their project later this year. Hit the source links to see a video of the thing in action.
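For the curious, the three-gesture taxonomy described above boils down to a simple mapping from detected hand shape to interaction mode. Here's a minimal sketch of that idea in Python; the shape names and action strings are our own assumptions for illustration, not Microsoft's actual API.

```python
# Hypothetical sketch of the Rock and Rails gesture taxonomy.
# Shape names and actions are illustrative assumptions, not Microsoft's API.
from enum import Enum, auto

class HandShape(Enum):
    FIST = auto()       # balled fist: holds items on the screen
    FLAT_EDGE = auto()  # extended hand: acts as an alignment edge
    CURVED = auto()     # curved paw: a pivot point for images

def interaction_mode(shape: HandShape) -> str:
    """Map a detected hand shape to the interaction it enables."""
    return {
        HandShape.FIST: "hold item in place",
        HandShape.FLAT_EDGE: "align objects along the hand's edge",
        HandShape.CURVED: "pivot image around the hand",
    }[shape]

print(interaction_mode(HandShape.FIST))  # hold item in place
```

A real system would, of course, sit this mapping behind a shape-recognition layer that classifies the raw contact blob from the touch sensor before any of these modes kick in.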