The University at Buffalo's mad scientists are hoping that the "Fingertip Digitizer" will kick off the next phase in computer interfaces by harnessing people's learned physical motions to kill the UI learning curve. All one needs to do is slip the sleeve onto their finger, and the device's thin-film embedded force sensors and tri-axial accelerometer will track their movements in real time, even providing tactile feedback corresponding to physical motions and virtual environments. One might reminisce about a Tom Cruise flick or other implementations of gesture interface control, but the "Fingertip Digitizer" works a bit differently than most by allowing the user not only to command the system with motion, but also to feel it (perhaps most similarly to Novint's Falcon). For example, if you move your hand to pick up a ball, you not only instruct the computer to grab the ball, but also feel its pressure and weight in your hand; should you motion pulling the trigger on a sniper rifle in an FPS, you'll feel that 1.5-pound hair trigger move ever so slightly under your index finger. The whole system will be on display at this year's SIGGRAPH if you're nearby and want to peep one vision of future haptic interfaces; otherwise, you'll have to wait the estimated three years until the system reaches commercial applications to get your mitts on these mitts.
