EyePoint software improves vision-based input

We've all seen them before: high-tech eye-tracking systems that let disabled folks control their computers by shifting their gaze, or those cheaper models that promise to free up gamers' hands and enable vision-based navigation. The problem with current systems -- both medical and recreational -- is that they have trouble compensating for the rapid, unconscious movements our eyes make constantly, leaving them prone to frustrating errors and only meager functionality. Well, that may all be about to change thanks to a Stanford researcher named Manu Kumar and his EyePoint software, which runs on the same multi-thousand dollar hardware as existing setups but improves on their accuracy with gaze-steadying algorithms and by throwing the user's hand into the mix. Someone using a rig powered by EyePoint first looks at the general area of the screen they're interested in, presses and holds a key to magnify that area, then fixates on the exact link or bit of text and releases the key to complete the click. Kumar's ultimate goal is to bring eye-tracking hardware and software to the masses at affordable prices, but right now the 20% error rate means his system is still too flaky for everyday use; further refinement of the algorithms to incorporate peripheral vision may help somewhat, but he'll have to push the accuracy above 95% if there's any hope of widespread adoption. Or products like the Project Epoc thought-controlled helmet could end up making a big splash and completely obviating the need for what is essentially a souped-up late 19th Century technology.
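For the curious, here's a minimal sketch of how the two tricks described above could fit together: a rolling median damps the eye's involuntary jitter, and the press-to-magnify step shrinks whatever jitter remains by the zoom factor. This is an illustration in Python, not Kumar's actual code -- the window size, zoom factor, and tracker noise model are all made-up assumptions.

```python
import random
import statistics
from collections import deque

GAZE_WINDOW = 10  # recent samples to smooth over (assumed value)
ZOOM = 4          # magnification factor of the confirmation view (assumed)

def smooth(samples):
    """Damp rapid, involuntary eye movements with a rolling median."""
    xs, ys = zip(*samples)
    return statistics.median(xs), statistics.median(ys)

def read_gaze(point, jitter=25.0):
    """Stand-in for a hardware tracker: the true fixation plus noise."""
    x, y = point
    return x + random.gauss(0, jitter), y + random.gauss(0, jitter)

def eyepoint_click(target):
    """Simulate one look-press-look-release selection of `target`."""
    window = deque(maxlen=GAZE_WINDOW)

    # Phase 1: user looks at the general area; the hotkey press captures
    # a smoothed estimate, which becomes the center of the zoomed view.
    for _ in range(GAZE_WINDOW):
        window.append(read_gaze(target))
    center = smooth(window)

    # Phase 2: the region around `center` is shown ZOOM times larger, so
    # the target now sits farther from the zoom center on screen.
    screen_target = (center[0] + (target[0] - center[0]) * ZOOM,
                     center[1] + (target[1] - center[1]) * ZOOM)
    window.clear()
    for _ in range(GAZE_WINDOW):
        window.append(read_gaze(screen_target))
    sx, sy = smooth(window)

    # Releasing the key maps the screen fixation back to page coordinates;
    # the gaze jitter gets divided by ZOOM, which is the accuracy gain.
    return (center[0] + (sx - center[0]) / ZOOM,
            center[1] + (sy - center[1]) / ZOOM)

random.seed(0)
print(eyepoint_click((400, 300)))  # lands near (400, 300) despite the noise
```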

[Via Gadget Lab]