eyetracking

Latest

  • Valve experiments with players' sweat response, eye-tracking controls for future game design

    by Mat Smith
    05.07.2013

    Valve has a surprisingly varied staff roster. Mike Ambinder is the company's very own experimental psychologist, and he's been outlining some of Valve's work with biofeedback technology, including eye-motion controls for Portal 2 and perspiration-based gameplay adjustments in Left 4 Dead. Mentioning these developments at the NeuroGaming Conference last week, Ambinder notes that both are still at an experimental stage, but that "there is potential on both sides of the equation, both for using physiological signals to quantify an emotion [and] what you can do when you incorporate physiological signals into the gameplay itself." In Left 4 Dead, test subjects had their sweat monitored, with values assigned to how much they were responding to the action. This data was fed back into the game, where designers attempted to modify (and improve) the experience. In a test where players had four minutes to shoot 100 enemies, calmer participants would progress normally, but if they got nervous, the game would speed up and they would have less time to shoot. When it came to the eye-tracking iteration of Portal 2, the new controls apparently worked well, but required separating aiming and viewpoint controls. With Valve already involved in wearable computing, both notions should be easier to accomplish if the company decides to bring either experiment to fans. VentureBeat managed to record Ambinder's opening address at the conference -- we've added it after the break.
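
    What Ambinder describes is essentially a biofeedback difficulty loop: an arousal signal derived from skin conductance feeds back into the game, shrinking the remaining time as the player gets more nervous. Here's a minimal sketch of that loop in Python -- the sensor source, scaling factor and update rate are all our assumptions, not details from Valve's implementation.

    ```python
    # Hypothetical sketch of the Left 4 Dead time-pressure experiment:
    # the calmer the player, the closer the clock runs to real time;
    # the more aroused (sweaty) they are, the faster it drains.

    def read_arousal() -> float:
        """Placeholder for a normalized skin-conductance reading (0 = calm, 1 = nervous)."""
        return 0.0  # stand-in value; a real sensor driver would go here

    def run_trial(total_seconds: float = 240.0, targets: int = 100,
                  max_speedup: float = 2.0, tick: float = 0.1) -> bool:
        """Return True if the player hits all targets before the clock runs out."""
        remaining = total_seconds
        shot = 0
        while remaining > 0 and shot < targets:
            arousal = read_arousal()                     # 0 = calm, 1 = very nervous
            drain = 1.0 + arousal * (max_speedup - 1.0)  # nervous players lose time faster
            remaining -= tick * drain
            shot += 1  # stand-in for actual game logic registering a kill
        return shot >= targets

    if __name__ == "__main__":
        print("finished in time:", run_trial())
    ```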

  • The Eye Tribe aims to bring its eye-tracking tech to Android devices with SDK this June

    by Donald Melanson
    04.17.2013

    Eye-tracking technology on Android devices isn't exactly anything new, but Danish startup The Eye Tribe is now looking to broaden its use further with its own new set of tools. The company has been showing off its tech since last year, but it's taken advantage of this week's DEMO Mobile conference to officially launch it, and reveal that its SDK will be available to developers this June (they can sign up now if they're interested). As for the tech itself, it promises to allow for everything from eye-activated logins to gaze-based controls to user engagement monitoring, but it won't simply work on every Android smartphone or tablet. It has some basic hardware requirements that the company says will only cost manufacturers an extra dollar in materials. In the meantime, you can get an idea of some of its capabilities at the company's site linked below.

  • LG outs eye recognition tech for Optimus G Pro, other features in April update

    by Alexis Santos
    03.13.2013

    Sure, there's been a lot of buzz about possible eye-based scrolling in Samsung's Galaxy S IV, but LG's in the eye-recognition spotlight -- for today, at least. The electronics giant has revealed that a "Value Pack" update for the Optimus G Pro will be served up in Korea next month, and will pack a feature called Smart Video that responds to a user's peepers. With its front-facing camera, the handset will pause a video if the user looks away, and start playing it when their gaze falls back on the display. In addition, the upgrade will pack what's said to be a world-first Dual Camera feature (taking a page from the phone's Dual Recording feature, of course), which creates picture-in-picture shots by using the hardware's two cameras. Devices will also receive the ability to change the home button's LED to correlate with contacts, pause and resume video recording, color emoticons and refreshed QRemote functionality. According to LG, the update's features will find their way to its other premium smartphones in the future, but there's no word on when the revamped software will arrive on phones in other territories. Hit the break for more details in the press release.
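
    Functionally, Smart Video reduces to a presence check from the front camera driving the player's play/pause state. The sketch below shows that control flow with the gaze detector left as an assumed stub -- LG hasn't published an API for the feature, so every name here is illustrative.

    ```python
    # Toy sketch of a "pause when the viewer looks away" loop.
    # viewer_is_looking() stands in for whatever face/gaze detector the
    # front-facing camera feeds; it is not LG's actual API.
    import time

    def viewer_is_looking() -> bool:
        return True  # stub: replace with a real gaze/face detection call

    class Player:
        def __init__(self):
            self.playing = False
        def play(self):
            if not self.playing:
                self.playing = True
                print("resume")
        def pause(self):
            if self.playing:
                self.playing = False
                print("pause")

    def watch_loop(player: Player, poll_interval: float = 0.1, iterations: int = 5):
        for _ in range(iterations):
            if viewer_is_looking():
                player.play()
            else:
                player.pause()
            time.sleep(poll_interval)

    watch_loop(Player())
    ```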

  • Insert Coin: NUIA's eyeCharm brings Kinect-assisted eye tracking (video)

    by Steve Dent
    03.11.2013

    In Insert Coin, we look at an exciting new tech project that requires funding before it can hit production. If you'd like to pitch a project, please send us a tip with "Insert Coin" as the subject line. While there are countless eye-tracking devices in various stages of research, development and speculation, few so far have shown what you'd call a wallet-friendly consumer face. NUIA intends to fix that with eyeCharm, a new Kickstarter project that would give you gaze control of your computer with a software suite and Kinect-attached device. We saw similar tech from the company earlier that used a Tobii eye tracker, but to work with the more consumer-friendly (and widespread) Kinect, NUIA created the eyeCharm clip-on that adds special optics and illumination to its infrared camera. A suite of apps will get you started with Windows 7/8 functionality, while an included SDK will let developers create extensions for apps -- which will also work with other eye-tracking devices, according to NUIA. For $60 you'll get the hardware (a prototype is shown above), along with existing apps developed by 4tiitoo and the NUIA SDK, with delivery estimated by July. To see it in action, check the video after the break or hit the source to pledge.

  • NYT: Samsung Galaxy S IV will tout eye-based scrolling

    by Jon Fingas
    03.04.2013

    One of the Galaxy S III's most vaunted features was Smart Stay: when it was active, the smartphone's display would stay awake as long as its owner did. A reported Samsung insider's tip to the New York Times claims the Galaxy S IV will take that intelligent use of the camera one step further with eye-based scrolling. Theoretically, readers will never have to put finger to glass when scrolling downwards; the phone can tell when they're looking at the bottom of the page and move to the next section on its own. The hands-off scrolling is supposedly part of a strategy where the software ultimately matters more than the hardware. Chief product officer Kevin Packingham wouldn't confirm anything for the newspaper when asked, although he didn't feel the hardware would take a back seat. Either way, consider us intrigued -- as long as the software is real and works in practice. We'll know the full story in several days.
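
    Conceptually, the reported feature only needs the vertical gaze coordinate: once it lingers near the bottom of the page for a moment, advance to the next section. A rough sketch under those assumptions -- the gaze feed, threshold and dwell time are invented for illustration, not Samsung's values.

    ```python
    # Rough sketch of gaze-driven scrolling: if the normalized vertical gaze
    # position stays below a "near the bottom" threshold long enough, scroll.
    # All thresholds and the gaze feed are assumptions, not Samsung's values.

    BOTTOM_THRESHOLD = 0.85   # gaze_y is 0.0 at the top of the screen, 1.0 at the bottom
    DWELL_SECONDS = 0.7       # how long the gaze must linger before scrolling

    class GazeScroller:
        def __init__(self):
            self.dwell = 0.0

        def update(self, gaze_y: float, dt: float) -> bool:
            """Feed one gaze sample; return True when the page should scroll."""
            if gaze_y >= BOTTOM_THRESHOLD:
                self.dwell += dt
            else:
                self.dwell = 0.0          # reset as soon as the gaze moves back up
            if self.dwell >= DWELL_SECONDS:
                self.dwell = 0.0          # one scroll per dwell
                return True
            return False

    scroller = GazeScroller()
    samples = [0.5, 0.9, 0.92, 0.95, 0.9, 0.88, 0.91, 0.3]
    for y in samples:
        if scroller.update(y, dt=0.2):
            print("scroll to next section")
    ```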

  • SMI Eye-Tracking 3D Glasses use rim-based cameras to adjust perspective

    by Zach Honig
    02.04.2013

    Can 3D glasses get any less fashionable? Of course they can! And here's some proof. Today, SensoMotoric Instruments (SMI) teased its new Eye-Tracking 3D Glasses, which use a pair of small cameras mounted to the eyeglass rim to keep tabs on your gaze, adjusting perspective as you look about a scene. The rig uses ActiveEye technology from Volfoni, and can detect eye distance as well in order to provide a 3D-viewing experience that's optimized for each user. Adding some optional hardware, including optical targets (as seen in the video after the break), can enable 6D head tracking support for an even more immersive experience. Naturally, the solution is a bit cumbersome, and while pricing hasn't been announced, we don't expect it to come cheap -- it's definitely something you'd be more likely to see implemented as part of a virtual reality system, rather than a device you'd use at home. So, while you may never see such a product in the flesh, you can still get an idea of how it'll work in the video after the break.

  • Hyundai unveils HCD-14 Genesis concept: suicide doors, gesture and eye controls

    by Richard Lawler
    01.14.2013

    At NAIAS 2013, Hyundai has given an indication of where its "premium vehicles" are headed with its HCD-14 Genesis concept. Sporting a sharp-edged style and suicide doors, the sedan gets even better inside, with a control layout that forgoes the traditional knobs and buttons. According to Hyundai (it wasn't demonstrated), the setup includes eye tracking and 3D hand gesture recognition accurate enough to control navigation, infotainment, audio, HVAC and one's phone. The RWD vehicle packs a 5.0-liter Hyundai Tau V8 engine under the hood, along with optical recognition that verifies its driver before starting. Hyundai stated that there would be two vehicles on the way following this concept's design, with the second including even more of its advanced tech. Check out the full list in the press release after the break, as well as a good look at the car in our gallery.

  • Hands-on with Tobii REX, a peripheral that brings eye-tracking to any Windows 8 PC (video)

    by Dana Wollman
    01.06.2013

    One of the neat things about CES is that it gives us a chance to check in with startups we covered the previous year. In the case of Tobii, 12 months makes a world of difference. When we met with the company last January, it had never publicly shown off its eye-tracking Gaze UI, which allowed us to navigate, zoom, select and scroll on a custom Windows 8 laptop with just our pupils and a touchpad. After playing with it, it was obvious to us the technology still needed some fine-tuning, but nonetheless Tobii promised it would have a product to sell in about a year's time. Fast forward 12 months: Intel now owns a 10 percent stake in the company, and Tobii recently started shipping its first piece of hardware, the REX. This small USB peripheral, just slightly thicker than a pen, attaches to the base of any computer display, bringing eye-tracking technology to any Windows 8 machine. For now, it's only available to developers for a price of $995, but Tobii expects to ship 5,000 consumer units by the end of 2013. Happily for us, though, we got to play with it here at CES 2013. Meet us after the break to see how the technology's grown up since we tried it out a year ago, and then when you're done reading through our impressions, check out the walkthrough video at the end. Follow all the latest CES 2013 news at our event hub.

  • CEATEC 2012 wrap-up: concept cars, eye-tracking tech and motion sensors galore

    by Sarah Silbert
    10.03.2012

    CEATEC, Japan's largest annual electronics show, is winding down here on the outskirts of Tokyo. We've spent the past two days scouring the halls of the Makuhari Messe, digging up no shortage of concept cars, eye-tracking technologies and even the odd Windows 8 device. The star of the show may have been Japanese carrier NTT DoCoMo, with its gaze-controlled prototypes and real-time translation app, but there were plenty of other gadgets on hand to pique our interest -- even if many of them won't make it to market anytime soon. Have a look for yourself by browsing our complete CEATEC 2012 coverage past the break.

  • Fujitsu eye-tracking tech uses built-in motion sensor, infrared LED for hands-free computing (video)

    by Sarah Silbert
    10.02.2012

    Eye-tracking technology looks to be one of the major themes at CEATEC this year. One of many companies demoing a gaze-following setup is Fujitsu, which is showing off a prototype desktop PC with a built-in sensor and infrared LED. This configuration should be cheaper than many other eye-controlled solutions out there, as the components are integrated directly into the computer and no external hardware is needed. It's sweet and simple: the camera captures the reflection of light on the user's eye, and image processing technology then calculates the user's viewing angle to allow for hands-free navigation on-screen. We got a brief eyes-on with Fujitsu's demo, which shows off the eye-controlled tech working with a map application. Even without any detectable calibration, the system did a respectable job of navigating around Tokyo based on how we moved our eyes. Panning from right to left worked especially seamlessly, but moving up and down required a bit more effort -- we caught ourselves moving our whole head a few times. This is an early demonstration of course, though Fujitsu has already enumerated several applications for this technology, from assisting disabled users to simply eliminating the need to look down at the mouse and keyboard. See the gaze detection in action in our hands-on video past the break.
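
    The approach Fujitsu outlines is the classic pupil-center/corneal-reflection (PCCR) technique: measure the offset between the infrared LED's glint and the pupil center in the camera image, then map that vector to a point on screen. A simplified sketch of that mapping follows; the polynomial form is a common choice in the literature and the calibration coefficients are stand-ins, since Fujitsu hasn't published its algorithm.

    ```python
    # Simplified pupil-center/corneal-reflection (PCCR) gaze estimate.
    # The difference vector between the pupil center and the IR glint is
    # mapped to screen coordinates with a calibrated polynomial; the
    # coefficients below are placeholders, not Fujitsu's.

    def gaze_point(pupil, glint, coeffs_x, coeffs_y):
        """Map a (pupil, glint) pair in image pixels to screen coordinates."""
        dx = pupil[0] - glint[0]
        dy = pupil[1] - glint[1]
        # second-order polynomial mapping, a common choice for PCCR trackers
        features = (1.0, dx, dy, dx * dy, dx * dx, dy * dy)
        sx = sum(c * f for c, f in zip(coeffs_x, features))
        sy = sum(c * f for c, f in zip(coeffs_y, features))
        return sx, sy

    # Example with made-up calibration coefficients:
    cx = (960.0, 40.0, 0.0, 0.0, 0.0, 0.0)   # mostly horizontal sensitivity
    cy = (540.0, 0.0, 35.0, 0.0, 0.0, 0.0)   # mostly vertical sensitivity
    print(gaze_point(pupil=(312.0, 240.0), glint=(300.0, 244.0),
                     coeffs_x=cx, coeffs_y=cy))
    ```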

  • Tobii, Fujitsu and NTT DoCoMo partner on eye tracking ibeam tablet, promise a peek in October

    by Jon Fingas
    09.19.2012

    Tobii's eye tracking Gaze UI hasn't been especially portable so far, but we'll soon see that change through a new collaboration involving Fujitsu and NTT DoCoMo. The trio plan to reveal the ibeam, an Android tablet with Tobii's smaller IS20 (formerly the IS-2) detector taking input just through glances. Together, the partners want to show that an eye-driven interface can be more reactive than plain old multi-touch: think turning a page in an e-book while you're holding on to a subway car strap. We're only getting a brief preview as of today, but we're teased with the prospect of a full look at NTT DoCoMo's CEATEC booth in early October. Whether or not ibeam leads to more than a well-that's-nice prototype, though, is still up in the air.

  • Google gets patent for eye tracking-based unlock system, shifty looks get you access

    by James Trew
    08.07.2012

    Look up. Now down. Back up here again? Imagine having to do that every time you wanted to unlock your phone, as this granted Google patent for "Unlocking a screen using eye tracking information" possibly suggests. Okay, it actually looks more like it's intended for the firm's super spectacles, which -- given their general hands-free nature -- makes more sense. The claims are fairly straightforward: unlocking of a device would be granted based on "determining that a path associated with the eye movement substantially matches a path of the moving object". As long as those moving objects aren't moving too fast, we think we can work with that.
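
    The patent's central claim -- that the path of the user's eye movement "substantially matches" the path of an on-screen moving object -- is easy to express as a path-similarity test. One plausible reading is sketched below; the sampling assumptions and the acceptance threshold are ours, not Google's.

    ```python
    # Sketch of "eye path substantially matches object path": compare
    # time-aligned samples of the two paths and accept the unlock attempt
    # if the mean distance stays under a threshold. The threshold and the
    # assumption that samples are already time-aligned are ours, not Google's.
    import math

    def paths_match(eye_path, object_path, threshold=0.05):
        """Both paths are equal-length lists of (x, y) in normalized screen units."""
        if len(eye_path) != len(object_path) or not eye_path:
            return False
        total = 0.0
        for (ex, ey), (ox, oy) in zip(eye_path, object_path):
            total += math.hypot(ex - ox, ey - oy)
        return total / len(eye_path) <= threshold

    obj = [(0.1 * i, 0.5) for i in range(10)]           # object sweeps left to right
    eye = [(0.1 * i + 0.01, 0.51) for i in range(10)]   # gaze tracks it closely
    print("unlock" if paths_match(eye, obj) else "stay locked")
    ```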

  • Researchers use off-the-shelf parts to let you write emails with your eyes (video)

    by Steve Dent
    07.13.2012

    There's a lot of research aimed at helping people with spinal cord injuries or strokes become more self-sufficient, but it often requires some exotic paraphernalia. To buck that trend, scientists from Imperial College London showed that subjects could perform relatively hard tasks like writing messages and playing Pong using eye movement -- with a mere $35 worth of parts. They even showed how well the system worked, with subjects scoring within 20 percent of an able-bodied person after a scant 10 minutes of practice. The tracker works with two video game console cameras and a pair of eyeglasses that, after calibration, can precisely track the pupils -- allowing users to control a cursor or move a paddle. The researchers also figured out how to "click" the eye-mouse by winking, and can even use more precise adjustments to calculate gaze depth -- meaning subjects will be able to perform more complex tasks in the future, like guiding a motorized wheelchair. While by no means the first eye-tracking system we've seen, it's by far the most economical. Check the video after the break to see how it works.
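
    At the control layer, the rig boils down to two things: map the tracked pupil position to a cursor, and treat a deliberate wink as a click. Here's a toy sketch of that layer, assuming calibration has already yielded a simple linear mapping -- the Imperial College team's actual pipeline is more involved.

    ```python
    # Toy control layer for a glasses-mounted pupil tracker: map the pupil
    # position (in camera pixels) to a screen cursor and treat "one eye
    # closed, the other open" as a click. The linear calibration and the
    # detection stubs are assumptions for illustration.

    SCREEN_W, SCREEN_H = 1920, 1080
    CAM_W, CAM_H = 640, 480

    def pupil_to_cursor(px: float, py: float) -> tuple[int, int]:
        """Linear mapping from camera coordinates to screen coordinates."""
        x = int(px / CAM_W * SCREEN_W)
        y = int(py / CAM_H * SCREEN_H)
        return min(max(x, 0), SCREEN_W - 1), min(max(y, 0), SCREEN_H - 1)

    def is_click(left_eye_open: bool, right_eye_open: bool) -> bool:
        """A deliberate wink (exactly one eye closed) registers as a click."""
        return left_eye_open != right_eye_open

    print(pupil_to_cursor(320, 240))                            # roughly screen center
    print(is_click(left_eye_open=False, right_eye_open=True))   # wink -> True
    ```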

  • Northeastern University students develop eye controlled robotic arm that's happy to feed you

    by Alexis Santos
    05.24.2012

    As an alternative to receiving brain implants for robotic arm dominance assistance, check out this surprisingly cheap eye-tracking solution developed by six electrical engineering students at Northeastern University. Labeled iCRAFT, for eye Controlled Robotic Arm Feeding Technology, the award-winning senior project drew its inspiration from one team member's difficulty syncing spoonfuls with the eating pace of elderly and disabled patients. Simply gaze at the on-screen box that corresponds to your food or beverage choice and the robotic arm will swing your way with grub in its grip. Ambitious DIY-ers can chase down the open-sourced software behind iCRAFT, and construct a contraption of their own for about $900 -- considerably less than self-feeding rigs living in the neighborhood of $3,500. You can catch a video of the robot arm serving up some fine Wendy's cuisine after the break.

  • Intel drops $21 million for ten percent stake in eye-tracking firm Tobii

    by Donald Melanson
    03.16.2012

    Tobii has managed to impress quite a few folks with its eye-tracking technology -- most recently in the form of the "Eye Asteroids" arcade game -- and it looks like Intel has been paying particularly close attention to the company. As Computer Sweden reports, Intel (or Intel Capital, specifically) has now shelled out roughly $21 million to buy a ten percent stake in the Swedish company, which hopes to soon see its eye-tracking system used in everything from desktops and laptops to phones and even vehicles. Presumably, having Intel at the table will give it a considerable boost in those endeavors.

  • Tobii EyeAsteroids 3D lets you destroy virtual space stones with a gaze, we go eyes-on (video)

    by Zach Honig
    03.08.2012

    We've touched and tapped our way through a variety of gadgets at CeBIT, but it's the devices that operate without traditional user interfaces that have really grabbed our focus. Tobii was on hand to demonstrate its eye-tracking technology earlier this year at CES, but the company is peddling its wares here in Hannover as well, and we decided to drop by for a second look. This time, it's all about gaming, with EyeAsteroids drawing quite a bit of attention on the show floor. The demo pairs Tobii with a SeeFront glasses-free 3D panel for a fairly engaging extraterrestrial shootout. We weren't really sold on the glasses-free 3D, unfortunately, which provides the same unconvincing three-dimensional image from any angle, but Tobii was spot-on, letting us home in on those infamous space rocks to save our planet from destruction without even raising a finger.

    As with SeeFront's display, you're able to make visual selections from any angle (within reason) just as easily as you can from directly in front of the panel. There's a seconds-long calibration process each time you start the game, so Tobii can locate your eyes and pair your pupil orientation with a target on the screen. After that, it's open season -- you simply focus on an asteroid to destroy it, and you can add your name to the leaderboard and navigate menus as well, just as we saw with the Windows 8 demo back at CES. Is this the future of gaming? That remains to be seen, and while the eye-tracking seemed to work just as described, old school gamers will likely prefer tilting a joystick and (violently) tapping on arcade buttons. We still had a lot of fun playing without using our hands, though, as you'll see in our glare-filled demo just past the break.
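
    That seconds-long calibration is, at heart, fitting a mapping from raw tracker output to screen positions using a few known targets. A bare-bones sketch with a two-point linear fit appears below; real calibrations use more targets and a richer model, so treat this as illustrative only.

    ```python
    # Bare-bones gaze calibration sketch: the player looks at two known
    # targets (opposite corners), and we derive a per-axis scale and offset
    # that maps raw tracker readings to screen coordinates. Real systems
    # use more targets and a nonlinear model; this is only illustrative.

    def fit_axis(raw_a, raw_b, screen_a, screen_b):
        """Return (scale, offset) so that screen = scale * raw + offset."""
        scale = (screen_b - screen_a) / (raw_b - raw_a)
        offset = screen_a - scale * raw_a
        return scale, offset

    # Raw tracker readings recorded while the player fixated each corner target:
    raw_top_left, raw_bottom_right = (0.12, 0.08), (0.91, 0.88)
    sx, ox = fit_axis(raw_top_left[0], raw_bottom_right[0], 0, 1920)
    sy, oy = fit_axis(raw_top_left[1], raw_bottom_right[1], 0, 1080)

    def to_screen(raw):
        return sx * raw[0] + ox, sy * raw[1] + oy

    print(to_screen((0.5, 0.5)))   # a fixation near the middle of the panel
    ```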

  • Tobii Gazes into the future, sees you navigating Windows 8 with your eyes (video)

    by Joseph Volpe
    01.05.2012

    You may be waiting with bated breath for Microsoft to hurry up and release Windows 8 PCs and tablets to the masses, but before they get here, there might be a twist to the way you tweak 'em. Tobii Technology intends to demo its new mouse-free interface at CES this month -- dubbed Tobii Gaze -- that it hopes'll revolutionize the way we interact with devices. The gesture-based system incorporates eye-tracking to direct an on-screen pointer and works in conjunction with touchpad input for "fine-tuning." The company's hoping this new interface'll help you toss out that antiquated clicker and embrace the world of Minority Report. Hey, it's inevitable and you know it.
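
    The interaction model -- eye tracking puts the pointer roughly where you're looking, the touchpad refines it -- can be expressed as a two-stage pointer update. The sketch below uses assumed names and units; Tobii hasn't published the Gaze API here, so none of this is its actual interface.

    ```python
    # Sketch of the "gaze warps the pointer, touchpad fine-tunes" model:
    # touching the pad snaps the cursor to the current gaze estimate, then
    # pad deltas nudge it precisely. Names and units are illustrative only.

    class GazePointer:
        def __init__(self, width=1920, height=1080):
            self.width, self.height = width, height
            self.x, self.y = width // 2, height // 2

        def on_touchpad_contact(self, gaze_x: int, gaze_y: int):
            """Coarse stage: warp the pointer to wherever the user is looking."""
            self.x = min(max(gaze_x, 0), self.width - 1)
            self.y = min(max(gaze_y, 0), self.height - 1)

        def on_touchpad_move(self, dx: int, dy: int):
            """Fine stage: small corrections come from the touchpad, not the eyes."""
            self.x = min(max(self.x + dx, 0), self.width - 1)
            self.y = min(max(self.y + dy, 0), self.height - 1)

    p = GazePointer()
    p.on_touchpad_contact(1500, 300)   # user looks near a window's close button
    p.on_touchpad_move(-8, 4)          # then nudges the last few pixels by touch
    print(p.x, p.y)                    # 1492 304
    ```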

  • Tobii looks to keep you alert by detecting when you're tweet-driving (video)

    by Billy Steele
    11.30.2011

    Don't think you'll get caught checking the Duke score while cruisin' down the highway? Soon, you may have to think twice before hitting the scoreboard. Tobii has unveiled new technology for on-board driver safety systems that detects drowsiness and distraction. The platform builds on the company's eye-tracking tech to bolster automobile safety on the highways and byways. The system detects eyes of all shapes, sizes and colors -- without calibration -- even if the driver is wearing glasses or a pair of Ray-Ban shades. A constant stream of data communicates the driver's condition to the safety system, regardless of changes in the environment or whether the person behind the wheel takes a quick peek out the window. Tobii isn't looking to stop here either, as it says eye control of in-cabin infotainment systems is within reach. Perhaps this time next year, we'll be able to browse that Spotify collection with a series of blinks -- one can only hope.
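
    Tobii hasn't said how it scores drowsiness, but a common metric in driver-monitoring research is PERCLOS: the fraction of recent time the eyes have been mostly closed. A hedged sketch of that idea is below; the window length and threshold are generic research defaults rather than anything Tobii has disclosed.

    ```python
    # PERCLOS-style drowsiness sketch: track what fraction of the last minute
    # the driver's eyes were mostly closed and flag drowsiness past a threshold.
    # The 60 s window and 0.15 threshold are common research defaults, not Tobii's.
    from collections import deque

    class DrowsinessMonitor:
        def __init__(self, window_seconds=60.0, sample_rate_hz=30, threshold=0.15):
            self.samples = deque(maxlen=int(window_seconds * sample_rate_hz))
            self.threshold = threshold

        def add_sample(self, eye_openness: float):
            """eye_openness: 1.0 = fully open, 0.0 = fully closed."""
            self.samples.append(1 if eye_openness < 0.2 else 0)

        def perclos(self) -> float:
            return sum(self.samples) / len(self.samples) if self.samples else 0.0

        def is_drowsy(self) -> bool:
            return self.perclos() > self.threshold

    monitor = DrowsinessMonitor()
    for openness in [0.9] * 80 + [0.1] * 20:    # eyes closed for 20% of recent samples
        monitor.add_sample(openness)
    print(monitor.perclos(), monitor.is_drowsy())
    ```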

  • Utechzone Spring eye-tracking system hands-on (video)

    by Richard Lai
    06.06.2011

    In the midst of fiddling with tablets and laptops at Computex, we hadn't been thinking much about eye-tracking technologies until we saw Utechzone's booth. What we have here is the Spring, a TW$240,000 (US$8,380) eye-tracking rig that was launched in March 2010 and is aimed at users with limited mobility. The package consists of an LCD monitor, a computer, and an external sensor that utilizes infrared to track our pupils. Also included is an eye-friendly software suite that lets users play games, browse the web and media files, send emails, communicate with caretakers, and read PDF or TXT files. We had a go on the Spring and quickly learned how to control it with our eyes: much like the Xbox Kinect, in order to make a click we had to hover the cursor over (or fix our eyes on) a desired button until the former completes a spin. The tracking was surprisingly accurate, except we had to take off our glasses for it to work; that said, the other glasses didn't exhibit the same issue, so the culprit could just be some coating on our lenses. Another problem we found was that it only took a quick jiggle of our eyes to cancel the spinning countdown, so full concentration is required to use the Spring. This shouldn't be a problem outside a noisy event like Computex, anyway, and if you need more convincing, we were told that a disabled Taiwanese professor managed to hit 100,000 Chinese characters within three months using phonetic input on the same rig -- he's planning on releasing a new book soon. Have a look at our eyes-on video after the break for a better idea of how the Spring works.
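
    The selection mechanic described above -- fixing your eyes on a button until a spinning countdown completes, with any quick jiggle cancelling it -- is a classic dwell-click. A small sketch of that state machine follows; the dwell time and movement tolerance are arbitrary choices, not Utechzone's values.

    ```python
    # Dwell-click sketch: the gaze must stay within a small radius of where
    # the countdown started for the full dwell time; wandering outside that
    # radius (the "jiggle") resets the spin. Timing and tolerance are
    # illustrative, not Utechzone's actual values.
    import math

    class DwellClick:
        def __init__(self, dwell_seconds=1.5, tolerance_px=40.0):
            self.dwell_seconds = dwell_seconds
            self.tolerance_px = tolerance_px
            self.anchor = None
            self.elapsed = 0.0

        def update(self, gaze_x: float, gaze_y: float, dt: float) -> bool:
            """Feed one gaze sample; return True when a click should fire."""
            if self.anchor is None:
                self.anchor, self.elapsed = (gaze_x, gaze_y), 0.0
                return False
            if math.hypot(gaze_x - self.anchor[0], gaze_y - self.anchor[1]) > self.tolerance_px:
                self.anchor, self.elapsed = (gaze_x, gaze_y), 0.0   # jiggle: restart the spin
                return False
            self.elapsed += dt
            if self.elapsed >= self.dwell_seconds:
                self.anchor, self.elapsed = None, 0.0
                return True
            return False

    dwell = DwellClick()
    for _ in range(20):                       # ~2 s of steady fixation at 10 Hz
        if dwell.update(500.0, 300.0, dt=0.1):
            print("click")
    ```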

  • Hacked Kinect duo teams up with HD projector to make 360 Snowglobe display (video)

    by Sean Buckley
    05.19.2011

    Flatscreen displays? Decidedly old hat; students from Queen's University have a better idea: snowglobes. By hacking together a 3D HD projector, two Kinect sensors and a hemispherical mirror mounted inside an acrylic sphere, the "Project Snowglobe" team has created a pseudo-holographic display -- one presenting a 360-degree view of a digital object. The all-angles display is compelling, but it's strictly a single-user affair; the object isn't actually projected in 3D -- it instead follows the movements of a lone Earthling, rotating and shifting position in sync with the viewer. The display standard of the future? Maybe not, but pretty darn cool, all the same. Hit up the video after the break to check it out.