extremereality

Latest

  • Extreme Reality's Extreme Motion uses 2D webcams for 3D motion games, we go hands-on (update: video)

    by 
    Jon Fingas
    01.08.2013

    Extreme Reality's technology revolves around gestures, and its latest effort aims to bring that movement to the masses: its Extreme Motion developer kit turns just about any off-the-shelf webcam or built-in camera on common platforms, including Android, iOS and Windows, into an almost Kinect-like system capable of tracking 3D motion. Despite lacking depth cameras or other extra sensors, it's theoretically quite accurate -- the software tracks joints across the body in every frame, although it's not quite sensitive enough to track fingers. This author had the chance to make a fool of himself in front of a laptop's camera to see how well Extreme Motion works. In short, reasonably well: while it wasn't in perfect sync, it recognized this author's less-than-elegant moves in a Dance Central-style demo title and flagged whether a shimmy was right on target or evidence of two left feet. Of course, the experiment was conducted in a brightly-lit hotel ballroom, where conditions for body detection are ideal, so take the results with a grain of salt. It's still adept enough that developers with access to the (currently free) toolkit could produce motion games we'd be sincerely interested in playing. Update: Want to see Extreme Motion in action? If you're into seeing an Engadget editor expressing himself through the art of dance, a video demo awaits after the break. Michael Gorman contributed to this report.
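
    If you're wondering what developers actually do with that per-frame joint data, the gist is pose comparison. Here's a minimal, hypothetical Python sketch (the joint names and callback are our own assumptions, not Extreme Motion's actual API) of how a Dance Central-style title could score a tracked pose against the choreography's target pose on every frame.

        # Hypothetical sketch, not Extreme Motion's real API: score how closely a
        # tracked pose matches a target pose, assuming the SDK hands us a dict of
        # joint name -> (x, y, z) positions for every camera frame.
        import numpy as np

        JOINTS = ["head", "l_shoulder", "r_shoulder", "l_elbow", "r_elbow",
                  "l_hand", "r_hand", "hip", "l_knee", "r_knee"]

        def pose_score(tracked, target, tolerance=0.15):
            """Return the fraction of joints that land within tolerance of the target."""
            hits = []
            for joint in JOINTS:
                if joint in tracked and joint in target:
                    error = np.linalg.norm(np.subtract(tracked[joint], target[joint]))
                    hits.append(error <= tolerance)
            return float(np.mean(hits)) if hits else 0.0

        def on_frame(tracked_pose, target_pose):
            # Per-frame verdict: was that shimmy on target, or two left feet?
            return "on target" if pose_score(tracked_pose, target_pose) >= 0.8 else "two left feet"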

  • Hands-on with wireless, ultrasonic stylus and touchless gestures at MWC (video)

    by 
    Zachary Lutz
    03.01.2012

    This isn't the first time you've heard of EPOS or XTR, but it's been quite some time since we've checked in with either outfit. So, imagine our surprise when we stumbled on new developments from each company while perusing the Texas Instruments booth at MWC. In the case of EPOS, we're shown a stylus that, in addition to offering traditional physical touch input, also lets users interact with a device via ultrasound. The system is built on TI's OMAP4 platform and requires that four microphones be placed at the corners of the screen. In this demonstration, we're shown how users can manipulate objects along the Z-axis of a 3D scene by pulling the pen away from the display. Next, we're shown a new application for the touchless gesturing system that XTR debuted back in 2010. In this scenario, it's demonstrated how tablet owners could use the front-facing camera (at merely QVGA resolution) to flip through pages of a cookbook without worrying about getting ingredients on the device. The concept software was developed by a French outfit known as Stonetrip, and it also allows users to zoom and pan through the pages. You'll find demonstrations of each technology in a video after the break.
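
    For the curious, the geometry behind the pen tracking is classic multilateration: each ultrasound arrival time converts to a distance, and the pen sits at the point consistent with all four corner microphones. The Python sketch below is purely our own illustration; the microphone layout, speed-of-sound constant and least-squares solver are assumptions, not EPOS's implementation.

        # Illustrative multilateration sketch (not EPOS's algorithm): recover the
        # pen tip's 3D position from ultrasound time-of-flight to four microphones
        # placed at the corners of the screen.
        import numpy as np
        from scipy.optimize import least_squares

        SPEED_OF_SOUND = 343.0  # metres per second at room temperature

        # Assumed layout: mics at the corners of a ~24 x 16 cm display, in metres.
        MICS = np.array([[0.00, 0.00, 0.0],
                         [0.24, 0.00, 0.0],
                         [0.00, 0.16, 0.0],
                         [0.24, 0.16, 0.0]])

        def locate_pen(times_of_flight):
            """Estimate the pen's (x, y, z) from four arrival times in seconds."""
            distances = SPEED_OF_SOUND * np.asarray(times_of_flight)

            def residuals(p):
                return np.linalg.norm(MICS - p, axis=1) - distances

            # Start above the screen centre; z grows as the pen is pulled away
            # from the display, which is what drives the Z-axis interaction.
            return least_squares(residuals, np.array([0.12, 0.08, 0.05])).x

        # Example: a pen hovering 5 cm above the centre of the screen.
        true_pos = np.array([0.12, 0.08, 0.05])
        tof = np.linalg.norm(MICS - true_pos, axis=1) / SPEED_OF_SOUND
        print(locate_pen(tof))  # roughly [0.12, 0.08, 0.05]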

  • TI and XTR team up on touchless gesturing system for mobile devices

    by 
    Donald Melanson
    09.15.2010

    We've seen a few examples of touchless, gesture-based interfaces for mobile devices, but it looks like Texas Instruments might be closer than most to making them a reality -- it's just announced a partnership with Extreme Reality (also known as XTR) on a new gesture engine and framework designed specifically for its OMAP 4 platform. The two companies actually showed off such a system back at MWC earlier this year (check out a demo of it after the break), but they've only just now made the partnership official, and they're promising plenty more advancements to come -- including the ability to recognize not only simple gestures, but also whole-body movements and two-handed gestures. Head on past the break for the complete press release.
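
    As a rough illustration of what a "simple gesture" looks like to this kind of camera-based system, and emphatically not XTR's actual engine, here's a short Python sketch that detects a left or right swipe by watching where the motion between consecutive low-resolution frames drifts horizontally.

        # Generic swipe detection via frame differencing; all thresholds are
        # illustrative assumptions, not values from TI or XTR.
        import numpy as np

        def motion_centroid(prev_frame, frame, threshold=25):
            """Horizontal centre of motion between two grayscale frames (0..1),
            or None if almost nothing moved."""
            diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
            moving = diff > threshold
            if moving.sum() < 50:         # too little motion to be a gesture
                return None
            cols = np.nonzero(moving)[1]  # column indices of moving pixels
            return cols.mean() / frame.shape[1]

        def classify_swipe(centroids, min_travel=0.4):
            """Given per-frame motion centroids, report a swipe once the motion
            has swept far enough across the field of view."""
            xs = [c for c in centroids if c is not None]
            if len(xs) < 3:
                return None
            travel = xs[-1] - xs[0]
            if travel > min_travel:
                return "swipe right"
            if travel < -min_travel:
                return "swipe left"
            return None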