openni
USC starts a web hub for DIY, open source virtual reality projects
For the sheer variety of virtual reality headsets available, there have been few resources for those who want to craft their own devices. USC wants to save us the effort of searching around. Its MxR Lab has just launched a showcase of creations and modifications that DIY enthusiasts can build, including open source code for the devices as well as for integrating full-body motion control through Kinect for Windows or OpenNI. The most ambitious is Socket HMD, a complete 1,280 x 800 headset that involves a 3D-printed shell and custom-assembled electronics. If your own ambitions don't stretch that far, you can still build the VR2GO viewer, which uses an iPhone or iPod touch as the eyepiece, as well as mods for the Oculus Rift developer kit that add stereo cameras or increase the field of view. Yes, you'll need a 3D printer and a knack for programming to get most of these projects going, but you won't have to wait for someone to make them for you -- a big help when many ready-made VR displays are either in development or priced out of reach for the average person.
ASUS Wavi Xtion motion sensing control system demoed at CES (video)
ASUS may not be anywhere close to ready for its Wavi Xtion to hit retail shelves (we're hearing Q2 of 2012), but that didn't stop our brethren over at Engadget Spanish from getting a hands-on demonstration at CES. We'll spare you the details on how it works, but in practice, we learned that it's quite similar to Kinect. Not shocking considering that PrimeSense is behind both boxes, but the primary difference seemed to be reaction time: ASUS' solution wasn't quite as snappy as the Kinect, being slower to recognize and translate motions in testing. Of course, we wouldn't expect a product that's 18 months out from mass production to be completely on top of its game, but feel free to head on past the break to see exactly what we mean.
Kinect hack turns you into a punching, waving MIDI controller (video)
If you're looking for an awesome, impractical way to make music with your computer (and who isn't?) please direct your attention to the following Kinect hack. Shinect, the brainchild of a YouTube user named Shinyless, uses motion detection to turn you into a MIDI controller! The current implementation gives the operator two virtual pads that can be activated by the old Jersey Shore fist pump -- and if that ain't enough, the sounds can be pitch-shifted by raising or lowering the other arm. Pretty sweet, huh? This thing uses OpenNI, and while he's demonstrating it with FruityLoops, it should work with any MIDI device. Things are pretty rough and ready at the moment, although he promises big things in the future. In the meantime, check out the proof-of-concept in the video after the break.
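To give a feel for how a hack like this fits together, here's a rough, hypothetical Python sketch of the same mapping: hand positions from a skeleton tracker trigger two virtual pads and bend pitch. The get_joints() helper is a made-up stand-in for whatever OpenNI/NITE skeleton data the real project reads, and the MIDI side uses the mido library -- none of this is Shinect's actual code.

```python
# Hypothetical sketch only -- not Shinect's actual code. Hand positions from a
# skeleton tracker drive two virtual pads and a pitch-bend lane. get_joints()
# stands in for whatever OpenNI/NITE data the real hack reads; MIDI output uses
# the mido library (pip install mido python-rtmidi).
import time
import mido

PAD_LEFT, PAD_RIGHT = 36, 38   # MIDI notes for the two virtual pads
PUMP_THRESHOLD = 0.25          # metres of upward hand travel that counts as a pump

def get_joints():
    """Hypothetical: return {'right_hand': (x, y, z), 'left_hand': (x, y, z), 'torso': (x, y, z)}."""
    raise NotImplementedError("wire this up to your skeleton-tracking source")

def run(port_name="Shinect sketch"):
    port = mido.open_output(port_name, virtual=True)  # virtual port your DAW can see
    prev_right_y = None
    while True:
        joints = get_joints()
        rx, ry, _ = joints["right_hand"]
        tx, ty, _ = joints["torso"]
        # A sharp upward jump of the right hand is treated as a fist pump; which
        # pad fires depends on which side of the torso the hand is on.
        if prev_right_y is not None and ry - prev_right_y > PUMP_THRESHOLD:
            note = PAD_LEFT if rx < tx else PAD_RIGHT
            port.send(mido.Message("note_on", note=note, velocity=110))
            port.send(mido.Message("note_off", note=note, velocity=0))
        prev_right_y = ry
        # Left hand height above the torso (clamped to 0..1 m) maps to pitch bend.
        lift = max(0.0, min(1.0, joints["left_hand"][1] - ty))
        port.send(mido.Message("pitchwheel", pitch=int(lift * 2 * 8191) - 8191))
        time.sleep(0.02)  # poll at roughly 50 Hz
```

Because the output is plain MIDI messages, this kind of setup isn't tied to FruityLoops -- anything that accepts MIDI input could sit on the other end of the virtual port.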
DIY telepresence robot uses PrimeSense Kinect drivers for extremely awkward push-ups (video)
From enhancing your WoW game to putting you in Tom Hanks's shoes, DIYers the world o'er really do seem to love Kinect. And what do we have here? Taylor Veltrop's Veltrobot remote telepresence 'bot uses the PrimeSense open source Kinect drivers for tracking the user's skeleton, with a modified Kondo KHR-1HV mirroring the operator's movements (which are received via 802.11n WiFi). Right now he is only controlling the arms, but with any luck we should be seeing complete control over all the robot's movements soon enough. Once the thing is finalized, Veltrop plans on releasing an open source development kit. And then? That's right: robot avatars for everyone!
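Veltrop's dev kit isn't out yet, but the core of skeleton mirroring is easy to picture: read joint positions from the tracker, convert them into joint angles, and ship those angles to the robot's servos over the network. Below is a hedged Python sketch of that idea; the joint data, the robot's address and the channel-plus-angle packet format are all illustrative assumptions, not the Veltrobot's actual protocol.

```python
# Rough sketch of arm mirroring: compute an elbow bend angle from three tracked
# joints and send it to the robot over WiFi. The joint values and the UDP packet
# format are hypothetical -- the real Veltrobot/Kondo setup uses its own protocol.
import math
import socket
import struct

def elbow_angle(shoulder, elbow, wrist):
    """Angle at the elbow, in degrees, from three (x, y, z) joint positions."""
    a = [s - e for s, e in zip(shoulder, elbow)]
    b = [w - e for w, e in zip(wrist, elbow)]
    dot = sum(x * y for x, y in zip(a, b))
    mag = max(math.dist(shoulder, elbow) * math.dist(wrist, elbow), 1e-9)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / mag))))

def send_servo(sock, addr, channel, degrees):
    # Hypothetical wire format: one byte of servo channel plus a float32 angle.
    sock.sendto(struct.pack("!Bf", channel, degrees), addr)

# Usage sketch: in practice the joints would come from the OpenNI skeleton tracker.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
robot = ("192.168.1.50", 9000)  # assumed robot address on the local network
shoulder, elbow, wrist = (0.0, 1.4, 2.0), (0.2, 1.2, 2.0), (0.4, 1.3, 2.0)
send_servo(sock, robot, channel=3, degrees=elbow_angle(shoulder, elbow, wrist))
```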
PrimeSense and ASUS team up, bring Kinect-like Wavi Xtion to your living room TV (update)
PrimeSense provides some of the brains behind Microsoft's Kinect, and wants a bigger piece of the pie; ASUS has a reputation for announcing wonderfully wacky peripherals every year. At CES 2011, the Wavi Xtion will check off both boxes nicely. In a nutshell, the Xtion is a PrimeSense 3D depth camera built exclusively for PC, but with an important twist -- it connects to an ASUS Wavi box, which wirelessly streams its data to your living room HTPC over the 5GHz band. Oh, and should ASUS attract enough developers, it will even pull down applications from an Xtion online store. ASUS says we'll see the package commercially available around the world in Q2 of next year -- with a UI and selection of apps and games on board -- but it'll release an Xtion PRO developer kit in February to tempt all you Kinect hackers into coding magical things for the platform. No more details for now, but there's an event in Vegas this week where ASUS is all but guaranteed to show it off. PR after the break.

Update: Did we say HTPC? Turns out it doesn't quite work that way -- the Wavi is actually a pair of boxes that wirelessly sling data between them. You put the Xtion sensor on top of your TV, connect it to Wavi #1, then plug Wavi #2 into a PC up to 25 meters away. Mind you, it looks like the Xtion may not be quite as capable as Microsoft's unit, as there's only infrared hardware inside -- it might be fine for gesture control, but don't expect any augmented reality lightsaber fights. See some mockups below!
KinEmote: Kinect gesture control for Boxee and XBMC media centers now available (video)
We've seen plenty of Kinect hacks over the last few weeks -- trouble is, beyond the initial wow factor they're just not very useful on a daily basis. That situation just changed, however, with the release of KinEmote, a free public beta that lets Windows users navigate XBMC and Boxee menus using nothing but hand gestures. Better yet, the software is built around OpenNI and NITE middleware from PrimeSense, the company behind the Project Natal reference gear. It certainly looks impressive in the video after the break -- good enough that we suspect many of you will hit up the source link below instead of finishing up your last-minute holiday shopping. Hey, Santa can wait, this is progress!
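We don't know exactly how KinEmote does it under the hood, but the basic trick -- turning a quick horizontal hand swipe into the arrow-key presses XBMC and Boxee already understand -- is straightforward to sketch. The snippet below is purely a hypothetical illustration: the hand-position helper is a stand-in for OpenNI/NITE hand tracking, and key presses go through the pyautogui library rather than whatever KinEmote actually uses.

```python
# Not KinEmote's code -- a sketch of the idea: fast horizontal hand travel
# becomes a left/right arrow-key press that the media center already handles.
# get_hand_x() is a hypothetical stand-in for OpenNI/NITE hand tracking;
# key presses use pyautogui (pip install pyautogui).
import time
import pyautogui

SWIPE_DISTANCE = 0.20   # metres of horizontal travel that counts as a swipe
SWIPE_WINDOW = 0.4      # seconds in which that travel must happen

def get_hand_x():
    """Hypothetical: current hand x position in metres from the tracker."""
    raise NotImplementedError("wire this up to your hand-tracking source")

def watch_for_swipes():
    history = []  # recent (timestamp, x) samples
    while True:
        now, x = time.time(), get_hand_x()
        history.append((now, x))
        history = [(t, hx) for t, hx in history if now - t < SWIPE_WINDOW]
        travel = x - history[0][1]
        if travel > SWIPE_DISTANCE:
            pyautogui.press("right")   # swipe right -> move selection right
            history.clear()
        elif travel < -SWIPE_DISTANCE:
            pyautogui.press("left")    # swipe left -> move selection left
            history.clear()
        time.sleep(0.03)
```

Since both media centers can already be driven entirely from the keyboard, mapping a handful of gestures to arrow keys and Enter goes a long way toward couch-friendly control.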
PrimeSense's Tamir Berliner on the future of natural interaction
Many gamers might not know it, but 2010 has been a big year for PrimeSense, and it's thanks to Kinect. The depth sensor might be a Microsoft product, but there's plenty of PrimeSense tech inside making it tick. For a company devoted to natural interaction (NI) interfaces, it must be pretty gratifying to see one of the first major NI devices selling over 2 million units in its first month of availability. Kinect, however, is just the beginning for PrimeSense. Earlier this month, the company helped found OpenNI, a not-for-profit organization dedicated to promoting "the compatibility and interoperability of Natural Interaction (NI) devices, applications and middleware." So far, the organization has released the OpenNI Framework, including open source drivers and skeleton tracking middleware for NI devices. Although the software was created to support PrimeSense's own 3D sensor development kit, the community quickly (and unsurprisingly) adapted it to work with Kinect as well. We recently spoke with PrimeSense's Tamir Berliner about the creation of OpenNI. As might be expected, he foresees a bright future for natural interaction.
PrimeSense's OpenNI provides the best Kinect drivers yet, from someone who would know
We've been so wrapped up in Kinect hacks lately that we actually missed a Kinect non-hack that emerged last week. PrimeSense, which built the initial Project Natal reference hardware for Microsoft, has released its own open source drivers for the Kinect. PrimeSense is working with Willow Garage (best known for its open source ROS robot operating system) and Side-Kick (a motion gaming startup) through a new OpenNI organization it set up, and the trio will be combining their powers for good. The OpenNI framework will cover low-level hardware support (drivers for actual cameras and other sensors) and high-level visual tracking (turning your body into a 3D avatar that kicks ass in a virtual world). This should be a boon to an already vibrant Kinect hacking community, and if the video above is any indication, we aren't far from Kinect-level interaction and gameplay on our lowly PCs. [Thanks to everyone who sent this in]
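To make that low-level / high-level split concrete, here's a small, purely illustrative Python sketch of the layering -- these are not the real OpenNI calls, just the shape of the stack: a driver layer exposes depth frames, middleware (NITE's role) turns them into skeleton joints, and the application only ever talks to the top layer.

```python
# Illustrative layering only -- not the actual OpenNI API. A sensor driver hands
# depth frames upward, tracking middleware turns them into joints, and the app
# consumes joints without caring which camera sits underneath.
from typing import Dict, Tuple

Joint = Tuple[float, float, float]  # x, y, z position in metres

class DepthSensor:
    """Low-level layer: whatever driver exposes raw depth frames (Kinect, Xtion, ...)."""
    def read_frame(self) -> bytes:
        raise NotImplementedError("provided by the sensor driver")

class SkeletonTracker:
    """High-level layer: turns depth frames into per-joint positions for a user."""
    def __init__(self, sensor: DepthSensor):
        self.sensor = sensor

    def update(self) -> Dict[str, Joint]:
        frame = self.sensor.read_frame()
        # In the real stack, NITE-style middleware does the heavy lifting here.
        raise NotImplementedError("provided by the tracking middleware")

def application_loop(tracker: SkeletonTracker) -> None:
    """Application layer: a game or hack sees joints, never hardware details."""
    joints = tracker.update()
    print("right hand at", joints.get("right_hand"))
```

Swap in a different DepthSensor implementation and the application code doesn't change -- which is essentially the interoperability pitch OpenNI is making.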
PrimeSense releases open source drivers, middleware that work with Kinect
The global hacking community has already done an admirable job of exploiting the technology inside Kinect, but now would-be motion control designers can get the tech straight from the source. PrimeSense, the company that created the motion-sensing tech inside each Kinect, has released open source drivers that will work either with Kinect or its own dev kit, which Develop notes is "smaller and lighter" than a Kinect unit. PrimeSense has partnered with two other companies to create OpenNI, a not-for-profit organization set up to "promote the compatibility and interoperability of Natural Interaction (NI) devices, applications and middleware." The drivers are available on the OpenNI website, as is the NITE motion tracking middleware. OpenNI binaries are available for both Windows and Ubuntu. With some pretty amazing Kinect projects already out there, we can't wait to see what comes of this officially backed software release. Check out a quick demonstration of the software's skeleton tracking capabilities after the break.