The big problem, Leap Motion argues, is that traditional game engines weren't designed with human hands in mind. We move in sudden, unpredictable ways, gripping objects with varying degrees of dexterity. When you pick up a sponge, for instance, it should flex and compress where your fingers press into it. These nuances are difficult to track and simulate in VR. Push a rubber ball against the floor and most physics engines will be overwhelmed, sending the sphere flying off in some unrealistic direction. The Interaction Engine addresses this by implementing "an alternate set of physics rules" that kick in whenever your hands are touching, or "inside," a virtual object.
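To make the idea concrete, here's a minimal sketch (not Leap Motion's actual code) of what "an alternate set of physics rules" could look like in a one-dimensional toy simulation: while the hand overlaps the object, normal collision impulses are suspended and the object simply tracks the hand, instead of receiving the violent corrective forces that send it flying. The `Ball` class and `step` function are hypothetical names for illustration.

```python
from dataclasses import dataclass

@dataclass
class Ball:
    x: float  # position along one axis (floor at x = 0)
    v: float  # velocity

def step(ball, hand_x, grabbing, dt=0.016):
    """One physics tick. When the hand overlaps the ball, switch to an
    alternate rule set instead of resolving collisions with impulses."""
    if grabbing:
        # Alternate rules: no collision impulses; the object follows
        # the hand, so pressing it into the floor can't launch it.
        ball.v = (hand_x - ball.x) / dt
        ball.x = hand_x
    else:
        # Normal rules: gravity plus a lossy bounce off the floor.
        ball.v += -9.8 * dt
        ball.x += ball.v * dt
        if ball.x < 0:
            ball.x = 0
            ball.v = -0.5 * ball.v
    return ball
```

The point of the two branches is the one the company describes: rather than letting the solver fight an interpenetrating hand, the engine temporarily changes the rules so the held object behaves solidly.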
"This makes it possible to pick things up and hold them in a way that feels truly solid," the company said in a blog post. "It also uses a secondary real-time physics representation of the hands, opening up more subtle interactions." The Unity extension is currently in beta, and works best with objects that are one to two inches in size. That might sound rather restrictive -- especially for games with high-fantasy weapons -- but it's a start towards accommodating our fingers and thumbs in VR. Leap Motion says the software is highly customizable too, with settings that govern how objects are thrown and how they collide with one another.
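A tunable throwing setting of the kind Leap Motion describes might work something like the hypothetical sketch below (the class and parameter names are illustrative, not the extension's actual API): the release velocity is estimated from a short window of recent hand velocities, scaled by a developer-adjustable multiplier.

```python
from collections import deque

class ThrowHandler:
    """Hypothetical sketch of a customizable throw: on release, the
    object's velocity is the average of recent hand-velocity samples,
    scaled by a tunable multiplier."""

    def __init__(self, window=5, multiplier=1.2):
        self.samples = deque(maxlen=window)  # recent hand velocities
        self.multiplier = multiplier         # developer-tuned setting

    def record(self, hand_velocity):
        """Call once per frame while the object is held."""
        self.samples.append(hand_velocity)

    def on_release(self):
        """Return the velocity to apply to the released object."""
        if not self.samples:
            return 0.0
        avg = sum(self.samples) / len(self.samples)
        return avg * self.multiplier
```

Averaging over a few frames smooths out tracking jitter at the moment of release, which is one plausible reason such behavior would be exposed as a setting rather than hard-coded.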
Even with this new engine, the Leap Motion is a niche proposition. Few people have high-end VR headsets at the moment, and even fewer have bought a Leap Motion controller too. Still, it's a unique sensor, and one the company hopes will one day be officially adopted by headset makers. If that happens, and virtual reality takes off, Leap Motion will finally have a viable business model. Until then, however, the company has to keep showcasing what its sensor is capable of, and the unique experiences it can provide in VR.