Taking next-gen augmented reality for an ARM-powered walk around the block

We know what you're thinking, because we initially thought it too, but this isn't your average AR. With the help of chip designer ARM, a number of developers are building a new type of augmented reality that is altogether more powerful than the usual sprite-on-a-surface routine. Instead of requiring well-lit, artificial and often indoor surfaces and markers, this new technology sucks every ounce of juice from a smartphone's processor in order to recognize, track and augment real-world 3D objects like people and buildings. It's still at an early stage and far from being practical, but the exclusive videos after the break ought to prove that this approach has potential. In fact, it's arguably what augmented reality should have been in the first place. Read on for more.



To be fair, traditional AR can still be entertaining. Reality Fighters on the PS Vita, Sony SmartAR and even virtual pico projectors certainly do no harm to the AR dream. But those games and apps generally share the same fundamental limitation: they only track a totally flat surface or a deliberately placed marker. Since neither of these things normally exists in the real world, it's hard to see how the reality part of the AR moniker applies. Contrast that with the video below, in which Dr. Himane from developer Metaio shows how the built environment can be augmented using "gravity descriptors." It represents the next phase in Metaio's work on the Augmented City, which began as a traditional 2D platform but is now being upgraded to support 3D markerless tracking in its next SDK, due by the end of this quarter.
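
Metaio hasn't shared the internals of its SDK, but a rough illustration of what a "gravity descriptor" might involve goes like this (our sketch, not Metaio's code): the phone's accelerometer gives a gravity direction, which can be projected into the camera image at each detected feature point and used as that feature's reference orientation, so descriptors stay aligned with the real-world vertical rather than with whatever way the phone happens to be tilted. That suits upright targets like building facades. The C sketch below shows the projection step; every name and number in it is hypothetical.

```c
/*
 * Illustrative sketch only: one way gravity-aware feature orientation can
 * work, assuming the accelerometer's gravity vector has already been rotated
 * into camera coordinates. Not Metaio SDK code; all names are hypothetical.
 */
#include <math.h>
#include <stdio.h>

typedef struct { double x, y; } Vec2;                  /* pixel coordinates     */
typedef struct { double x, y, z; } Vec3;               /* camera-space vector   */
typedef struct { double fx, fy, cx, cy; } Intrinsics;  /* pinhole camera model  */

/* Back-project a pixel onto a 3D ray at unit depth. */
static Vec3 backproject(Intrinsics k, Vec2 px)
{
    Vec3 p = { (px.x - k.cx) / k.fx, (px.y - k.cy) / k.fy, 1.0 };
    return p;
}

/* Project a camera-space point back onto the image plane. */
static Vec2 project(Intrinsics k, Vec3 p)
{
    Vec2 px = { k.fx * p.x / p.z + k.cx, k.fy * p.y / p.z + k.cy };
    return px;
}

/*
 * Orientation assigned to a keypoint by projecting the gravity direction into
 * the image at that point, instead of relying on local image gradients.
 * Descriptors built relative to this angle stay tied to the real-world
 * vertical, which helps when matching against upright structures.
 */
static double gravity_orientation(Intrinsics k, Vec2 keypoint, Vec3 gravity_cam)
{
    Vec3 p0 = backproject(k, keypoint);
    Vec3 p1 = { p0.x + 0.05 * gravity_cam.x,   /* small step along gravity */
                p0.y + 0.05 * gravity_cam.y,
                p0.z + 0.05 * gravity_cam.z };
    Vec2 a = project(k, p0);
    Vec2 b = project(k, p1);
    return atan2(b.y - a.y, b.x - a.x);        /* angle in the image plane */
}

int main(void)
{
    Intrinsics k = { 600.0, 600.0, 320.0, 240.0 };  /* made-up VGA camera      */
    Vec2 kp = { 410.0, 130.0 };                     /* some detected corner    */
    Vec3 g  = { 0.1, 0.98, 0.15 };                  /* gravity in camera frame */
    printf("descriptor orientation: %.3f rad\n", gravity_orientation(k, kp, g));
    return 0;
}
```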


Metaio's software will run on an iPhone 4 or 4S, but the company's CEO, Thomas Alt, told us that the Samsung Exynos pairing of ARM's Cortex-A9 CPU and a Mali 400 GPU, found in the Galaxy Note and GSII, offers "better quality and less power consumption." This is at least partly because ARM sees augmented reality as a way of showcasing the capability of its chips, so it's been helping Metaio and other developers to optimize their code.

The next hands-on video below shows a couple more examples of this optimization. The first doesn't concern tracking, but instead reveals how much GPU rendering capability is left unused by most current software. Whereas a top-end game -- the video shows a title called Eon Sky -- might display just 40,000 triangles per frame, we look at a diamond ring created by an AR firm called Holition that renders smoothly with 105,000 triangles and represents the real limit of the Mali 400.
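
For a rough sense of the headroom involved: assuming both demos run at a typical 30 frames per second (our assumption, since frame rates aren't quoted), 40,000 triangles per frame works out to about 1.2 million triangles per second, while the 105,000-triangle ring pushes roughly 3.2 million per second through the same GPU.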

The other example in the video, from Olaworks, is admittedly bare bones and looks much like those cheap and cheerful face-altering AR titles that you'd find in any local app store. However, it highlights one of the more exciting aspects of 3D markerless tracking: the ability to augment people who are moving around naturally, even when they're not staring right into the camera. Look past the Donald Duck face transplant and you'll catch a glimpse of a crazy future in which we can pick people out of a crowd and see their social networking statuses and other information visually overlaid onto them -- at least when we look at them through whatever smartphones or goggles happen to become popular over the next few years.


There is a downside to all this, though, and that's battery drain. Both phones used in the demo became hot to the touch -- even considering that the Galaxy Note runs warm anyway. The same applies to the Augmented City, with Metaio's boss admitting to us that running 3D markerless tracking can burn through a full battery in as little as 30 minutes. ARM acknowledges that this is one of the biggest barriers to next-gen AR, but it told us that reduced power consumption is one of the major improvements we'll see in the Mali T658 and T604 GPUs, which will interconnect better with a mobile's CPU in order to spread the burden of compute tasks and accomplish them more efficiently.
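
The "spread the burden" part is essentially a pitch for GPU compute: ARM has said its Mali-T600 series GPUs support general-purpose compute APIs such as OpenCL, so stages of a vision pipeline that currently hammer the CPU can be handed to the graphics hardware instead. As a hypothetical illustration of what that hand-off looks like to a developer (a minimal sketch under our own assumptions, not ARM's or Metaio's code), here's an OpenCL host program that pushes a trivial RGBA-to-grayscale pass -- the kind of per-frame preprocessing a tracker might do -- onto whatever GPU the platform exposes.

```c
/*
 * Hypothetical sketch of offloading per-pixel work to the GPU via OpenCL.
 * Error handling and resource cleanup are trimmed for brevity; this is not
 * code from ARM or Metaio.
 */
#define CL_TARGET_OPENCL_VERSION 120
#include <stdio.h>
#include <stdlib.h>
#include <CL/cl.h>

static const char *kernel_src =
    "__kernel void to_gray(__global const uchar4 *rgba,              \n"
    "                      __global uchar *gray)                     \n"
    "{                                                               \n"
    "    size_t i = get_global_id(0);                                \n"
    "    uchar4 p = rgba[i];                                         \n"
    "    gray[i] = (uchar)((p.x * 77 + p.y * 150 + p.z * 29) >> 8);  \n"
    "}                                                               \n";

int main(void)
{
    enum { W = 640, H = 480, N = W * H };
    cl_uchar *rgba = malloc(N * 4);   /* dummy camera frame */
    cl_uchar *gray = malloc(N);
    for (int i = 0; i < N * 4; i++) rgba[i] = (cl_uchar)(i & 0xFF);

    cl_platform_id platform;
    cl_device_id device;
    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, device, 0, NULL);

    /* Copy the frame to the device and make room for the result. */
    cl_mem in  = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                                N * 4, rgba, NULL);
    cl_mem out = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, N, NULL, NULL);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &kernel_src, NULL, NULL);
    clBuildProgram(prog, 1, &device, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "to_gray", NULL);

    clSetKernelArg(k, 0, sizeof(cl_mem), &in);
    clSetKernelArg(k, 1, sizeof(cl_mem), &out);

    /* One work-item per pixel; the GPU chews through them in parallel. */
    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, out, CL_TRUE, 0, N, gray, 0, NULL, NULL);

    printf("first gray pixel: %u\n", (unsigned)gray[0]);
    return 0;
}
```

The broad idea is that finishing the same work on wide, comparatively low-clocked GPU hardware can cost less energy per frame than grinding through it on the CPU, which is the kind of efficiency gain ARM is promising from the tighter CPU-GPU coupling in its next-generation parts.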