
Augmented reality hardware is still way ahead of its software

But the hardware still needs some work, too.

Augmented reality -- using a device's camera to overlay information on the real world -- is a concept that's been around for years. But the failure of Google Glass, combined with the huge hype around virtual reality players like Oculus over the last few years, has dampened enthusiasm for the field a bit, relegating it to second-player status. Microsoft has reignited interest in AR thanks to HoloLens, but the demos I saw at this year's Augmented World Expo made me think there's still a lot of work to be done. Particularly on the software side.

Unlike VR, which has the huge tentpoles of gaming and entertainment to rely on for consumer content, AR has a fuzzier consumer use case. Probably the most useful consumer scenario for AR that I saw was at Epson's booth, where the company was showing off its recently announced Moverio BT-300 augmented reality glasses. A spokesperson told me there's been a lot of interest in using the Moverio while piloting drones: you can watch the drone's flight path and simultaneously see what its camera sees through the glasses' built-in screen.

Beyond that, though, much of what I heard about was business-focused. That's not a huge surprise, given that all the rumors point to Google remaking Glass into a device focused on work, not play. Right now it seems like enterprise is the most obvious place for AR companies to find success. The idea of providing workers in the field with the info they need, without tying up their hands, is compelling.

But nearly everything I tried at the Augmented World Expo felt like a half-baked demo. Part of that is simply due to the venue; I can't very well diagnose a mechanical problem with an engine and fix it by following instructions beamed to my headset. But even in controlled demos, most of what I tried to do with these headsets felt too fiddly, unfinished and just plain difficult to execute.

The best demos I saw were ones where I didn't really need to interact with what I saw on my headset. A company called Librestream is using AR technology to let a remote expert see what a worker out in the field sees and beam notes back and forth. For example, I could be looking at a data center server while a remote employee snaps an image from my headset's camera and sends it back to me with annotations about what I need to fix and what I should ignore. That demo didn't rely on any tiny headset-mounted controls or gestures to make sense, and it was better for it.

I had less success with Augmenta's "smart panel" software. The idea is clever enough: look at a plain panel with what amount to QR codes around it, and you'll see a custom set of controls overlaid on it. The demo I tried had the panel wired up to a robotic arm, with virtual controls for moving the arm placed on the previously blank panel. Unfortunately, actually touching and activating those controls was an imprecise process, to say the least. It's certainly less effective than using actual hardware controls.

Medicine is another big potential market for AR, and one demo I tried used a pair of glasses to make a minor surgical procedure easier to perform. With one hand, I guided an ultrasound probe over a pretty gross bowl of something both solid and watery; in my other hand, I held a long needle meant for extracting a biopsy sample. In my glasses, I could see the precise path the needle would take overlaid on the ultrasound slice, letting me guide it in and out, hitting the fake tumor in one shot. It's a hard experience to put into words, but it shows the potential for adding valuable information through AR.

The key word here is potential: Almost everything I saw felt like it had a lot of potential, but the software just wasn't living up to it yet. It felt like bleeding-edge, hacky stuff. That's fine in a vacuum, but for the technology to move forward, someone (like Microsoft) is really going to have to release some killer apps.

The other side of the equation is hardware, which feels much further ahead than the software. Both Epson and ODG are offering hardware with fairly impressive displays, from both a resolution and an image-quality perspective. ODG in particular was showing off a prototype with dual 1080p displays running at 60fps, and the effective screen "size" when wearing the headset felt a lot larger than most of the competition's.

For its part, Epson's Moverio BT-300 is significantly smaller than earlier versions, and its screen looked impressive in the quick video demo I saw. But even the newer hardware seemed like a work in progress -- lots of headsets felt like they were going to overheat, or the battery life wasn't good enough to handle the rigorous demo environment.

A day spent trying out various AR demos was like taking a trip through one of technology's wilder frontiers. There's a host of companies working on hardware, software, components, developer platforms and more -- and there are tons of partnerships across the board intertwining them all. How it'll all shake out remains to be seen, but it's likely that Microsoft will play the role of Oculus here and blaze a trail for smaller companies to follow. And it's increasingly likely that a focus on business will be where AR finds its niche -- but before businesses start investing serious cash in this technology, the whole experience, from hardware to software, will have to get a lot better.