However, when I was able to use the one-handed "pick up" gesture and move virtual objects around, it was a pretty great feeling. Using two hands to stretch and enlarge digital items also felt natural, when it worked. Given that the Meta 2 is still a prototype, I won't judge it too harshly, but the gesture experience needs to be refined before it launches. Unfortunately, we said much the same way back in 2013 when we tried the Meta 1, so hopefully more improvements are coming soon.
While gestures still need work, object permanence was a fascinating bit of technology, and one that Meta highlights as key to making the headset function as a work-focused device. As part of the demo I experienced, the Meta 2 showed off a handful of virtual screens with various apps on them. It was mostly a variety of different web browsers, but the Meta 2 also currently works with Microsoft Office, Spotify and Zoom video conferencing. The virtual monitor setup was particularly compelling -- I started with just one screen but eventually had a row of five, and then two rows of five stacked on top of each other. Again, it's just a prototype, but it showcases the flexibility that a virtual screen setup would offer.
Since the virtual objects created by the Meta 2 stay where you place them, you can do your work, remove the headset and then go back to it; things will be left just where you want them. And while I was still wearing the headset, I could walk around my virtual monitors -- it was a strange experience to see them disappear into nothingness as I looked at them from the side.
It's not really useful, but it's a good way to highlight object permanence. Meta makes this even crazier by letting multiple headset wearers collaborate on an object -- you'll all see the same thing from different angles, depending on where you're physically located. And since the glasses let you see into the real world as well, it's not isolating like a virtual reality experience would be.
A few other experiences in the demo stood out for me as ways that Meta's AR technology could become truly useful. Most notable was when a transparent human body floated into view, revealing the skeletal structure, circulatory system and so forth. You could grab the body, pull the various systems apart to examine each individually, and walk all the way around it -- a pretty great trick, and something that could be genuinely useful. In a similar vein, clicking on the picture of an item on sale at Amazon and having it float into my field of view to enlarge and spin around in 360 degrees could be a pretty handy tool when shopping at home.
As I've mentioned, this is all very much in the "tech demo" phase, but that doesn't make it any less interesting. The combination of object permanence, gesture controls and a very wide field of view -- 90 degrees diagonal -- adds up to a compelling AR experience. The headset itself is still a bit clunky and not something I'd want to wear all day long, but at least you can wear your glasses while using it.
The hardware needs tweaking, but the real question is what kind of software experiences developers will start building with it. Meta's been working on AR for a long time and is playing the long game here by giving developers an improved tool for creating the experiences needed to make this take off. While the Meta 2 might not be ready for prime time yet, it does seem like a step forward from both a hardware and software perspective.