After a quick run-through with Lorraine Bardeen, GM of Windows and HoloLens Experiences, I was left alone in the living room to discover the digital information all around me. The juxtaposition of three-dimensional holograms and real objects created a mixed reality. Interacting with the projections peppered around the room quickly started to feel like some sort of technological hallucination: the holograms were visible only to me.
One of the first things I noticed about the headset was its see-through "holographic lens," held in place by a matte black frame that looped around my head. I could slide the lens forward and back ever so slightly to fix it at a comfortable distance from my face (with enough room for my large-framed prescription glasses, too). A second black rim with an adjustment wheel tightens the device around the head; it supports the entire contraption and keeps it firmly but comfortably in place.

The headset, which weighs a little more than one pound, is packed with sensors, a custom-built holographic processing unit and a ton of cameras. There's a front-facing camera and four environment-mapping cameras -- split into pairs on the left and right corners of the lens -- that make the precise placement of holograms possible.
The HoloLens experience requires a complete engagement of the senses. It's not trying to hack them to create a virtual world; instead it works with sight, sound and movement to create an interaction among human, machine and environment. That's what makes the experience feel unique yet natural. While the system imposes an artificial overlay on reality, it relies on intuitive controls like direction of gaze, gestures and voice.

Staring at projected objects to indicate intent is a big part of the HoloLens experience. Before I could take the contraption for a spin around the room, I had to calibrate my eyes to make sure the holographic effect worked the way its creators intended. Calibration starts with an automatic interpupillary distance feature that measures the distance from the center of one pupil to the other. I looked at an introductory app projected on the floor, held out my right index finger and pushed it down to make the "tap" gesture, which indicates a selection.
For the calibration, I closed my left eye and looked through my right eye at the image in front of me. Three taps later, I repeated the same process for my left eye by closing the right. The process, which took a few seconds and was saved on the device for future interactions, made sure the holograms in the room were customized for my eyes.
A mechanical voice proceeded to tell me how I could use gestures and my gaze to indicate what I wanted. I had already learned how to tap to select an app; now it taught me how to exit one. Following instructions, I raised my hand into my field of view and flared my fingers up and out like a blooming flower. The "bloom" gets you out of an experience in an instant.
Now that I knew how to start and quit the apps floating around me, I was set for a barrage of demos. The first app I tried was a browser. The Engadget website was preloaded in front of me (it was the live site, too; I refreshed to check the front page stories). After messing around with the internet projected on a wall, I quickly launched into RoboRaid, a first-person shooting game that used spatial mapping of the room (done at the beginning of the demo) to project alien enemy creatures on the walls around me. Seconds into the game, I heard crunching sounds on my left, and I swiftly turned to spot the mini-Transformers-looking creatures bursting out of the walls. I saw them, zapped them, destroyed them. They hurled fireballs at me that I dodged. The game was straightforward. But it illuminated the profound possibilities of projections that are loaded with 3D sound.
I could turn up the sound of the holograms with small, inconspicuous buttons on the right arm of the headset. But even at the highest volume, the ambient sounds in the room weren't blocked out. The speakers are concealed in one-inch bars on either side of the headset, keeping the sound close to the ears without overwhelming them the way headphones do.
In addition to the speakers, there are four microphones in the device, so at any point in the demos I could call on Cortana, Microsoft's virtual assistant, which has been integrated into the holographic system. I could switch from using my index finger to communicate my needs to simply stating them. As a bonus, I could ask the assistant to take pictures and videos of my imaginary experience with the Mixed Reality Capture feature. There's a micro-USB port on one side to download those captured shots.
"Hey Cortana, record a video," I'd tell her when I wanted to capture shots in a game or an app. Almost every time, she threw up a string of random words ending in "video" in my field of view. She didn't always decipher my words accurately. (My accent, while clearly understood by humans in my daily interactions, is currently beyond the comprehension of most digital assistants.) But she understood the intent and started recording almost every time -- except for the time she thought I said "God video" and opened a browser for me, as if to say, "You're on your own for this one."
Despite Cortana's occasional struggles to truly understand me, the voice-activated interaction feels ingenious and necessary. It makes the application possibilities wide-ranging -- interaction between an astronaut and a NASA operator on the ground, collaborative work among a group of designers, a Skype call with a family member or an expert dialing in to help fix a broken refrigerator. Using my voice quickly became a crucial part of the experience.
Through some of the experiences, the voice in my head -- err, the one coming from the headset -- often reminded me to move around the room. I kept forgetting I was untethered. Having tried enough VR (some of it nausea-inducing), I had adopted a stand-still-and-be-immersed approach: partly because wired headsets like the Oculus Rift curtail movement, but mostly because walking around in VR, completely disconnected from reality, can feel precarious.
My legs didn't quiver at any point through the many experiences of HoloLens. Even when I walked the simulated cobbled streets of Italy in HoloTour, a travel app locked inside the floating globe on the couch, I felt immersed but not disconnected from the reality of the hotel suite I was in. And that's precisely the point. The magic happens because the holograms mediate reality, not replace it. It's a different kind of immersion, one that leaves you neither nauseated nor disoriented.
Not all the applications are geared toward immersive experiences, though. Practical apps like HoloStudio let you create your own models of places and objects. A pop-up menu lets you drag and drop holograms and even resize them, and you can replicate those movements outside the app: each hologram can be adjusted and moved with a tap-and-hold gesture. It's a useful feature, but it wasn't the easiest one to learn. I struggled to relocate a yellow puppy projected on the floor. But once I got the hang of it, I noticed how little latency there was -- the hologram moved in sync with my gesture.
Although I had trouble engaging with the lifeless dog in the room, the holograms loaded with spatial sound were incredibly believable. 3D audio is critical to an immersive experience like VR; it's the thing that draws you in and tricks your brain into thinking the simulated is real. But audio cues are just as significant, if not more so, to the experience of augmented reality. What good is a holographic RoboRaid invader if you can't hear it firing at you from behind?
Spatial sound makes you look in the direction of the virtual characters. In Fragments, a murder-mystery game that's a cross between Minority Report and Clue, I was given both audio and holographic clues to solve a crime. An AI helper -- a pale-skinned, dark-haired man in an indigo spacesuit -- narrated events and guided me through the game. But the thing that drew me in was the sound of sobbing. It took me a second to find the source, but the weeping alerted me to the presence of a young boy on the floor in front of me, below my line of sight. The murder scene was right next to the couch.
Projecting sound in front of the listener is an incredibly hard feat in spatial audio. But the audio cues were spot-on in the game. They weren't just helping me locate the holographic people in the room, they were significant to the believability of the mixed-reality experience.
Sounds aside, the characters in the game sat on the furniture and were able to pinpoint my location in the room. They fully inhabited the space to complete the illusion. At one point in the experience, when the leader of my mysterious investigative group showed up, she knew exactly where the couch was. She was sitting on it. Remember, HoloLens had mapped the entire suite with its sensors at the beginning of the demo.
Minutes later, when she addressed the room, while looking away from me, I realized we weren't the only ones in the room. On my right were three other key members of the investigative team. Sarah, the one standing closest to me, looked straight at me when she introduced herself. The hologram had sensed my presence.
Gallery: Microsoft HoloLens hands-on | 12 Photos
The virtual crew was clearly simulated, but the HoloLens projected them with stunning clarity. As Bardeen would later tell me, Microsoft has devised its own terms to define the resolution of its holograms; the absence of a benchmark for 3D projections makes it hard to measure and compare the clarity and density of these objects. Visually, though, the projections varied drastically. While the puppy and the pop-orange tiger were clearly cartoony avatars, the characters in Fragments were much more detailed, comparable to existing Xbox One graphics. I noticed the lines on their faces and the creases in their clothes.
Throughout the hour and a half of demos, I continuously found myself struggling with the limited field of view, which remained unchanged across all HoloLens experiences. The holograms appear only in a rectangular frame right in front of the user, leaving a wide gap on both the right and the left. But I'm told the peripheral view bordering the holographic frame is intentional, even necessary, to the experience: augmented reality is all about overlaying information onto real environments, and it works when the room is constantly in your view.
The unguided experience showcased both the strengths and the pitfalls of the HoloLens. But Microsoft is ready: with the technology packed into a sturdy and surprisingly comfortable headset, it's time for developers to test the holographic power of a computer that's been years in the making. More importantly, it's time for app creators to unleash the potential of the medium. While NASA has already sent a couple of headsets to the International Space Station to assist crew members, down here on Earth, Microsoft hopes the device will transform the way we learn, teach, communicate and collaborate in the future.
It's easy to draw comparisons between the two new mediums that are starting to shape new realities -- the virtual and the augmented. But everything from the physical headsets and the experiences to the human impact and the future applications is entirely different.
As my demo drew to a close, I found I had no trouble returning to the reality of the hotel suite. Unlike with VR, where I was often disoriented after taking off the headset, this time my brain wasn't struggling to differentiate between the real and the virtual. But I had gotten used to seeing things that weren't really there. After I handed back the HoloLens, I walked past the bathroom to the empty spot where I'd parked my bag at the beginning of the demo. I stopped and instinctively turned to look for that bright pink octopus on the tiled floor.