The team is aware that this is a medium that still isn't quite there. As he introduced the press screening, HTC Vive VP Joel Breton said that there are "so many problems with VR filmmaking. We're pioneering here; there's no process to follow. It's trial and error."
I watched three episodes. 7 Miracles felt less like Passion of the Christ (executive producer Enzo Sisti was involved with both) than a made-for-TV movie about Christianity. The struggle with filming in 360 degrees is that you're spreading the resolution (even if it's 8K) across a huge canvas. So, despite the high-end technology used, it didn't feel like a grand feature film.
If you've watched 360-degree content on a VR headset, it's business as usual. If you haven't, the experience is closer to immersive theater than watching TV or movies: it's up to you to pick up on the nonverbal cues, or crane your neck to figure out which apostle is talking to which. While 7 Miracles does a good job of centering a scene in your main field of view, it doesn't have the same crisp visual cadence of 2D media. There are really no close-ups, unless characters like Mary, Jesus, Judas or Lazarus intentionally get closer to the viewer/camera.
7 Miracles does play around with camera angles at points, but this is more jarring than innovative. VR projects give you a sense of being there (that would be the "virtual presence"), so having your placement and point of view wrenched away is uncomfortable at times.
At times it was hard to figure out who was talking: theater usually limits the number of characters speaking onstage at once, because it can get a little chaotic -- I personally like to see and identify who's talking. VR films can suffer from the same problem, and hints from the audio only help so much. This was especially true when, during one of the episodes, you're surrounded by a circle of people, with multiple actors talking pretty much over your head.
The technical showcase, however, comes courtesy of photogrammetric scenes that exist separately from the "miracle" episodes. They look (and feel) completely different -- and that's a good thing. That's not to say that these shorter scenes, which combine photogrammetry and volumetric video capture, result in crisper visuals. Rather, the result is somewhere between CGI and motion pictures.
Let's set the stage. You're placed in the cave of Lazarus, who's still dead at this point. Jesus is also inside, standing above the wrapped body of the deceased. Equipped with the HTC Vive Focus, which is able to detect and adjust to your movement in the virtual space, you can wander around the scene, peering over the body of Lazarus as he's resurrected and walks out of the tomb. Both of the characters kind of undulate, as seams of limbs or clothing roughly fold apart or join. It adds a rough, animated feel to the scene, which is more visually interesting to me than the theater-made-for-TV dramatics of the rest of 7 Miracles, which reminded me of religious studies classes at school and VHS tapes.
Technical producer Danilo Moura, who also worked on Buzz Aldrin's VR project that debuted at SXSW, explained the process -- and it's not a simple one. The team used a Leica BLK360 scanner to scan this fictional tomb of Lazarus. After geometrically mapping the area, the team then had to capture textures, which was done with a more typical choice of camera -- the Nikon D800. Actors, meanwhile, were captured separately as volumetric data in front of a green screen -- not in the tomb.
More VFX tech was folded in to capture information on light and color and match it with the virtually generated tomb. And then came the grand task of using post-production tools to mesh it all together. "This technique helps to immerse the viewer," noted Moura. Beyond letting you move freely through the scene, the technology offers a more accurate point of view, regardless of your height. The rest of the feature is viewed while sitting down, from a static POV.
The goal with photogrammetry here was to elevate the "miracle moment," as Moura puts it -- maximizing the scenes from the feature that have the most impact. While environment scanning took roughly two hours, and capturing the actors took an afternoon, post-production and data crunching for a scene that lasts only a few minutes took roughly four weeks.