At the University of Southern California, where he is a member of the faculty, McDowell heads up the World Building Media Lab, an academic effort that unites experts and students from across several disciplines (e.g., game design, sound design, improvisational theater, engineering, cinematography) to pioneer the art of storytelling in what he calls the "post-cinematic" world.
The 'hacked' Oculus Rift headset used for The Leviathan Project.
It's a heady term, for sure, but what McDowell is referencing is the new, immersive space that technologies like virtual reality and augmented reality have engendered for creatives. The combination of those two mediums has required a rethink of how narrative is constructed -- "You gotta get more and more cross-disciplinary in your radar. And I have to learn the basis of a lot more languages to be able to design in this space," he says -- and of how audience engagement shifts from passive to active. McDowell is, quite simply, helping to redefine entertainment and, someday, by extension, education.
"There's a handful of people that can go down to two or three thousand feet in a submersible, but that's about it," says McDowell, explaining the basis of a potential edutainment project his team's considering. "In VR, we could put millions of people down in the depths of the oceans. And if you're giving them real-world, science-based data that's informing that world space, then they're getting that real knowledge in a way that's different than watching a documentary film."
The undersea world McDowell plans to create doesn't borrow from any of the fantasy elements that define the flying whale research lab of Leviathan. Instead, he says this potential project would incorporate real-world marine life, and show the impact of pollution and the coral bleaching that's resulted from rising temperatures. The aim is to provide viewers with firsthand knowledge and experience of the deep seas previously reserved for multi-millionaires like James Cameron.
Sensors placed on the backs of hands and nearby objects allow for interaction between the physical and virtual worlds.
But first, McDowell and his team at USC, along with partners Unity Technologies, Intel and 5D Global Studio, have to finish iterating on Leviathan. The project, currently in its second incarnation, was brought to Sundance after three months of breakneck work specifically so the team could gain insight into audience engagement. And based on early feedback, McDowell's already discovered several areas where the team could refine the experience, like eliminating the artificial time limit for creating a hybrid squid-like flying creature known as Huxley, or reducing the excessive narration that guides you through it.
"It's a failure of the state that we're in that you need ... that idea of a directed narrative. If the space is working, you shouldn't need anything. And I think we're a little bit in the stage of people don't know enough about how to behave in these kind of [VR] spaces that we're preemptively thinking that you need some sort of instruction," says McDowell.
The Leviathan Project is an interactive installation, adapted from the novels by Scott Westerfeld. In McDowell's interpretation, you set up shop inside a lab situated in the belly of a flying whale that's en route from London to Moscow in 1895, and tinker with genetics. The project has a twofold purpose: Using a "hacked" Oculus Rift, viewers go on a task-based journey through VR that incorporates haptic interaction (i.e., you can pick up and manipulate physical objects with virtual consequences). Afterward, they can also view the creatures from the experience in the real world, using an AR-enabled tablet powered by Intel's RealSense depth-sensing camera.
"In VR, we could put millions of people down in the depths of the oceans. And if you're giving them real-world, science-based data, then they're getting that real knowledge in a way that's different than watching a documentary film."
McDowell linked up with Intel around the time his World Building Media Lab was established in 2012. The company, which had secured the rights to Westerfeld's novels, wanted to begin experimenting with emerging technologies.
"Intel was saying, 'Taking on board the fact that cinema and theater and game and film and TV and all those things exist, and they're not going away, how might we weave all of that together into a new semantic workflow platform, a creative process for the post-cinematic?'" he explains. "Whatever that may be. And VR and AR were the provocations."
Intel's RealSense-enabled tablet lets the Leviathan fly through the physical world.
Though several other literary properties, like the sci-fi works of China Miéville, were considered, McDowell and Intel settled on Leviathan because of the "self-contained ecosystem" the whale represented and the opportunities it afforded to the "evolution of narrative." Eventually, they brought the project to life at the Consumer Electronics Show in 2014 as part of Intel's CEO keynote.
"We flew an 80-foot whale off the screen and over 5,000 people," he says of that early AR effort. "We kind of understood that you could get a real audience engagement out of this sort of experience. And then we developed that into apps and started thinking about how you could engage this little 'Huxley' engagement on the tablet."
The apps McDowell's referring to are live on the Google Play Store and iTunes. They're an offshoot of The Leviathan Project that lets users interact with a Huxley and take photos of it blending into the real-world environment. But before you rush to download them, be aware that the apps require a special password or scanned AR marker to activate. So far, these have only been made available to attendees of Sundance's New Frontier exhibit.
A 'Huxley' comes to life in The Leviathan Project.
Though The Leviathan Project, with its AR and VR components, reads like an attractive entertainment option for consumers, it's actually not at all intended for commercial release. It's all part of a grand push-and-pull experiment McDowell believes will help shape storytelling, as well as the technologies -- and the companies designing those technologies -- that are shaping our future.
"I think part of the transaction here with the technology companies is we are going to be able to give back empirical data to say 'The audience prefers this kind of engagement' or 'We got great feedback from this,'" says McDowell. "And hopefully that shifts the way some of the technology evolves. That's a big part of our job as artists: to say the emotional needs of this story are this, and therefore the technology has to kind of adapt. As much as the technology triggers all sorts of tools for us that we wouldn't have otherwise, it's a very symbiotic relationship now."