How augmented reality put five Madonnas on stage at once

The pop icon danced with digital versions of herself at the Billboard Music Awards.

At Wednesday's Billboard Music Awards, Madonna performed her latest single, "Medellín," with Latin singer Maluma. However, they (and their flesh-and-blood dancers) weren't alone on the Las Vegas stage. Several virtual versions of the pop icon joined them: a secret agent, a musician, a cha-cha instructor and a bride. Augmented reality brought Madonna's personas to life with the help of volumetric capture -- essentially 3D video -- and Unreal Engine.

The avatars (not holograms, so they weren't visible to the naked eye) wove in and out of the inventive performance, bursting into butterflies and puffs of smoke. Several environmental effects livened up the show, including digital rain, clouds, greenery and splashes of color, all blending with the physical staging in an attempt to tell a cohesive story. Madonna has reinvented herself countless times over her storied career, so it's perhaps little surprise that she tried something like this.

Jamie King, Madonna's long-time creative director, said he was looking for something special for the BBMAs. "After meeting with [Madonna's manager] Guy Oseary, we settled on the idea of incorporating augmented reality into the performance," he told Engadget. "I wanted to explore a way to involve her Madame X personas into the performance as well as the possibility of the real Madonna actually being able to perform with [them]."

The team brought the concept to Sequin, a new creative AR company, which took on the challenge of piecing the performance together. Not only was it the first time Madonna and Maluma had performed the song live; it was also Sequin's first project.

While you might not recognize the name, you'll probably be familiar with the work of co-founders Lawrence Jones and Robert DeFranco. At The Future Group, their projects included those dramatic flooding visualizations for The Weather Channel, an AR-enhanced performance by K/DA at last year's League of Legends World Championship Finals and effects for this year's Super Bowl, for which they were nominated for an Emmy.

Jones, who oversees creative, production and technology development at Sequin, believes it was the first time there's been a broadcast AR performance using volumetric capture, which he called "the next revolution" of the medium. "What's new about this is that it's a completely choreographed performance where Madonna and Maluma are dancing with four digital versions of Madonna in perfect choreography," he told Engadget in an interview.

The show was something of a global affair. The volumetric capture process took place at a studio in London, while a Canadian company created the digital assets and environments in Unreal Engine for Sequin to pull together.

Unreal Engine is increasingly being used outside the confines of its gaming origins these days. Creatives in fields including film, virtual reality and, of course, AR are tapping into its potential. "The interesting thing about real-time visual effects in broadcast augmented reality [is that] a big portion of the work is happening in pre-production," Jones said. "All the creation of the assets, all of the animation, most of the lighting is all done ahead of time."

A critical aspect of making performances such as this work is real-time camera tracking. Jones and his team use a tool called Brainstorm, layering broadcast objects, including motion graphics, character generation and real-time data, on top of Unreal Engine. Jones explained that Brainstorm feeds data from the physical cameras into Unreal Engine so that everything on the real set lines up with its digital replica, ensuring the AR renders are in the right place at the right time.
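To make that idea concrete, here's a minimal sketch of the per-frame registration step, assuming a simplified pipeline. The CameraPose fields, VirtualCamera class and attribute names are hypothetical stand-ins for whatever the tracking system and render engine actually expose -- not Sequin's or Brainstorm's real API.

```python
from dataclasses import dataclass

@dataclass
class CameraPose:
    """One frame of tracking data from a physical broadcast camera (hypothetical fields)."""
    position: tuple         # (x, y, z) in meters, in a shared stage coordinate system
    rotation: tuple         # (pan, tilt, roll) in degrees
    focal_length_mm: float  # drives the virtual camera's field of view
    timestamp: float        # seconds on the production clock, for sync

class VirtualCamera:
    """Stand-in for the render engine's camera object."""
    def __init__(self):
        self.position = (0.0, 0.0, 0.0)
        self.rotation = (0.0, 0.0, 0.0)
        self.focal_length_mm = 35.0

    def match(self, pose):
        # Mirror the physical camera exactly each frame; any mismatch
        # shows up on air as AR elements drifting off their real anchors.
        self.position = pose.position
        self.rotation = pose.rotation
        self.focal_length_mm = pose.focal_length_mm

# Per broadcast frame: tracking data in, virtual camera updated,
# then the engine's AR layer is composited over the live video feed.
cam = VirtualCamera()
frame = CameraPose(position=(1.2, 0.0, 3.5), rotation=(15.0, -3.0, 0.0),
                   focal_length_mm=50.0, timestamp=12.345)
cam.match(frame)
```

The design point the sketch captures is that the virtual camera is entirely a follower of the physical one: the tighter that mirroring, the more convincingly the avatars stay planted on the stage.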

Once Sequin was on site in Las Vegas, the team tweaked the production so it would fully integrate with the actual performance setting, making adjustments for factors such as lighting, shadows, reflections, timing and placement. According to Jones, the Billboard Music Awards and Dick Clark Productions (which produced the broadcast) were "essential in getting this to happen" and were "super accommodating" to Sequin and Madonna. Sufficient stage time was vital to make sure the live and virtual aspects lined up correctly -- no mean feat for a show with more than a dozen performers who all needed rehearsal time.

Complex performances such as this are driven by time codes, as Jones noted. Everything from lights, music and pyro effects to graphics has its own time-code triggers. Renders do, too, of course. Jones worked with Carla Kama, another creative director for Madonna, to design the shots and make sure everything matched up.
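As a generic illustration of how such triggers can work (a sketch under assumed conventions, not the show's actual cue system), each element can be keyed to an SMPTE-style "HH:MM:SS:FF" timecode and fired as the master clock reaches it. The cue list below is invented:

```python
def timecode_to_frames(tc, fps=30):
    """Convert an 'HH:MM:SS:FF' SMPTE-style timecode to an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

# Invented cue list: each element of the show keyed to its own trigger.
cues = [
    ("00:00:04:00", "lights: blackout to spotlight"),
    ("00:00:10:15", "graphics: spawn 'secret agent' avatar"),
    ("00:01:02:00", "pyro: stage-left burst"),
    ("00:01:02:00", "graphics: avatar bursts into butterflies"),
]

def fire_due_cues(master_frame, pending):
    """Fire every cue whose timecode has been reached; return the rest."""
    remaining = []
    for tc, action in pending:
        if timecode_to_frames(tc) <= master_frame:
            print(f"[{tc}] TRIGGER -> {action}")
        else:
            remaining.append((tc, action))
    return remaining

# With the master clock at 00:01:02:00, all four cues above fire.
pending = fire_due_cues(timecode_to_frames("00:01:02:00"), cues)
```

Because lights, pyro, music and AR renders all hang off the same master clock, a single shared timeline keeps the physical and digital halves of the show in lockstep.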

The eventual performance used three AR-enabled cameras: the crane (or jib), a front-of-house camera and a wireless Steadicam. "The wireless one is more of a novel one, because it's allowing you to have a free camera roaming around and being able to have these really cool dynamic, fluid and organic shots," Jones said.

While it wasn't flawless and found its fair share of detractors on Twitter, the display showcased the cutting edge of AR and volumetric capture. Attendees at the awards show were able to see Madonna's AR guises on screens inside the MGM Grand Garden Arena. However, the creative team wasn't too concerned the AR aspects would distance attendees from the live performance, which was fairly lively even without the digital elements, conga line and all.

Given that the live audience numbered in the thousands and the TV viewership will extend into the millions, "I think the pros outweigh the cons there," Jones said.