The first day on set, directors Joe and Anthony Russo wasted no time getting Brolin into a motion capture helmet and suit to test out some of his lines. But they also went a step further. "Instead of cutting when they stop doing the lines, we just kept the motion capture going," DeLeeuw said. "We kept when he was just experimenting with the different lines and how he would approach Thanos."
Using those off-the-cuff takes, Marvel Studios was able to capture nuances that DeLeeuw hadn't originally planned for. "Just being able to read almost imperceptible movements in his face... movements in his eyes and his cheeks, and then you know later on to show his frustration or sadness with Gamora, or his anger with Tony... just really bring a character like that to the screen, I think was one of the biggest challenges," he said.
"Doug Roble, the guy that's working on that [Digital Domain] software, said something along the lines of, 'If you're not using machine learning in your software, you're doing it wrong,'" DeLeeuw said, recounting a recent visit to the VFX company. Looking ahead, the technology will be used for more than just faces; it could help with things like water simulations. Eventually, you can expect machine learning to play a role in just about every aspect of visual effects.
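The article doesn't describe Digital Domain's actual algorithms, but the general pattern behind "machine learning in your software" can be sketched with a toy example: train a model on frames an artist has already dialed in by hand, then reuse it to drive the rig on new capture data. Everything below (the marker offsets, the "frown" blendshape weight, the linear model) is a made-up illustration, not the studio's pipeline.

```python
# Hypothetical sketch only: learn a mapping from a captured facial-marker
# offset to a character-rig blendshape weight, using hand-keyed reference
# frames as training data. Plain-Python least squares via gradient descent.

def fit_linear(xs, ys, lr=0.1, steps=2000):
    """Fit y ~ w*x + b by gradient descent on mean squared error."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Training data: brow-marker displacement (cm) -> "frown" blendshape weight,
# as an animator might have hand-keyed on a few reference frames (made up).
marker_offsets = [0.0, 0.2, 0.4, 0.6, 0.8]
frown_weights = [0.0, 0.25, 0.5, 0.75, 1.0]

w, b = fit_linear(marker_offsets, frown_weights)

# Apply the learned mapping to a marker offset from a new capture frame,
# instead of hand-keying that frame too.
new_offset = 0.5
predicted = w * new_offset + b
```

The same train-then-apply pattern scales up to the face-capture and simulation work mentioned above, just with far richer models and far more data.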