Advanced Computing in the Age of AI | Thursday, March 28, 2024

AI at the Oscars: De-Aging and Other ‘Digital Human’ Special Effects 

At last night’s Academy Awards, two nominated movies in the visual effects category leaned heavily on AI – one to reverse the aging process of the famous stars in Martin Scorsese’s “The Irishman,” the other to create the digital arch-villain Thanos in “Avengers: Endgame.”

Powering the AI enhancements are the George Lucas-founded VFX studios Industrial Light & Magic (ILM) (for “The Irishman”) and Digital Domain (for “Avengers”), both of which used Nvidia Quadro RTX GPUs to accelerate production.

In a blog post, Rick Champagne, Nvidia’s head of Industry Strategy and Marketing, Media & Entertainment, explained that the challenge faced by Scorsese and his production team lay in “The Irishman’s” decades-long plot, which follows the main characters, played by Robert De Niro, Al Pacino and Joe Pesci, from World War II to a nursing home more than 60 years later. All three actors are now in their 70s, which is fine for the nursing home scenes, but what about the scenes in earlier decades?

“A makeup department couldn’t realistically transform the actors back to their 20s and 30s,” said Champagne. “And… Scorsese was against using the typical motion capture markers or other intrusive equipment that gets in the way of raw performances during filming.”

So ILM developed a three-camera technique for filming the actors’ performances: the director’s camera flanked by two infrared cameras that captured 3D geometry and textures, according to Champagne. The special effects team also developed ILM Facefinder software that used AI to sift through thousands of stills of the actors’ performances in movies at various stages of their lives.

“The tool located frames that matched the camera angle, framing, lighting and expression of the scene being rendered,” Champagne said, “giving ILM artists a relevant reference to compare against every frame in the shot. These visual references were used to refine digital doubles created for each actor, so they could be transformed into the target age for each specific scene in the film.”
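At its core, the kind of retrieval Champagne describes — matching a rendered frame against an archive of stills by camera angle, lighting and expression — can be thought of as a nearest-neighbor search over feature vectors. The sketch below is a hypothetical illustration of that idea, not ILM’s actual Facefinder implementation; the feature encoding and the `best_reference` helper are assumptions for demonstration.

```python
import numpy as np

# Hypothetical sketch: suppose each archival still and the frame being
# rendered are described by a small feature vector (camera angle, lighting,
# expression attributes). Facefinder-style retrieval then reduces to
# finding the nearest still in that feature space.

def best_reference(target_features, library_features):
    """Return the index of the archival still closest to the target frame."""
    diffs = library_features - target_features   # broadcast over all stills
    dists = np.linalg.norm(diffs, axis=1)        # distance per still
    return int(np.argmin(dists))

# Toy example: four archival stills, each described by three features
library = np.array([
    [0.0, 0.2, 0.9],
    [0.5, 0.5, 0.1],
    [0.9, 0.1, 0.4],
    [0.2, 0.8, 0.7],
])
target = np.array([0.1, 0.25, 0.85])
print(best_reference(target, library))  # -> 0 (the first still is closest)
```

A production system would use learned embeddings and search over thousands of stills, but the retrieval step — closest match in feature space — is the same shape of problem.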

“AI and machine learning are becoming a part of everything we do in [visual effects (VFX)],” said Pablo Helman, VFX supervisor on “The Irishman” at ILM. “Paired with the NVIDIA Quadro RTX GPUs powering our production pipeline, these technologies have us excited for what the next decade will bring.”

As for Marvel’s blockbuster “Avengers: Endgame,” the highest-grossing film ever, it included more than 2,500 visual effects. Using machine learning to animate Josh Brolin’s performance, VFX teams at Digital Domain created the film’s villain, Thanos.

According to Champagne, a machine learning system called Masquerade was developed to take low-resolution scans of the actor’s performance and facial movements, and then transfer his expressions onto the high-resolution mesh of Thanos’ face, eliminating the need for VFX artists to manually animate the character’s facial movements.

“Key to this process were immediate realistic rendered previews of the characters’ emotional performances, which was made possible using NVIDIA GPU technology,” said Darren Hendler, head of Digital Humans at Digital Domain. “We now use NVIDIA RTX technology to drive all of our real-time ray-traced digital human projects.”

It should be noted that neither movie won the VFX Oscar; it went instead to Guillaume Rocheron, Greg Butler and Dominic Tuohy for “1917.” But it can be argued that everyone associated with an Oscar nomination is a winner. As Dustin Hoffman said in his 1980 Oscar speech: “to that artistic family that strives for excellence, none of you have ever lost.”

EnterpriseAI