How “Avatar: The Way of Water” blurs the line between reality and fantasy!
If you compare the underwater shots from 2009's "Avatar" to 2022's "Avatar: The Way of Water," the improvement is pretty clear. And a look behind the scenes can help explain why.
The first movie was shot on a dry set, with actors mimicking what it'd be like to move underwater. But for the sequel, they filmed these dynamic scenes, well, actually underwater, creating some of the clearest animated water ever captured for a film. "Avatar" was a stunning visual achievement in 2009, but 13 years of technical advancement made the second film even more visually groundbreaking, and in few places is that more evident than in the characters' faces. VFX artists brought the Na'vi to life by capturing the actors' real facial expressions with motion capture and applying them to 9-foot-tall characters.
In the first "Avatar," the cast wore helmets with one tiny camera on them. The initial system was effective, but on a much more surface level. The tech advanced with 2019's "Alita: Battle Angel," where Wētā switched from one camera to two to capture actor Rosa Salazar's performance. The studio also used this two-camera system for "The Way of Water," and if you look closely, you can see the improvement in the facial expressions captured for the Na'vi characters. The extra camera adds a layer of depth and even more data for animators to perfect the final shots. The character work doesn't end once the performance is captured. Animators can and do adjust the expressions on a CGI character's face in postproduction. On the first "Avatar," VFX artists used a pretty common animation tool called Blend shapes. Blend shapes allowed them to adjust and heighten the facial expressions of Na'vi characters. They'd previously used the tool on movies like "The Lord of the Rings" to animate different expressions on Gollum, especially as he moved from one personality to another. But Blend shapes only allowed artists to manipulate the surface of a character's face, and Wētā wanted to go deeper. The studio made advancements with Thanos in the "Avengers" movies, going down a few more layers. You can see the difference in this test footage. This is facial manipulation using the surface-level Blend shapes. This is the same thing, but on a deeper muscular level to create more natural expressions. Still, Wētā wanted to push things further.
For "The Way of Water," Wētā was able to digitally reconstruct an actor's face based on the two-camera performance-capture footage. From there, the animators could measure how the face muscles under the surface strained during their performances. They used these muscle strains as data to more precisely adjust the Na'vi facial expressions. Working with facial muscles creates everything from more realistic blinking to the perfect eye roll and a new and improved smile. The animators could pull the corners back on the animated puppet's mouth, and it wouldn't stretch too far but could still stretch from cheek to cheek for a fuller, more realistic smile. Picking up the subtle nuances of an actor's performance was crucial for filming underwater.
Picking up the subtle nuances of an actor's performance was crucial for filming underwater. That shot of Jake going through the rapids in the original film was done dry-for-wet, meaning the performer acted out the scene on a dry set, so the reference footage wasn't as accurate. The water in "The Way of Water," and the way the characters swim through it, looks much more like the real thing. The first step to achieving this was filming hours of footage of the cast acting underwater. The crew built a performance-capture volume in a tank 120 feet long, 60 feet wide, and 30 feet deep, then built another volume directly on top of it for shots where characters pop in and out of the water. The need for such an elaborate set wasn't the only reason this had never been attempted before: getting clear underwater footage is a challenge all on its own. Take lighting. Water creates all kinds of unwanted reflections. Wētā ran into the same issue shooting the daytime performance-capture shots for the "Planet of the Apes" trilogy, and for those films the studio switched to infrared light.
That works for surface shots, but infrared light is useless underwater, because water quickly absorbs red and infrared wavelengths. So, to keep the capture volume visibly lit above and below the surface, the crew used ultra-blue light underwater and infrared above it.
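The physics behind that choice is the Beer-Lambert law: light falls off exponentially with depth, and water's absorption coefficient is orders of magnitude higher for red and infrared light than for blue. A quick sketch with ballpark absorption values (approximate figures for pure water; exact numbers vary by source):

```python
import math

# Rough absorption coefficients of pure water, in 1/meter. These are
# ballpark literature values; exact figures vary by measurement.
ABSORPTION = {
    "blue (~450 nm)":     0.01,
    "red (~700 nm)":      0.6,
    "infrared (~850 nm)": 4.0,
}

def transmitted_fraction(k, depth_m):
    """Beer-Lambert law: I/I0 = exp(-k * d)."""
    return math.exp(-k * depth_m)

depth = 5.0   # meters, well within the 30-foot-deep capture tank
for band, k in ABSORPTION.items():
    print(f"{band}: {transmitted_fraction(k, depth):.1%} remains at {depth} m")
# Blue keeps ~95% of its intensity; infrared is essentially gone,
# which is why the underwater volume was lit with blue light.
```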
On "The Way of Water," he used these opaque white Ping-Pong balls. And because the balls could be easily separated, they wouldn't get in the way of the actors' movements. You can really spot the difference in this shot from the first "Avatar" where Jake pops out of the water compared to this shot of Lo'ak and Payakan swimming right under the surface. There was another obstacle to getting underwater footage: bubbles. If the cast were to wear scuba gear, air bubbles would make it impossible for the performance-capture cameras to get a clear read of their faces. So, they had to hold their breath for long stretches of time, and this required six months of intense training in diaphragmatic breathing. Increasing their lung capacity allowed them to hold their breath for at least five minutes at a time. By the end of that training period, Kate Winslet broke the world record for longest breath held on a movie set. But it would take a lot more to make the actors swim as fast as their digital counterparts.
Unlike other Na'vi clans, the Metkayina, introduced in the sequel, are a water-based tribe, so they were designed with features like larger chests for holding their breath and much larger tails to propel them forward. Whenever Winslet and the other performers playing the Metkayina had to swim, the stunt team equipped them with underwater jet packs. The VFX team would then match that swimming speed, swapping out the jets for tails kicking at the same rate (a simple speed-to-frequency conversion, sketched below). They had to step it up a notch whenever the characters rode an ilu or a skimwing: while the actors rode wire-rigged puppets to play the flying banshee scenes in the first film, these new creatures required them to be strapped onto Jetovators, water-propelled rigs that could move them in and out of the water.
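That speed-to-frequency conversion is simple arithmetic once the jet pack has given you a measured swim speed. A back-of-the-envelope sketch, with both input numbers invented:

```python
# Back-of-the-envelope speed matching: the jet pack tells us how fast
# the performer actually moved; solve for the tail-kick rate that
# covers the same ground. Both numbers below are invented.
jetpack_speed_mps = 2.5       # measured swim speed, meters/second
distance_per_kick_m = 1.25    # how far one full tail stroke propels

kicks_per_second = jetpack_speed_mps / distance_per_kick_m
frames_per_kick = 24 / kicks_per_second          # film runs at 24 fps

print(f"{kicks_per_second:.1f} kicks/s, one kick every "
      f"{frames_per_kick:.0f} frames")           # 2.0 kicks/s, 12 frames
```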
Once the underwater sequences were filmed, Wētā took that raw footage and extended it into an endless aquatic world. Because of the sheer volume and complexity of water, animators had to simulate its movement rather than animate it frame by frame, and that included all of water's defining characteristics, such as aeration, splashes, droplets, waves, mist, and more. When Wētā worked on "War for the Planet of the Apes," the animators had to account for everything from how Caesar's hair would look when it was wet to how water droplets would flow against air resistance. "The Way of Water" was far more work: Wētā was tasked with 2,225 water shots, including over 30 waterfall-interaction shots. Luckily, the VFX artists could now use Loki, a simulation program Wētā created for the underwater scenes in "Alita." For "The Way of Water," they added a whole host of specific interactions, like what the water around a boat looks like at high speed; in the simulation's visualization, the red particles are spray and the green ones are mist. Loki also handled how hair and cloth move. In the first "Avatar," the animators could only move a character's hair as one big clump; now they could move it in individual strands. You can see the difference comparing Jake's wet hair in the first movie with the way Lo'ak's braid interacts with the water.
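Per-strand hair motion like that is typically solved as chains of points connected by length constraints. The sketch below is a generic "follow-the-leader" strand in that spirit, not Wētā's solver; every value in it is invented.

```python
import numpy as np

# Toy single hair strand: a chain of points, each kept at a fixed
# distance from the one above it, with Verlet integration for motion.
N_POINTS = 8
SEGMENT = 0.1            # rest length between points, meters
DT = 1.0 / 24.0          # one film frame
GRAVITY = np.array([0.0, -9.81])

points = np.array([[0.0, -i * SEGMENT] for i in range(N_POINTS)])
prev = points.copy()     # previous positions, for Verlet integration

def step(root_pos):
    global points, prev
    # Verlet integration: next = pos + (pos - prev) + a * dt^2
    nxt = points + (points - prev) + GRAVITY * DT * DT
    prev = points
    points = nxt
    points[0] = root_pos             # the root is pinned to the scalp
    # Constraint passes: restore each segment to its rest length.
    for _ in range(4):
        for i in range(1, N_POINTS):
            d = points[i] - points[i - 1]
            points[i] = points[i - 1] + d / np.linalg.norm(d) * SEGMENT

# Drag the root sideways; the strand trails behind like wet hair.
for frame in range(24):
    step(np.array([frame * 0.02, 0.0]))
print(points.round(3))
```

A production groom runs thousands of such strands; the old clump-based approach effectively moved them all with one rig.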
And because they nailed the tiniest details, they could now hold high-definition close-ups of water shedding off a character's hand. Once the animation was done on the rough water surface, the effects team would apply the simulations, and at times animators would have to go back in and rework them. Some of the most challenging shots for the VFX artists were the ones where characters come up to the surface. They needed more than just a Jetovator for that, so they worked with a new tool that allowed objects and characters to float on the surface, elevating the film's action scenes.
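Whatever that flotation tool does internally, keeping something afloat comes down to Archimedes' principle: the upward force equals the weight of the water the object displaces. A toy one-dimensional bobbing simulation along those lines, with every object property invented:

```python
# Toy 1D flotation: a box bobbing on a water surface at height y = 0.
# Buoyant force = weight of the displaced water (Archimedes' principle).
RHO_WATER = 1000.0   # kg/m^3
G = 9.81             # m/s^2
DT = 1.0 / 24.0      # one film frame at 24 fps

box_height = 0.5     # m; all object properties are invented
box_area = 1.0       # m^2 footprint
box_mass = 200.0     # kg -> average density 400 kg/m^3, so it floats

y = -0.5             # box bottom starts fully submerged
vy = 0.0
for frame in range(96):                           # four seconds
    submerged = min(max(-y, 0.0), box_height)     # submerged depth, m
    buoyancy = RHO_WATER * G * box_area * submerged
    drag = -500.0 * vy                            # crude water damping
    vy += (buoyancy + drag - box_mass * G) / box_mass * DT
    y += vy * DT

# Equilibrium: 1000 * area * depth = 200 kg -> 0.2 m sits underwater.
print(f"box bottom settles near y = {y:.2f} m")
```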
For "The Way of Water," real humans and CG creatures could coexist seamlessly thanks to football. More specifically, this eyeline system based off the Sky Cam, which is used to film NFL games. But instead of a camera, the crew attached a monitor to a cable that could lift it up into the air and approximate Na'vi height. The monitor following them around displayed rough performance-capture video to give the actors something to play off of. The monitors created more dynamic shots, and so did improvements in virtual cameras. This was one of the biggest innovations of the first "Avatar." With this lightweight device, James could view a rough version of Pandora and move around within it, helping him frame shots as if it were live action. When shooting with the cast, the crew used real-time depth compositing to see the real and digital elements in a shot in real time. This made shot composition less of a guessing game.
It helped Wētā seamlessly place Gollum in shots in "The Lord of the Rings," let Mowgli walk alongside Bagheera in "The Jungle Book," and even helped artists match Alita's digital robotic arms to Rosa Salazar's real ones. For "The Way of Water," the tech took a huge leap forward: real-time depth compositing became precise enough that the crew could see digital and real elements combined down to the individual pixels making up the shot, and the shots where Na'vi and humans interact look far more convincing for it. Just compare the interaction shots in the original with their counterparts in the sequel. Depth compositing is also part of why the water looks so good. For some shots, the actor playing Spider stood alone in a wave pool with the Na'vi characters visible only in a real-time composite, and that per-pixel depth data helped the artists digitally extend the water so seamlessly that it's impossible to tell what was real and what was computer-generated.
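Per-pixel depth compositing itself is a standard graphics operation, whatever bespoke tooling surrounded it: at every pixel, compare the live-action plate's depth with the CG element's depth and keep whichever is closer to the camera. A minimal NumPy sketch with made-up image buffers:

```python
import numpy as np

def depth_composite(plate_rgb, plate_depth, cg_rgb, cg_depth):
    """Per-pixel composite: keep whichever layer, live-action plate
    or CG element, is nearer to the camera at each pixel."""
    cg_wins = cg_depth < plate_depth                 # H x W boolean mask
    return np.where(cg_wins[..., None], cg_rgb, plate_rgb)

# Tiny made-up 2x2 buffers; depth is distance from the camera in meters.
plate_rgb = np.zeros((2, 2, 3)); plate_rgb[..., 2] = 1.0   # blue plate
cg_rgb = np.zeros((2, 2, 3));    cg_rgb[..., 1] = 1.0      # green CG
plate_depth = np.full((2, 2), 2.0)        # live actor 2 m from camera
cg_depth = np.array([[1.0, 3.0],          # CG is nearer only at top-left
                     [3.0, 3.0]])

print(depth_composite(plate_rgb, plate_depth, cg_rgb, cg_depth))
# Only the top-left pixel comes from the CG layer: the CG element
# passes in front of the actor there and behind everywhere else.
```

With depth available at every pixel, a CG character can pass both in front of and behind a live actor within the same frame, something a flat green-screen composite can't resolve.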
After all that, those early waterfall shots look a lot less daunting.