RUSSELL CARPENTER ASC • AVATAR: THE WAY OF WATER

Jim is willing to voyage into the unknown and challenge people to make it happen. I’m glad to have been part of that

Carpenter, a longtime Cameron collaborator who won an Academy Award for his work on Cameron’s Titanic in 1997, first dipped his toe into the world of Pandora in 2017, shooting test footage for Cameron as the director began developing the second movie in the franchise – footage of the human (as opposed to animated) character Spider interacting cleanly with digital characters.

Following that experience, Cameron asked Carpenter to serve as cinematographer on the movie, which largely meant he was tasked with shooting all of the live-action material and developing lighting schemes for the virtual cinematography and CG teams as they painstakingly wove together a photoreal digital universe where live-action and synthetic characters could interact.

Figuring out how to shoot the movie to Cameron’s specifications, and surviving the time and technical obstacles the project posed, proved to be the biggest challenges of Carpenter’s long career. Indeed, by the time it was all over, he says, “I was mentally exhausted. Every day was like climbing a mountain, because every day was different in terms of so many types of environments and scenes.

“I’m glad I did the movie and learned so much about the cutting-edge and complex technology that Jim Cameron and his team created for Avatar: The Way Of Water. I think other people will adapt a lot of the technology we used, if they are crazy enough to go down this path. Jim is willing to voyage into the unknown and to challenge people to make it happen. That’s his strong suit. He’ll envision something and will form a team to figure it out. I’m glad to have been part of that.”

Carpenter recently sat down with Cinematography World to discuss his lengthy adventure diving, literally and figuratively, into Avatar: The Way Of Water, and what it all might mean for the filmmaking community moving forward.

With this project having gestated for so long, what was the status of things when you joined, and what were the big logistical challenges you faced?

I shot tests for Jim before I was officially on the project, around 2017, when they were finally ready to get going with photography. Before that, they were spending years creating the virtual worlds and getting him to sign off on them. I think that process started in 2013. They weren’t ready to start prepping live action until 2018, and we didn’t start filming until 2019.

We started shooting for several months in 2019, and then Covid came and was a game-changer. Toward the end of 2019 we were going to take a break anyway and then come back in a few months. We hit that break time, but in late February or early March of 2020, Covid became the big problem. The production wasn’t sure when it could get back into New Zealand, where most aspects of production took place, because of the quarantine situation.

At the same time, we had a ticking clock in terms of our central human, live-action character Spider, played by actor Jack Champion. He has a central role in the movie, but he was a young teen, and was growing by leaps and bounds. He first came to the production in 2017, and was 13 at the time. When we finally ended filming, he was 18. When I first saw him, he was like a little tadpole. By our final shots, he was a young man. We wanted to shoot him out as fast as we could.

So, Jim and the producers wrote a letter to New Zealand authorities and asked them to let our production be a test case for how you can shoot during the pandemic and keep everybody safe. I think our production had some of the first tough Covid protocols for filmmaking that were developed – a very strict system for working on-set.

They said OK, but only one film crew from outside New Zealand would be allowed in, and if we screwed up, we would be out of there! They also said we had to all come back on one plane together, get tested before we boarded the plane, wear a mask the entire trip, and test before getting off the plane – all before being shipped off to a hotel to quarantine. But we finally got back there and shot until Christmas of 2020.

What was the visual mission statement James Cameron gave you in terms of how to design, colour and light the fictional moon known as Pandora, where the movie takes place, especially given that he had already visualised the environment in the first Avatar film, well over a decade before?

When I came onto the project, in our first conversation, Jim noted that our audience was already familiar with Pandora, so we didn’t want to radically change things up, although we were showing new parts of Pandora and the underwater world, as well. He wanted people to feel like they were returning home.

However, in terms of lighting, since we had a bunch of different locations, he emphasised that this was a culture that was not estranged from the natural world. Therefore, he wanted a consistency to it, an almost spiritual quality to the way things looked – a sense that the inhabitants were connected to the world in which they lived.

And then, the other thing he wanted was to make sure we didn’t use neutral light. In the natural world, light looks one way at the beginning of the day and another way at the end of the day – it starts to break apart and you get cooler light in shadows, and maybe warm light from a setting sun on one side.

Therefore, similarly, he wanted to see light fracture into different spectrums. If we were in the forest on a sunny day, he wanted to show the hard sunlight filtering through a canopy of trees, with a taste of warmth to it, but the light under the canopy would be cooler.

The environments had several important colour aspects to them. So, the look had to be based on realism, and the way light behaves in a natural world, like the one we were depicting.

What tools and techniques were fundamentally different, improved or evolved, in terms of how this film was shot, compared to the era when they made the original movie?

They learned a lot from the first film, but on this one, there was much more photography with human beings integrated into the world of Pandora. The tools were much finer and new techniques were added, specifically in terms of being able to use sensors that could discern how far human beings were from the camera within the virtual world.

We had new software that allowed us to embed a human actor in the environment so that Na’vi – the Pandora-native, CG-generated alien characters – could step in front of that person, interact with that person, and so on. It was a really amazing system, much finer in terms of telling us what our composites were going to look like than previous iterations of this technology.

In terms of the camera, this time we had the Sony Venice system available to us, but with specially-made 3D stereoscopic beam-splitter rigs. The big breakthrough was that the rigs were smaller and lighter than anything they used before. Jim wanted to shoot 4K, but with a 4K Super 35 target area, far larger than what they had on the first film. His mandate was that the camera and rig should be smaller and weigh less. Jim had been talking to Sony over the years, and said he wanted to split the camera apart. He only wanted the lens block and the sensor to be on the rig. All the processing pieces that are usually on the back of the Sony Venice, he wanted those something like 20 feet away.

And so, the camera was split using the Rialto system, and the lightweight rigs were developed. Engineer Patrick Campbell and others did all kinds of things to pare down the weight, methods which have now been patented – all things they didn’t have available when the first film was made.

But the main trick was finding lenses that were light enough and small enough to fit on the smaller stereo rig. Engineer John Brooks worked with Jim for years on this problem. He and I eventually tested scores of lenses and had many zoom lens candidates. But, in the end, we wound up with what I originally thought was kind of a prosumer lens made by Fujinon – the Fujinon MK series. There were two of them, small zooms – one went from 18-55mm and the other from 50-135mm. They were amazingly light, though you obviously have to double them, like everything on a stereo rig. But they weighed about two pounds and, for a zoom lens, they were short, about eight inches long. The production also used a small Premier Cabrio zoom on the crane and dolly rigs.

The miracle was that the Fujinon MK zoom lenses were really fast but remained as sharp as the higher-end Fujinon lenses I’ve seen, which are amazing lenses but cost a bundle compared to the ones we used. They gave us amazing visual quality and, on top of that, they were very consistent. Working with two lenses at a time for a stereo rig, if they get out of sync, that is a train wreck for a 3D production. These were consistent and tracked really well all the way through. That was a big breakthrough for us.

How did the Sony Venice’s enhanced ISO capabilities impact what you could do on this show, particularly in terms of lighting options?

The Venice’s dual ISO situation, where we could work at a more common 500 ISO or an extremely sensitive 2500 ISO, was unbelievably important. The 500 ISO offered a beautiful, creamy image, but I actually liked the 2500 a lot, as it offered a little bit of what you would almost call a grain texture. We showed our tests with it to Jim, and he quickly signed off on it, and then, WETA Digital, which might have had an easier time working only with 500 ISO images, also signed off on 2500. That was a huge game-changer, since you typically lose a stop as light passes through the camera’s half mirror.

Also, we were shooting at 48fps. Our tests led us to like a 270-degree shutter angle. The huge advantage of that was that we were able to shoot at light levels that are not normally associated with shooting stereo photography. That, in turn, allowed us to radically change our approach to lighting.
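The exposure trade-offs Carpenter describes follow from standard photographic arithmetic. A rough sketch of the numbers (the 48fps frame rate, 270-degree shutter, and 500/2500 dual base ISO come from the interview; the formulas are general photographic relationships, not production data):

```python
import math

def exposure_time(fps, shutter_angle_deg):
    """Per-frame exposure time: (shutter angle / 360) * (1 / fps)."""
    return (shutter_angle_deg / 360.0) / fps

# Conventional cinema exposure: 24fps at a 180-degree shutter -> 1/48 s.
t_conventional = exposure_time(24, 180)

# The production's setting: 48fps at a 270-degree shutter -> 1/64 s.
t_production = exposure_time(48, 270)

# Light lost at the sensor relative to 24fps/180 degrees, in stops.
stops_lost_shutter = math.log2(t_conventional / t_production)

# A stereo beam splitter's half mirror passes roughly half the light
# to each eye's camera: about a one-stop loss, as Carpenter notes.
stops_lost_mirror = 1.0

# Sensitivity gained by the Venice's high base ISO (2500 vs 500).
stops_gained_iso = math.log2(2500 / 500)

print(f"48fps/270deg exposure:  1/{round(1 / t_production)} s")
print(f"Shutter/frame-rate cost: {stops_lost_shutter:.2f} stops")
print(f"ISO 2500 vs 500 gain:    {stops_gained_iso:.2f} stops")
```

The ISO gain (about 2.3 stops) more than covers the half-mirror loss plus the modest cost of the faster frame rate, which is why the high-base-ISO mode opened up lower light levels than stereo photography normally allows.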

My gaffer, Len Levine, and I watched Jim work on the virtual stage with virtual environments, and he had a whole bank of computers and operators there to make changes to light very quickly. He had a lot of fluidity that you don’t typically have on a normal live-action stage.

That influenced us, and Len came up with a system that – instead of using normal movie lights on the ground, and having to move things around during shooting of the scene, which would eat up a lot of time – would be more like lighting from a rock-and-roll grid. This let us use moving lights, all controlled from a lighting board and always capable of displaying huge amounts of colours. And that was really important, because the movie takes place on Pandora, a place where colours are always changing. So, we had that capability, along with a lot of softlight up above.

We used those lights in ways you wouldn’t normally think of using them in a movie. For example, we had huge softlight bounces, either bouncing material on the walls or moving it around in the studio. As key light, we could shoot our overhead lights into those large bounce sources. And then, if we wanted to change the colour temperature, it literally just took seconds. It was a quick and fluid way of working with lighting, a kind of Swiss Army Knife that we could use for lots of different scenes that were drastically different.

What was the key to the stunning underwater cinematography in this movie? There was performance capture done underwater, lots of new technology, and cameras were always in the water, as opposed to shooting dry for wet. What was the secret to that success?

All the underwater performance-capture work was done in a specially-built tank, designed by Jim and some other engineers, on a stage at Manhattan Beach Studios. It was enormous, about 118ft long by 59ft wide, and 29ft deep, with a carved-out section of the floor that went even deeper. It was a jack-of-all-trades type of thing, with huge turbines and powerful pumps. It could be configured for still water, rapids, shallow water, deep dives, and more. And they adapted their motion-capture systems for underwater work, with windows all around the tank so that reference cameras could participate to provide additional data.

It was kind of weird looking at the tank, because it looked like the top of the water was covered with floating white ping-pong balls. They had a large ARRI S60 SkyPanel-based lighting array over the entire tank. But when light hits the water like that, there are refractions, which is not good for a motion-capture system. So, the solution was to spread a layer of balls over the water that provided the diffusion they needed yet allowed those in the water to swim through them with no safety issues. They spent months shooting a variety of scenarios for both Way Of Water and Avatar 3 there.

We had a great underwater team led by a really excellent underwater cinematographer named Peter Zuccarini. The team used specially-housed versions of the Venice in rigs with 3D beam-splitter technology called DeepX 3D, invented by engineer/cinematographer Pawel Achtel ACS, outfitted with unhoused, submersible Nikonos underwater lenses. It was an incredibly complex effort.
