CineAlta Issue 6


Behind the Scenes

INSIDE the making of Tomorrowland

EX MACHINA: Cinematographer Rob Hardy, BSC discusses his choice of cameras and his use of anamorphic lenses

MON ROI: Cinematographer Claire Mathon, AFC

Issue 6


Letter from the Editors

As the song goes, we don’t stop thinking about tomorrow. That’s why we continue to upgrade our cameras, to protect your investment today and into tomorrow. As we write this, tomorrow is also a major theme in theaters with two features that showcase distinct visions of the future. Both productions tested a huge range of film and digital cameras. And both selected the F65 and F55 cameras. Inspired by the mid-century optimism of the original Disneyland attraction, Tomorrowland is the latest from two-time Academy Award-winning director Brad Bird and Academy Award-winning cinematographer Claudio Miranda, ASC. This is a major production from Disney, shot on multiple F65 and F55 cameras and released in theaters in 4K. In a world first, Tomorrowland is showing in very select theaters in 4K High Dynamic Range. The feature is an unprecedented visual experience, with stunning texture and detail. We follow Tomorrowland through the production pipeline, starting with the extensive camera tests and interviewing Claudio Miranda, ASC, co-producer/VFX producer Tom Peitzman, 1st AC Dan Ming, DIT Alex Carr, senior colorist Stephen Nakamura and VFX supervisor Craig Hammack. Ex Machina offers an altogether different depiction of tomorrow: an artificial intelligence breakthrough takes the form of an extremely beautiful, extremely frightening robot. Cinematographer Rob Hardy, BSC discusses his choice of cameras, his use of anamorphic lenses and more. If your concept of the “big screen” is 40 feet wide, it may be time to think bigger. Michael Conner of the Arizona Cardinals and Eric Long of the Philadelphia Eagles discuss the challenges of producing content for screens over four times as wide, with resolution up to 5856 x 816. We also visit the NHL’s Anaheim Ducks, to see how producer Peter Uvalle used the F55, FS700 and A7s to help commemorate the retirement of Teemu Selanne’s number 8. Also in this issue: LINK Technologies’ director Brandon Jameson describes how F65 RAW Lite turns out to be the perfect medium for capturing public service announcements. Cinematographer Claire Mathon, AFC discusses her Cannes Film Festival entry, the F55 feature Mon Roi. And nature cinematographer Bob Poole reminisces about shooting in tropical downpours, 114° Fahrenheit heat and 100 mph sandstorms. You probably have your own amazing stories. We’d love to share them with our readers. Please keep the stories coming to production@am.sony.com.

Thanks,
Alec Shapiro and Peter Crithary

Alec Shapiro
President, Professional Solutions Americas, Sony Electronics Inc.

Peter Crithary
Marketing Manager (Twitter: @CineAltaNews), Professional Solutions Americas, Sony Electronics Inc.


Contents

1 Brad Bird: From the Medium of Animation to Live Action Feature Film

7 Inventing the Future for Disney’s Tomorrowland: Inside the making of Tomorrowland

59 F65 Anamorphic A.I.: Cinematographer Rob Hardy, BSC discusses his latest film, Alex Garland’s Ex Machina

79 Dheepan, Cannes Palme d’Or: Cinematographer Éponine Momenceau on Jacques Audiard’s Dheepan

85 Colour, Softness and Freedom: Cinematographer Claire Mathon, AFC discusses her Cannes film entry, Mon Roi

91 Earth, Wind and Camera: Nature cinematographer Bob Poole takes us on a camera journey

109 Game Changers: A profile of two game production executives, Michael Conner and Eric Long

137 NHL’s Anaheim Ducks: Retiring Teemu Selanne’s number 8

149 Saluting the Military: LINK Technologies’ director Brandon Jameson


Brad Bird: From the Medium of Animation to Live Action Feature Film

Interview by Peter Crithary
Photo credit: Kimberly French
Photos: ©Disney Enterprises, Inc.

Q: You came from the medium of animation, and then you crossed over into live action. Tell me about your background.

BRAD BIRD: I think from the very beginning, I was trying to do movies. I started drawing at the age of 3, and the very first drawings I did were sequential. I didn’t figure this out until later. They were just stick figure drawings, but they were meant to be viewed in a certain order. When I started doing animation at the age of 11, I had to figure out which shots were in close-up and which shots were medium shots, and which were pans, because I had to draw them and then shoot one frame at a time while I moved the background or changed the drawings out. I recognized that certain directors were making me laugh more consistently than others in animation, and I started noticing that certain filmmakers in live action were consistently getting my emotions involved in a way that other filmmakers were not. My parents talked about how Hitchcock could get a chill to go up your spine right when he wanted to, and that became fascinating to me. How do you do that? Just like you might have a favorite ballplayer or a favorite singer, I was noticing that certain filmmakers were consistently engaging. So I wanted to do a live action film almost as long as I have done animation. Animation was just kind of a gateway drug to the world of film, and I love both the medium of animation and the medium of live action film. To me, it’s all film. I got my first breaks in animation, but I always intended to get into live action, and it took me a little longer to cross over than I originally imagined.

Q: What was your first big break into animation?

BIRD: I had pitched a short animated project to Steven Spielberg with the idea of bringing theatrical shorts back to movie theaters. Steven didn’t elect to do it, but he remembered it. Years later, I got involved with writing an episode of Amazing Stories for Matthew Robbins, and I was meeting with Spielberg. I had done another Family Dog storyboard. He had seen my original storyboard, just a series of little sketches. I brought them to the meeting when we were going to discuss other scripts, and I just showed it to him. He asked if I could do a half hour of it, and I said “Sure.” So we made the only episode of Amazing Stories that was animated, and it was also the only negative pickup, meaning they just gave us the money and a deadline. It was the only episode produced outside of the machinery, so to speak, of the rest of the show. I had gotten a couple of screenwriting credits, but that was my first opportunity to write and direct, and then it just kind of went from there. I got involved with The Simpsons through Family Dog, and then my first chance to do a feature was The Iron Giant.


“I wanted to do a live action film almost as long as I have done animation.”



Q: Crossing over into live action from the medium of animation, how would you characterize that jump? You mentioned that there are certain parallels between the two.

BIRD: People make too much of the difference. You’re still using the language of film. The physical production of it is different, and there’s an exactness to animation that live action probably wishes it had, as well as a spontaneity. There are strengths and weaknesses in each medium, but the language is still the language of shots, color, music and characters. A lot of people mislabel animation as a genre, and it isn’t. In fact, I think a lot of the problems that animation has come from creative people in studios treating it like a genre, and not like a medium.

Q: Some say that character development and story need to be stronger in animation than they do in live action in order to draw more people into the experience. Would that be a fair statement?

“If you don’t have resiliency, live action will just chew you up and spit you out without any feeling of pity.”


BIRD: One of the primary reasons that I wanted to work with Pixar is that I felt the Pixar films developed their stories very well. The Incredibles started out as a hand-drawn project at Warner Bros., but they didn’t own it. I owned it. When I took it to Pixar, some of my fellow workers were mad at me for giving up on hand-drawn animation to do this sexy new thing called computer generated animation. But I went to Pixar because they protect and nurture stories, not because of computer generated animation. It’s a good environment to grow a story in, because the people who are giving you notes are filmmakers, and not management. Then Ratatouille came in out of the blue. I was about to start working on a project called 1906, which is a very tricky project that I’m still trying to figure out. I spent a couple of years trying to crack certain story problems with it, and I suddenly looked up and decided I had to make a movie. I can’t just be tinkering in a corner—I have to do something. Mission: Impossible – Ghost Protocol came flying in and hit me in the head. I had met Tom Cruise after The Incredibles, and we had this great talk about what we liked in film. His knowledge of film is extensive, going back to the silent era. That’s the kind of film love that wins me over right away. In terms of difficulty, Mission: Impossible was like being thrown into the deep end of a shark tank. It was a big complicated project, but it was also a chance to use amazing tools with incredibly gifted collaborators and work with fantastic actors and giant stunts. It was, in its own way, completely exhilarating.

Q: Was adapting to that from the medium of animation a huge challenge? Did you fall into the groove fairly quickly?

BIRD: Well, it was a huge challenge physically. The days are long, and you have to keep your focus, and certain things don’t work out right. And when they don’t work out, you can’t just complain. You have to be part of getting a solution now, because the money is flowing out quickly whether you solve a problem or not. There’s weather. You’re dealing with a lot of creative people who are creative in different ways, and you’re trying to get them to all harmonize. If you don’t have resiliency, live action will just chew you up and spit you out without any feeling of pity. The pressure is just amped up quite a bit more than in animation, which has a more steady pace. For Tomorrowland, I was better prepared for the physical demands. I actually trained with a trainer. I got sick on Mission, and because of the schedule, I never had a chance to get well.

Q: Would you say there’s more flexibility to change things in live action?

BIRD: You’re changing it as you’re doing it. You’re pushing, you’re iterating toward a thing that you want. In live action, you can do seven different takes, and pick and choose. You can really change how a scene feels just by the blend of takes you choose. It’s almost like the takes are ingredients that you’re stirring into an elaborate meal, and by changing the length or the order of takes, you can completely play with the seasoning of the scene. A bouillabaisse is still going to be a bouillabaisse, but how it tastes can vary widely. In animation, you are nudging towards one taste. You’re not exploring 15 tastes and then figuring it out later. You are nudging people towards that, and that’s all there is. There’s nothing else to cut to. So, while you have flexibility and control, you have very little wiggle room once you lock in on something. In other words, if you have a lot of footage on the floor in an animated film, you’re a bad director. Every second costs a lot, and you have to know what you want. It is possible to not know what you want in live action and get away with it. In live action, if you’ve directed a scene with no pace at all, a good bunch of editors can cut around it and make it look like you had pace in the scene. Whereas in animation, it becomes fairly clear when you don’t know what you want. It shows up on the screen.


Q: Do you have to focus more on character with an animated project than a live action one because the characters are less real?

BIRD: I would say focus differently. With a live action, you can talk a character’s feelings over a bit with an actor, and then kind of let the actor go. In animation, sometimes you can do that, but other times if something’s not there or something is unconvincing physically, you might have to go in and get to the bottom of why it’s physically unconvincing. This gets very tedious very quickly. One of the challenges of The Incredibles was that we wanted audiences to actually worry about the characters. We didn’t just want them to be funny, which is easy to do in animation. What is hard to do is convey convincing jeopardy. Animation people are used to the coyote falling off the cliff, dropping 1000 feet down and splatting, and then dusting himself off and going into the next scene. So, how do you create jeopardy in a medium that basically tells everybody that it’s artificial and nobody’s going to get hurt? So I try to declare early that this is not one of those animated films. In The Iron Giant, Hogarth is running from the Giant, and he runs into a tree branch, and it knocks him on his ass, and he’s got a bloody nose. I’m saying that in this universe, it’s not Wile E. Coyote. If you run into a branch, you’ll start to bleed. A bone can break in this universe, and somebody could die, and that was very deliberate. On The Incredibles, I was cracking the whip often and loudly about making it physically convincing. If somebody’s lifting up something really heavy, you’ve got to have them move like this thing is really heavy. That’s something that people get right probably 5% or 10% of the time. I would point my finger at a lot of live action superhero movies, too. And I’d put The Incredibles up against any live action film out there. They may look cartoony, but they move in a way that says that things are physical. We busted our asses to make sure that it felt like it was tangible, and I would argue that that’s why a lot of people found our suspense sequences to be white-knuckling—because they feel convincing.

Q: How did Tomorrowland start? What was the first spark for that project?

BIRD: Damon Lindelof, who is the producer and co-wrote the script for Tomorrowland, did some uncredited work at the very end of Mission: Impossible. There were a few little holes in the writing that were bothering me, so Damon came in and did some incredibly surgically precise work, and really impressed me. We were having lunch one day and he mentioned this idea, and the hook was put in me right away. Damon and Jeff Jensen had been writing for about six months, coming up with the back story and the ideas behind Tomorrowland. I talked it over with Damon and Jeff, and then Damon wrote the first draft of the script. From there, I wrote every draft with Damon. So I didn’t create the initial story, but I was part of how the story developed.

Q: Eventually you went into preproduction with Claudio, and you did very extensive camera tests, from what I was told.

BIRD: Very extensive camera tests. I think they’ll be using them at Disney for years. We did our homework, and it’s because I’m a lover of film. One of the things I like about Claudio is that he is an agnostic. He loves film too, but he wasn’t going to not do digital just because it was digital. He was going to look at each format and say, “Here’s where it’s good, here’s where it’s wanting.” Given the story that we were setting out to tell, my initial thought was that everything in Tomorrowland was going to be in IMAX. But those cameras are very loud and cumbersome, and it was also becoming clear to me that there was going to be no way to exhibit IMAX on anything that wasn’t digital.


“...we tested multiple digital cameras as well, and we just came out feeling that the Sony F65 was the best choice...”

So we wanted to see how all these different formats look through a digital projector, because that’s how it’s going to be projected no matter whether I shoot on film or not. We tested not only film, 5-perf, 15-perf, 8-perf and everything, but we tested multiple digital cameras as well, and we just came out feeling that the Sony F65 was the best choice given all of our considerations. The F65 looked as good as anything else, but it had other advantages. We have some very quiet, emotional scenes towards the end of the film that take place in Tomorrowland, and I couldn’t imagine having an IMAX camera, which is like a lawnmower, next to the actor’s head while she’s trying to be sensitive.

Q: What were you looking for in the image as a director? What is it that makes film so good visually for you?

BIRD: Essentially, I want it to be really sharp. That’s the reason I went after 70mm—the ability, if it’s projected properly, to see incredible detail. I want a bulletproof image with a nice spectrum of light to dark, and that was one of the things that I liked better on film. But in our tests, digital started to get over into that realm and that sort of range of light. With the Dolby system and laser projection, I think we’re really in good shape now. I think that any limitations are disappearing rapidly. I love the image quality in Tomorrowland when you see it projected in 4K, particularly in the laser. You have really dark blacks, but there’s detail in the blacks. They’re not just dead blacks. And the brightest stuff, you have to squint when you look at it. We have shots in the film where the sun is behind George Clooney, shot deliberately to make it a little bit difficult to see his face. When you see that scene laser-projected, you’re squinting. It’s like you’re looking up at the sun. We want movies to be experiential, and to me that’s a big part of the argument for 70mm. It was something I always dreamed about shooting when I was a kid, because I was so impressed by things like 2001. That just blew my mind. But the stability of 70mm and the tightness of the grain and all that is now pretty nicely achievable. It looks pretty close with these digital cameras. As I said, I’m happy with the image quality in our film, and if you see it nicely projected, it’s just really solid.



Inventing the Future for Disney’s Tomorrowland

Written by David Heuring
Interviews by Jon Fauer, ASC
Story produced by Peter Crithary
Photo credit: Kimberly French
Photos: ©Disney Enterprises, Inc.

Tomorrowland is a sci-fi adventure feature film that melds Walt Disney’s optimistic view of an innovative, utopian future with the absolute latest in high fidelity filmmaking technology. Two-time Oscar®-winning director Brad Bird (Ratatouille, The Incredibles, Mission: Impossible – Ghost Protocol), cinematographer Claudio Miranda, ASC and co-producer/VFX producer Tom Peitzman headed the list of top talent assembled for the undertaking. The cast includes George Clooney, Hugh Laurie, Judy Greer and Tim McGraw. The main character, Casey Newton, is a bright and curious 17-year-old girl who aspires to become an astronaut. Denied those opportunities, she teams with Frank Walker (Clooney), a disillusioned inventor, and travels to an enigmatic place in time and space known as Tomorrowland, where the duo is able to use their skills and talents to make the world a better place, now and in the future. After an extremely thorough camera test that compared seven different formats and camera systems, the filmmakers chose to shoot Tomorrowland with Sony F65 and F55 cameras, with a 4K visual effects pipeline for key scenes. Tomorrowland is the first live action cinematic release in 4K High Dynamic Range, and some lucky theatergoers will see the film with Dolby Vision High Dynamic Range projection. What follows are accounts of the production from Miranda, Peitzman, first camera assistant Dan Ming, digital imaging technician Alex Carr, senior colorist Stephen Nakamura, and visual effects supervisor Craig Hammack from Industrial Light & Magic.


Claudio Miranda, ASC
Cinematographer

Claudio Miranda, ASC earned an Oscar® for his groundbreaking 3D cinematography on Ang Lee’s Life of Pi. His other credits include TRON: Legacy, The Curious Case of Benjamin Button, Failure to Launch and Oblivion. In the following, he recalls the thought processes that led to the decisions he made with director Brad Bird on Tomorrowland.

For Tomorrowland, Brad Bird wanted 65mm film quality up on the screen. His thing was to have a 65mm, big negative kind of feeling. “I want to see a lot,” he said. “I want to see the Tomorrowland locations. I want to see all the details.” He really didn’t want anything getting in the way of the image, no softness. Brad hated the idea of digital in the beginning. Strangely enough, he didn’t hire a film guy. Even though I love film, he knew that I had shot digital on The Curious Case of Benjamin Button, Life of Pi and Oblivion.


When I first came on the job, there were five kinds of film formats he wanted to shoot. It was a whole mixed bag of stuff, and I was concerned that managing all of that, along with the second unit, we’d be tripping over ourselves. That’s when we decided to shoot a test with every camera, putting them all on the table. When I shot the tests, we had three 65mm cameras, a regular film camera, VistaVision, regular 35mm, RED, F65, F55, ALEXA, and even a GoPro. This was my chance to shoot everything all at one time. Shooting separately biases people’s decisions. If I shot these tests one camera at a time and a cloud came over one of them, or it was more backlit, it would change perceptions. By the time you got to the seventh camera, it might be totally different. I didn’t want any of those things to weigh on anyone’s decision. I see a lot of tests. It’s not accurate if one shot is in overcast light and the other was done at sunset. Of course it looks better. It’s better light. I wanted it shot exactly at the same time, exactly with the same lens, pointing at exactly the same thing.

“...we decided to shoot a test with every camera, putting them all on the table. When I shot the tests, we had three 65mm cameras, a regular film camera, VistaVision, regular 35mm, RED, F65, F55, ALEXA, and even a GoPro.”


Brad wanted to see how each camera handled movement, so we put a truss on the front of the Shotmaker camera car. That way, we could drive around in daytime and nighttime to judge moving shots. We shot static as well, day and night. Brad was concerned about how the digital cameras would handle motion, so I wanted to alleviate some of that concern.



“Nighttime is where you saw that things just held up sharper and allowed for lower light levels.”

We brought the test over to Stephen Nakamura for a grading test where Brad could really look at the footage hands-on. We projected them all in 4K on a 40-by-60-foot screen, and we picked the camera that we thought was best for this project. It was like a blindfolded taste test. People get hung up on labels and brands. We kept our eyes open and just looked at the images without labels or identification, and we decided what we were after for this movie. IMAX 15-perf was the sharpest tool in the shed in daylight, but I knew a lot of the scenes that we had to shoot would require low light. For instance, the shuttle is lit to a very low level. We have a turtle migration, and we had to protect the turtles by using very low light levels. I didn’t want to light it up from the ground because it would look heroic, and there’s no Condor to make moonlight from up top, so that was out of the question. There was no way to use film for that scene. There were other instances. I like shooting wide open a lot. I often live at 800 wide open, and that makes it hard for focus pullers. But for me, that depth of field takes a little bit of the “digital-ness” away from what we shoot. We saw it on the screen and we picked the F65 because it suited all our needs. In our testing, we found that the F65 was the biggest Swiss Army knife — it could do bright daylight and it could do dark scenes. For daytime, we were hard-pressed to see any difference. Nighttime is where you saw that things just held up sharper and allowed for lower light levels. On some of the test scenes, I was five stops underexposed.
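For a sense of scale, stop arithmetic is simple powers of two: each stop halves the light. A quick sketch in Python (this is just the standard definition of a stop, not production data):

    # Each stop halves the exposure; five stops under means the sensor
    # received 1/2**5 of nominal exposure.
    stops_under = 5
    light_fraction = 1 / 2 ** stops_under
    print(light_fraction)  # 0.03125, i.e. about 3% of nominal exposure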


I love the format. Some people are proposing a 2.0:1 format. That would be my perfect format. If I had a format to land on, for composing the right ceiling height, for making a great two-shot, for seeing beautiful vistas, I would go to 2.0:1 as a common format.

I wanted to let Brad make the decision. I liked the F65. It was capturing everything I saw. It’s sharp. I think Brad had to kind of back into it because he was really romantic about wanting to shoot film. So he was nicely open to the idea of shooting digital. When it came to aspect ratio, Brad was really interested in the 65mm format. The Tomorrowland world, by its very nature, has a lot of height in it. There are a lot of skyscrapers and a lot of vistas.

Brad initially wanted the 4:3 IMAX format. He wanted the real world to be 2.40:1, and the Tomorrowland scenes to be 4:3. I had the idea of using 1.3x squeeze Hawk anamorphics. For the 2.40:1 scenes, I would use them normally, since the F65’s sensor with 1.3x anamorphics unsqueezes to 2.40. For the IMAX 4:3 scenes, I thought it would be cool to take 1.3x anamorphics and then rotate them 90 degrees in the mount to expand the F65’s 1.9 into 4:3. That would be fun — to have Tomorrowland scenes with vertical flares and the real world with horizontal flares. It was a fun experiment. In the end we went with ARRI/Zeiss Master Primes and Fujinon Premier Zooms. We had Chapman Hydrascope 73-foot and 32-foot cranes, similar to the package we had on Oblivion.
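The aspect-ratio arithmetic behind that 1.3x idea is easy to verify. The sketch below assumes the F65’s active area is roughly 17:9 (about 1.9:1) and that the final 2.40:1 and 4:3 frames come from slight crops; those crop assumptions are ours, not the production’s published numbers:

    # Aspect-ratio arithmetic for a 1.3x anamorphic squeeze on a ~1.9:1
    # (17:9) sensor. The slight crops to the delivery ratios are assumed.
    sensor_ar = 17 / 9           # about 1.89:1
    squeeze = 1.3

    wide = sensor_ar * squeeze   # lens mounted normally: ~2.46, crops to 2.40:1
    tall = sensor_ar / squeeze   # lens rotated 90 degrees: ~1.45, crops toward 4:3
    print(round(wide, 2), round(tall, 2))   # 2.46 1.45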

“The Tomorrowland world, by its very nature, has a lot of height in it. There are a lot of skyscrapers and a lot of vistas.”


When it comes to lighting, I try to make it feel as though people are lit naturally in their environments. I don’t do crushed blacks or crushed anything. On this film, I tried to make it as if a real camera were really in that space. What would it look like? And how real would it be? I have never really forced an Instamatic kind of look on anything. It all just feels natural. If it’s warm sunlight, it’s warm sunlight. I don’t make it cobalt green or something. There’s a lot of LED in the set. I stay natural as much as I can, using standard silks and grid cloth and similar tools. On exteriors, if it was an overcast scene, I’d maintain this look by putting up 60-by-60 frames of full grid cloth overhead. If the sun came out, I would be prepared for that. Interactive lighting solves a lot of problems, makes things look realistic, and even helps the VFX, I think. I work pretty much hand in hand with them, making sure that everything I shoot for VFX is usable and enhances the interaction. Of course they could paint all the light in, but it doesn’t feel as real. So for example, in the house assault where they’re firing guns, there is a little light emitter on the end of the gun that blasts forward and out. This gives it more of a punch.

I can get a similar look to the F65 with ALEXA. I could probably get a similar look on RED. I just think it’s the eye of the artist, and in the end when we grade the film — how it feels, what its bite is. There is the resolution, and also the mechanical shutter, which I like on the F65. Also, the adjustable ND is available on the fly by just scrolling the wheel on the F65, so I can adjust the depth of field and make an artistic choice of depth of field very fast. I actually don’t even know exactly what I rate it at. I just look at the scopes. I have a great BVM-E250 monitor, and between that and the scope, I don’t even use a meter. I would say it’s generally between 800 and 640. I rate the F55 at 800.

“When it comes to lighting, I try to make it feel as though people are lit naturally in their environments. I don’t do crushed blacks or crushed anything.”



“There is the resolution, and also the mechanical shutter, which I like on the F65. Also, I want to adjust the depth of field and the adjustable ND is available on the fly by just scrolling the wheel on the F65. I’ll make an artistic choice of depth of field very fast.”



The camera operators said they were happier with the new viewfinder. People complain about the F65’s form factor — they say it’s tall and weird. But I have an operator named Lucas Bielan who goes back to the film days, when he was handholding with 10:1 zooms. He’s like a tank, and I gave him the F55, which is a nicer handheld camera for many people. It’s lighter. And Lucas said, “Claudio, you know, I kind of prefer the heavier weight. I actually prefer the F65 to the F55. It feels better. I like the weight. It helps me.” We did have the F55, but we didn’t use it that much, for a couple of reasons. One, I think the F65 is way better. It has better blacks — it’s amazing in the blacks. I think the F65 is sharper. Also, we wanted to do some scenes at 22 fps and 20 fps, and at the time, the F55 couldn’t do that. The F55 was very new on the market, kind of like the way we started Oblivion. When we started Oblivion, the F65 camera only did 24, 30, and 60 fps. But we carried the F55 for its small size and for getting into tight spaces.

“The amazing thing when you look at the movie is that there are so many interesting details to see. There are robots working three miles away welding something, and you see all the detail.”

The visual effects were done in 4K, and that was a first for me. The main supervisor was Craig Hammack at ILM, and Tom Peitzman was one of the producers. These guys are good enough that the detail is there. They had to make sure the textures were up to snuff, that they didn’t block out. They took precautions to make sure that they added enough fine detail. Because it’s big, and it’s 4K laser, there’s so much detail. Your eye can just look everywhere. The amazing thing when you look at the movie is that there are so many interesting details to see. There are robots working three miles away welding something, and you see all the detail. ILM put so much detail into the tiniest of pixels. Their work was incredible. I’ve never been in a DI with the digital effects so rock solid at the beginning, and I think a little bit of it had to do with the fact that they weren’t pushed as hard on deadlines. I felt like I didn’t need as many mattes. I just kept on complimenting them. I just couldn’t get over how good these guys were.


My color corrections on set are pretty simple. I just go for a standard look. A little bit of contrast, but nothing much, really. I review dailies with DIT Alex Carr, but we didn’t push it that far in dailies. I know I’m going to do something at the end. Everyone, including the studio, was happy with dailies. But I always make a caveat: this is not our end look. It could be close, and usually it is close. And it’s been that way for all my movies — every single one. When we go into the DI, I just remove all the grades and start from scratch. I like a fresh start. I’ll have Stephen Nakamura watch the movie, because everything has to live in a different world. On set, it’s a video monitor world, which is different from the projection release world. In the DI, I see the whole movie as a whole, and some of those decisions I made a long time ago may have changed. Scenes move. You might have a different feeling because the scene before lifts the color of the scene after. If you’re going from a warm world, you might like the other world colder. Or there’s a scene in the movie that was cool but is all of a sudden backed against a warm one in her memory, and I may change my mind and make the scene a bit warmer. So the edits change a thousand times until the very end. It’s a whole different animal now, morphed and better. Tomorrowland is being distributed in SDR, HDR, 2K, 4K and IMAX releases. I knew there would be a separate pass for HDR and a separate pass for IMAX.

“You can have the dynamic range that you couldn’t before. And the biggest thing is that you truly have black blacks.”


The IMAX release is a different aspect ratio than the regular film release. Brad really wanted to honor the old 65mm format, and that’s why the regular and the HDR release aspect ratio is 2.20:1. It’s a 1.85 container, and it gets slightly letterboxed on top and bottom, with special instructions for theaters. For IMAX, the window has been opened up to 1.9:1. We did repositioning in a few spots. We did a different grade with David Keighley, who is VP of IMAX and in charge of re-mastering. IMAX is not so much about the frame, compared to a normal theatrical release. IMAX is about sitting in your seat and barely being able to see the sides of the screen. IMAX is more of an immersion, which I think is completely different from sitting a little bit farther away, in a regular theater, watching a smaller screen where you really are conscious of the frame. You wouldn’t want the same headroom in IMAX as you would in 1.85. So, for the IMAX version, I gave it more headroom. For the composition of the final conform, I sat very close to the screen for the experience of having it fill my field of view. And then in the grading theater, I just sat back a little bit to make judgment calls about color and contrast.

The premiere for Tomorrowland at Disney was in HDR. With HDR, the concept is having much more dynamic range in the theater. So where a normal screening is 14 foot-lamberts, with HDR you’re getting screen brightness that is much higher. In a normal DCP container, we’re kind of dumbing down the contrast and bringing our range into a space that lives inside 14 foot-lamberts. That’s what I grade to. But HDR is interesting because it opens up the contrast. It’s almost like adding more bass to the subwoofers in the speakers of an audio system. For example, we can make explosions that don’t burn out. You can have an explosion with a lot of light. You can use it for an effect, and it really opens up more in the DI, which can add to the story. You can go from inside to a blinding light outside if you want to make the audience feel that extreme contrast. You can have the dynamic range that you couldn’t before. And the biggest thing is that you truly have black blacks.
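The container math Miranda describes is straightforward to illustrate. Assuming the standard DCI 4K “flat” (1.85:1) raster of 3996 x 2160, a 2.20:1 image letterboxed inside it works out as follows; the exact conform numbers used on the show are our assumption, not confirmed:

    # Letterboxing a 2.20:1 image inside a 1.85:1 DCI 4K "flat" container.
    # Container dimensions are the standard DCI flat raster; the precise
    # conform used on the film is assumed for illustration.
    container_w, container_h = 3996, 2160
    image_ar = 2.20

    image_h = round(container_w / image_ar)   # ~1816 active picture lines
    bar = (container_h - image_h) // 2        # ~172-line black bar, top and bottom
    print(image_h, bar)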



The F65 camera is HDR. The camera always puts out more information than we will ever use. We add more contrast. We stretch our confines into the cellblock that we’re in — so to speak — of theatrical projection standards. There is a space of color and luminance that we have to work in. When you run black on a standard cinema projector, you can put your hand in front of the projector and do bunny puppets. You can still see your hand. But in HDR — nothing, zero, zip. You cannot even see anything with your hand in front of the screen. Black is finally black. It almost makes me feel like this is truly what 3D should be. I think it’s truly three-dimensional — I almost feel like it’s even more three-dimensional than 3D sometimes. Obviously I haven’t seen HDR 3D yet, so maybe there’s something to convince me. But what it also promises to do in the future is to make 3D HDR extremely bright, and brightness is the thing that people normally complain about, beyond the glasses and artifacts. HDR is brighter projection, but that’s not the whole story. If we just had brighter projection, black would just be grayer, and everything would just be brighter. It’s not about that. It’s like OLED, that inky black. You could choose to go milky black — you could always make those creative choices. Imagine a starship where the night sky is totally black and you see it in space. You would almost feel like you wouldn’t have any presence of the screen at all.

“The F65 camera is HDR. The camera always puts out more information than we will ever use. We add more contrast. We stretch our confines into the cellblock that we’re in — so to speak — of theatrical projection standards.”

You don’t necessarily have to light differently. You just know that your blacks go totally black if you want them to. You do spend time in a separate grade, and we made shortcuts in the grade because we know where highlights go when we’re grading with DaVinci Resolve. It’s called a soft highlight clip. In the regular screening, we had the soft clip moderate to heavy, but when we went to the HDR, we just abandoned the highlight clip and let all the highlights go into this new space. We don’t do anything different on the camera, but the grading is slightly different.
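A soft highlight clip is a rolloff that eases values toward a ceiling instead of clipping them hard. Resolve’s internal math isn’t published here, so the sketch below is a generic illustration of the idea, not the tool’s actual curve:

    import numpy as np

    def soft_clip_highlights(x, knee=0.8, ceiling=1.0):
        """Generic soft highlight clip: linear below the knee, then an
        asymptotic rolloff toward the ceiling. Illustrative only; not
        DaVinci Resolve's actual implementation."""
        x = np.asarray(x, dtype=np.float64)
        over = np.maximum(x - knee, 0.0)
        span = ceiling - knee
        rolled = knee + span * (1.0 - np.exp(-over / span))
        return np.where(x <= knee, x, rolled)

    # For the HDR pass the clip is effectively abandoned: highlights pass
    # through unmodified instead of being rolled off.
    signal = np.array([0.5, 0.9, 1.2, 2.0])
    print(soft_clip_highlights(signal))   # SDR-style rolloff toward 1.0
    print(signal)                         # HDR: let the highlights through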



“You don’t necessarily have to light differently. You just know that your blacks go totally black if you want them to... We don’t do anything different on the camera, but the grading is slightly different.”



Tom Peitzman

Co-Producer, VFX Producer

Tom Peitzman is a visual effects producer who started out as an assistant director. His previous credits include Mission: Impossible – Ghost Protocol with Tomorrowland director Brad Bird, as well as Planet of the Apes, Watchmen, Alice in Wonderland and Lemony Snicket’s A Series of Unfortunate Events.

I grew up in South Torrance. I studied film at San Diego State University, and I was very pleased with the way that things went. You had to plan your projects. You were apt to be given broken equipment that you had to make work, and so you really became quite adept at compromise and figuring out problems. After I got out, I kind of worked my way up from the bottom. I was a production assistant. I worked my way up from production coordinator to assistant director. I joined the Directors Guild, and I am still a DGA member. I worked for a number of years as an AD, and then on Congo a number of years ago, they were looking for somebody to be the liaison between visual effects, mechanical effects, and animatronic effects. I did that, and from that point on, I started doing visual effects. I was a visual effects producer from that point forward, and over about the last five or six films, I’ve been co-producing as well. To be perfectly honest, I’m ready to transition into just producing. It’s been a great journey along the way, and I’ve learned a lot in terms of how visual effects are made and movies are made, and now I’m ready to put that to the test and produce a movie.

I had done Mission: Impossible with Brad, and he and another of my producing partners, Jeffrey Chernov, called me and asked me to come in and do this project — Tomorrowland. Basically I signed onto the project without even reading the script, and that’s something I don’t normally do. But knowing Brad and the type of storyteller he is, it was a unique project that wasn’t the third in a line of sequels. He had a special affinity for the word Tomorrowland, and given his relationship with Walt Disney Studios over the years, I knew he’d come up with something special. We started out with words on paper, then we started hiring an art department, then a series of storyboard artists, and then I brought in the pre-vis team. So there was a lot of heavy lifting early on. I wear many hats on the project. My primary responsibility has been figuring out how we are going to put Brad Bird’s images up on the screen, all the way from the beginning. I started in June 2012, so I’ve been on the project for three years now. From the time the original script came in, I would sit down with Brad, break it down, figure out what the visual effects were in the movie, and then I would hire the visual effects companies. Industrial Light & Magic is the company that I hired to do the lion’s share of the work. I have worked with them over the last 20 years. In preproduction, we figure out what’s in the movie, and how to do it creatively and technically. Then we sit down and figure out where to do it. That’s where my co-producer hat is put on. I used to be an assistant director. It involves figuring out the scheduling with our other producing team, working with the costume department, and liaising with the mechanical effects department, because there are so many things that they are able to do that really make the visual effects better, because they can be grounded in reality. So we do as much in camera as possible.



For me, it’s been a really great collaborative effort. I try to bring to the table a bit of a fresh perspective on visual effects, by trying to not do everything digitally, even though that’s what I do. I think there’s so much merit in doing things practically. The ability to hinge the visual effects on reality, when possible, often ends up making them much better, and Brad embraces that wholeheartedly. Claudio Miranda, ASC, our DP, was very happy to hear that as well, because it’s very different when you have a set or a location that can be lit. It can then be acted in, and Brad can give actors direction for the things that aren’t there, and the editor has something to cut with instead of just doing it in a blue box. I think working completely synthetically really hampers creativity, and there’s always something a little off in CG work when it’s completely synthetic. But when you have something in the frame that’s real, you have to match to the thing that’s real to make it look realistic, and that’s always been the fun challenge.

While we’re shooting, I’m there from day one until the final frame of digital film has been shot, and then I’m on all the way through postproduction, overseeing all the visual effects and the vendors who are doing the work. I’m making sure that our creative team with Industrial Light & Magic is coming through and realizing the visual effects the way Brad envisioned. I follow it all the way through color timing. I’m there for scoring. I’m there for the final mix, and then I help with the QC-ing of all the DCP packages. I’m in charge of the money when it comes to all of the visual effects — managing that, and the personnel, as well as working with our line producer and our executive producers on the rest of the movie in terms of keeping track of where we are and problem-solving on location. That’s a part of the job that I really enjoy. I feel like I’m a puzzle solver, because we’re never doing the same thing twice, and we’re given circumstances that are very different every time. You go to different locations, whether it’s a set on stage or the Eiffel Tower in Paris. Something will always come up and you have to think on your feet. You have to come up with a solution, and you have to make sure you’re still protecting Brad’s vision up on the screen. It gives me gray hair, but at the same time I absolutely love it. In the visual effects world, you have to be a really good collaborator and communicator, because so many things can happen afterwards, when 90% of the people are gone. I actually was able to figure out a way to get a consulting deal going with Scott Chambliss, our production designer, during postproduction, because I wanted to protect the ideas that Brad Bird was trying to put forth. So many times things change in post, and all of a sudden one location or a scene changes for whatever reason. Creative and aesthetic decisions have to be made. A lot of times the production designer is not consulted, and the vision gets changed. Fortunately, both Scott and Claudio were available. We would bring Claudio up to Skywalker Ranch when we were in postproduction and say, “I’d like you to put your eyes on this.” We’ve spent so much time together — myself, the visual effects supervisor at ILM, and Claudio — talking through how we’re going to do various things. We want to make sure we protect what Claudio was looking to do. Claudio was really pushing technology at all times, and so was Brad Bird. Brad is the king of pushing technology.

“I’m making sure that our creative team with Industrial Light & Magic is coming through and realizing the visual effects the way Brad envisioned.”


We sat down two and a half years ago — Claudio, Craig Hammack, the VFX supervisor from Industrial Light & Magic, Brad, and I — wondering what format we were going to shoot this movie on. Brad was exploring a lot of options. He was looking at high frame rates. He was certainly looking at 4K. We ended up shooting a camera test that Claudio spearheaded, where we lined up seven cameras side by side. Seeing these seven cameras lined up next to each other with seven slates was unique. We had a system rigged with colored strings that went from each camera to its slate, so you knew which slate was for which camera. We finished all this at 4K, and it was very, very telling. We created a chart, and we defined the strengths and weaknesses of each of the systems. We considered dynamic range, chip size, recording ability, and physical camera size. After many go-rounds looking at the footage, we all decided that shooting this movie on the Sony F65 was the way to go. The F65 gave us as many of the things that we wanted in one system — for example, very accurate flesh tones. With some other systems, the flesh tones can be a bit of a challenge. It allowed us the high pixel resolution that we were looking for, and it gave Claudio all the flexibility that he wanted in shooting both day and night scenes with existing light. Here’s an example of the low-light capability: We were in Florida, shooting plates of the exterior of a house. It was a bright moonlit night, but we saw lots of stars, and Claudio said, “Let’s see if we can shoot a plate of the stars.” And we actually were able to photograph stars, which I have never been able to do before. With film you just can’t get them to register. So that was something really cool and unique.



“After many go-rounds looking at the footage, we all decided that shooting this movie on the Sony F65 was the way to go. The F65 gave us as many of the things that we wanted in one system — for example, very accurate flesh tones.”

That’s the great thing with Claudio — he’s not afraid to try new things, and he is always on the leading edge of technology. Sometimes that makes it very challenging. On a financial level, it’s expensive. On a creative level, it’s challenging. On a production level, it’s really difficult to schedule around, because some of it we just don’t know until we try it. But I think the results we were able to get really showcase the camera system, 4K, OpenEXR, and 16-bit imagery. And Brad is super happy with everything that is going up on the screen.



On Oblivion, Claudio used actual photography playback as a projection on screens, which was terrific. On Tomorrowland, we used it as an interactive lighting source. We have a scene where we’re inside a sphere, and imagery is being projected on this sphere. We needed to have our four actors standing on a platform, watching all this imagery. So that imagery would need to be throwing interactive light on their faces. We used the media panels for that. We also show them traveling through the city on a monorail, and we needed it to feel as though they were going in and out of buildings and shadows and a tunnel. So we were able to sequentially create some really interesting-looking effects on the people, which we then hinged on in visual effects. When we saw shadows passing someone’s face, we put a building going by or we put another vehicle going by. If all of a sudden there was a bright light source, we’d have a kick coming off a building or something like that. We had a lot to work with as a result of using these interactive panels.

We did the visual effects at 4K resolution. We were very careful about how we did the 4K. There were many specific shots that would really call out for 4K. We did them all the way through in 4K. The assets were built in 4K, the renders were rendered in 4K, and then we output to 4K. Sometimes we would do shots where we were building some elements in 2K and up-rezzing to 4K, but doing a 4K render. Then there were some shots where we realized it was dark, and it was fast motion. You couldn’t tell the difference anyway, so we would do the work in 2K and up-res to 4K. We did a number of side-by-side comparison tests early on with Industrial Light & Magic. We would see what native 4K would look like, and then we would show a 2K visual effects shot up-rezzed to 4K, side by side, and for the most part, most people couldn’t tell the difference. Sometimes we were sitting ten feet away from a screen just to see if we could see pixels or any kind of anomalies. We did a good portion of the film in 4K, and everything was output and rendered in 4K. We were very judicious about it. If we had rendered all 1,100-plus visual effects shots natively in 4K, it would have been a lot more cumbersome than using the three different approaches with a final 4K output. It was well worth it in the scenes where we really spent the time. The main reason we did it is that the content lent itself to detail. I swear that Brad sees 1/24th of a second at a time. He can see the smallest detail in shots, and he really wants to push the envelope that way. Even though it became challenging, and financially at times it could be a bit cumbersome, we were able to support it and make it happen.

Another consideration is that we’re doing an extended dynamic range grade. There’s so much more brightness and latitude there than before. The brights are brighter, the blacks are blacker, and it also adds more sharpness and contrast. We were able to see so many things in the 4K High Dynamic Range that we couldn’t see in a normal 4K DCP package or DCI package, and that’s what was so cool — all that fine detail that we put into the shots, sometimes you can’t see it. But once we got to this HDR grade, you could see it, and we were all just super jazzed by that. You could really take advantage of 4K in a way that you couldn’t before. When we started, we knew we’d most likely be doing two grades. Midway through preproduction, Dolby rolled around, and we were interested, but not banking on it. Once we got into production, we were talking more seriously with Dolby, and we decided to pursue it. The good news was that in visual effects, not only were we working in 4K, but we were working in 16-bit OpenEXR, a High Dynamic Range image file format developed by Industrial Light & Magic. Being in 16-bit OpenEXR gave us a lot more latitude to begin with. We probably went back into only a dozen visual effects shots to adjust for the HDR grade. We did that in a timely fashion, so it wasn’t a problem. I’m definitely a fan of HDR, and I do think that it’s going to be a game changer, especially if you know going into a movie that you can do these things. It really can be pretty dramatic what you can accomplish. You may have seen some of our trailers, where you go from inside a room to outside in a beautiful wheat field. That jump in dynamic range and contrast is just striking when you see it in HDR.
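As an illustration of the up-res step Peitzman describes, here is a minimal bilinear 2K-to-4K upscale. ILM’s actual filtering is certainly more sophisticated; the method and dimensions here are assumptions for illustration, and in production these buffers would live in 16-bit half-float OpenEXR files rather than in-memory arrays:

    import numpy as np

    def upres_bilinear(img, factor=2):
        """Naive bilinear up-res of an HxWxC float image; illustrative of
        the 2K-to-4K step, not ILM's actual pipeline."""
        h, w, c = img.shape
        ys = np.linspace(0, h - 1, h * factor)
        xs = np.linspace(0, w - 1, w * factor)
        y0, x0 = np.floor(ys).astype(int), np.floor(xs).astype(int)
        y1, x1 = np.minimum(y0 + 1, h - 1), np.minimum(x0 + 1, w - 1)
        fy, fx = (ys - y0)[:, None, None], (xs - x0)[None, :, None]
        top = img[y0][:, x0] * (1 - fx) + img[y0][:, x1] * fx
        bot = img[y1][:, x0] * (1 - fx) + img[y1][:, x1] * fx
        return top * (1 - fy) + bot * fy

    # A 2K scope-ish plate taken up to 4K (dimensions illustrative).
    plate_2k = np.random.rand(858, 2048, 3).astype(np.float32)
    plate_4k = upres_bilinear(plate_2k)
    print(plate_4k.shape)   # (1716, 4096, 3)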


“On Tomorrowland, we used it as an interactive lighting source. We have a scene where we’re inside a sphere, and imagery is being projected on this sphere.”

Tomorrowland is definitely a movie that I love, because it’s about a young woman with hopes and aspirations, and being a dad of three daughters in their 20s, I get that. That’s what really resonated with me. It’s been one of those unique, special projects that I’ve just been thrilled to be a part of.


Dan Ming
1st AC

Dan Ming served as first assistant on the Tomorrowland camera crew. Over a 20-year career, he has worked on more than 65 productions, including Lincoln, American Sniper, and Thor. Here, he looks back on Tomorrowland and explains the tools and procedures he developed to ensure a smooth shoot.

On Tomorrowland, we had five F65s and two F55s out of Keslow Camera. Our lens package consisted of Master Primes and all the Fujinon Premier zooms. We had the OLED EVF, and we used the Preston FIZ, MDR-3 motor drive and the Hand Unit 3 lens control system. On Steadicam and at close quarters handheld, I still like to be next to the camera for focus pulling. It’s easier to see the geometry of where the camera is going and how everyone is moving in the shot. Otherwise, for dolly work, since the operator no longer has the best view of focus, I like to have an eye on a big monitor, but still off to the side where I can see the dolly and actors moving. I use a 15" OLED. I also pulled off the monitor when it got really cold during night exteriors with wide establishing shots. There’s nothing like extreme cold to throw the marks off on lenses! I have been using a Preston Light Ranger 2 follow focus system for the last month. It works really well as a rangefinder tool and has the ability to take over and pull focus, with you directing which area in the frame to keep in focus. On certain shots, it works really well. Preston has made an interface that lets you go from autofocus to manual pretty easily during a shot. It’s a great tool to help keep things in focus in a world where no one really wants to rehearse the shot anymore, and we often just shoot the rehearsal. The LR2 gives you 16 zones of simultaneous distance data, and I got all the marks I needed for the next take via instant replay. Otherwise, I also use a Sniper and a CineTape, or nothing at all, depending on what is best for the shot.

In terms of footage, Alex Carr, our DIT, would grab selects for Claudio to time and send them on to Sixteen19, where dailies were done. We would hang onto the shot media until it was cleared by editorial, so we had to have enough media to last quite a few days. For the camera department, it was really about keeping track of our plates and elements, so we knew what stops and lenses and angles we were at for any given set. We designed and implemented a digital camera report system where the second ACs did their reports on their iPhones, and at the end of the day, the data was imported into the VFX and editorial databases. Alex could take that data and insert it into the footage metadata. It also resulted in a searchable database in everyone’s phone that was updated every day, so information on any shot from any unit could be looked up when needed. We tried to streamline the data entry as best we could, and the seconds were always giving feedback on how to make the system work faster. I don’t think it is possible to have a system that is faster and more convenient than paper camera reports, but we got the system running as fast as we could, and the data generated benefitted everyone. It was also a paperless system, so we saved a lot of trees!
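To make the camera-report idea concrete, here is a minimal sketch of the kind of record such a system might capture and the lookup it enables. The field names and values are illustrative guesses, not the production’s actual FileMaker schema:

    from dataclasses import dataclass, asdict

    @dataclass
    class CameraReportEntry:
        """One slate's worth of camera-report metadata; fields are
        illustrative, not the show's actual schema."""
        scene: str
        take: int
        camera: str        # e.g. "A" (an F65) or "F" (an F55)
        lens_mm: float
        t_stop: float
        filters: str
        notes: str

    reports = [
        CameraReportEntry("42A", 3, "A", 35.0, 2.8, "ND0.9", "VFX plate"),
        CameraReportEntry("42A", 4, "A", 35.0, 2.8, "ND0.9", "circled"),
    ]

    # The searchable-database idea: find every take on a given lens for a
    # given set, the way VFX or editorial might query it.
    matches = [asdict(r) for r in reports
               if r.scene == "42A" and r.lens_mm == 35.0]
    print(matches)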


Dan Ming, 1st AC, and camera operator Lucas Bielan


I had designed a digital camera report on Life of Pi using FileMaker and FileMaker Go for iPhones and iPads. I refined it for Tomorrowland to streamline the process as much as possible, but it still required manual input. On Tomorrowland, VFX and editorial also had their own FileMaker databases, and all the data was ingested by everyone. So by the time it was done, you could look up pretty much any possible piece of information you were remotely curious about for any given shot. I asked Keslow to export their equipment scan to an Excel worksheet, which included bar codes, serial numbers, country of origin — the things required to do a manifest. We added dimensions, weights, et cetera. Then it was imported into FileMaker, and I designed several layouts to process the information. We took pictures of the packed cases as well. So we could type in any barcode and see what case it went in, along with a picture, and when we received it and when it was returned, along with which route it took. The job was in British Columbia, Alberta, Florida, Spain, Los Angeles, and a couple of other places. So we were generating manifests long before we shipped, to get pre-clearance, but we could know the weight of our shipments and the volume we needed well ahead of time, by virtually packing everything. There was also a manifest layout, so it was very straightforward to generate. We had thousands of items weighing about 6,000 pounds altogether. One of my favorite memories of the Tomorrowland job was standing on Launch Pad 39A at NASA, where the Apollo and Space Shuttle missions took off, and watching the MAVEN mission to Mars take off. Anyone who hasn’t seen and felt a rocket take off to space should add it to their bucket list.
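The “virtual packing” idea reduces to simple aggregation over manifest records. A minimal sketch, with made-up cases, barcodes, and numbers standing in for Keslow’s actual export:

    # "Virtual packing": given a manifest of cases, compute shipment
    # weight and volume ahead of time, and look items up by barcode.
    # All field names and figures here are illustrative.
    manifest = [
        {"case": "F65 body #1",      "lbs": 62,  "cuft": 4.1, "barcode": "KC1001"},
        {"case": "Master Prime set", "lbs": 140, "cuft": 7.9, "barcode": "KC1002"},
        {"case": "Premier zoom",     "lbs": 55,  "cuft": 3.2, "barcode": "KC1003"},
    ]

    total_lbs = sum(item["lbs"] for item in manifest)
    total_cuft = sum(item["cuft"] for item in manifest)
    by_barcode = {item["barcode"]: item for item in manifest}  # barcode lookup

    print(total_lbs, total_cuft, by_barcode["KC1002"]["case"])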

Alex Carr

Digital Imaging Technician

Alex Carr, digital imaging technician on Tomorrowland, has experience on more than 20 films in a range of genres. Here he talks about his approach to data management.

I had more time for software development on Tomorrowland. I specialize in remote network control of cameras. Instantly controlling camera settings and grading in camera with metadata has many advantages. For instance, CDL values are included with the RAW files, and the CDL grade can be seen in the viewfinder and via the SDI outputs during shooting. It’s just metadata, so most software can simply turn it off to start the grade over. I did this on Tomorrowland: I had a base-look CDL in the camera that I would adjust per scene, then I would reset the CDL for grading in Colorfront’s Express Dailies. I have since moved to Colorfront On-Set Dailies, which has expanded secondaries and many extra features.
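For readers unfamiliar with the mechanics, the CDL metadata Carr describes is the standard ASC CDL: a slope/offset/power triple per channel plus a saturation value. A minimal sketch of the standard transform (the numbers below are hypothetical, not the show’s actual values):

def apply_asc_cdl(rgb, slope, offset, power, saturation=1.0):
    """Apply ASC CDL (SOP + saturation) to one RGB pixel, values in 0-1."""
    out = []
    for v, s, o, p in zip(rgb, slope, offset, power):
        v = v * s + o                # slope, then offset
        v = max(v, 0.0) ** p         # clamp negatives before power
        out.append(v)
    # Rec. 709 luma weights, as specified by the ASC CDL for saturation
    luma = 0.2126 * out[0] + 0.7152 * out[1] + 0.0722 * out[2]
    return tuple(luma + saturation * (v - luma) for v in out)

# Example: a mild warm-up look dialed in per scene (hypothetical numbers)
print(apply_asc_cdl((0.18, 0.18, 0.18),
                    slope=(1.05, 1.0, 0.95),
                    offset=(0.01, 0.0, -0.01),
                    power=(1.0, 1.0, 1.05)))

Because the transform is pure metadata riding alongside the RAW, resetting it for the final grade costs nothing, which is exactly the advantage Carr points to.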


The F55 and F65 have very similar characteristics. Once we exposed for 800 ISO on the F55, the noise levels seemed to match a little better. The F55 was a great choice for a small, lightweight camera on a primarily F65 movie. I would estimate that our total data was 250+ TB. I would load in one take per setup as soon as possible to begin grading on set. It was priority number one to begin the grade on set so that dailies could start processing as soon as the data was downloaded onto the SAN, while LTO-5 archives were being made. I kept the turnaround on media very tight: no more than 48 hours on a card, with editorial viewing all of the Avid offline files before we could re-use cards. We had about 40 or 50 SR Memory™ cards, and we ordered more when we were on distant location or when the lab was further away.
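As a rough back-of-envelope, that 48-hour hold policy dictates the size of the card pool. A sketch with hypothetical numbers, not the production’s actuals:

cards_shot_per_day = 12   # assumed daily card count
hold_days = 2             # cards held until editorial clears them
spare_days = 1            # cushion for distant locations or a slow lab

# Cards in flight at any moment = daily usage x days of float
minimum_pool = cards_shot_per_day * (hold_days + spare_days)
print(minimum_pool)  # 36 cards, consistent with a 40-50 card pool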


Stephen Nakamura
Senior Colorist, Company 3

Stephen Nakamura helped pioneer digital intermediate color grading, and is considered one of only a few truly A-list DI colorists. In the early 1990s, he worked at The Post Group doing telecine work. In 2002, when Technicolor started the first digital intermediate facility, Nakamura worked there on films like Panic Room with David Fincher and Confessions of a Dangerous Mind with George Clooney. His work can be seen on dozens of films, including The Departed, The Terminal, Prometheus, Michael Clayton and The Hurt Locker. Below are his recollections of the Tomorrowland project, where he

used a 4K pipeline and delivered the first theatrical feature film released in the high dynamic range Dolby Vision format.

In the color correction process, film gets scanned into digital files, or in the case of digital cameras, the camera originals are converted into usable digital files that we can work with. I use DaVinci Resolve, and we have a Barco projector. We typically work in P3 color space. But for Tomorrowland, not only did we do P3, we also did a high dynamic range transfer theatrically. That’s a whole different thing altogether, and it was done with Christie laser projection, in Dolby Vision, in the HDR/PQ color space. Tomorrowland is the very first high dynamic range theatrically released feature film. And it was color corrected at 4K. We did three completely different theatrical versions — one for regular digital cinema, one for Dolby Vision and one for IMAX. The IMAX color was essentially the same as the regular digital cinema, but we built the frame lines for IMAX’s 1.90:1 aspect ratio. First, we graded the whole movie in P3, which is how most of the world is seeing it. We go through everything in DaVinci Resolve until we’ve got the master for traditional theatrical projection. Then we made extensive use of a LUT we designed to convert the P3 files into the PQ format for the HDR pass. We also made some internal adjustments in Resolve, and obviously switched to the Dolby Vision projection system. Dolby was very helpful with that.
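The PQ side of that conversion follows SMPTE ST 2084, which maps absolute luminance to code values. A sketch of the published encoding curve (this is the standard formula, not Company 3’s proprietary LUT):

def pq_encode(nits):
    """Map absolute luminance (cd/m^2, 0-10000) to a PQ signal value in 0-1."""
    m1 = 2610 / 16384        # ST 2084 constants
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

# 48 nits (about 14 fL, standard cinema white, at 1 fL = 3.426 cd/m^2)
# and 108 nits (about 31.5 fL) land at clearly different code values:
print(round(pq_encode(48.0), 3))   # ~0.44
print(round(pq_encode(108.0), 3))  # ~0.52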

“Tomorrowland is the very first high dynamic range theatrically released feature film. And it was color corrected at 4K.”



Once the choices that Brad made for color were converted into the HDR/PQ space, we massaged the images from there to make them look really good in that space. We basically recolored the movie to an extent while maintaining all of the things that Brad really wanted — darkening a wall or making a dress a little bit bluer. The PQ files need to be massaged because something that works at 14 foot-lamberts in P3 likely doesn’t look so good in Dolby Vision, which can deliver 31.5 foot-lamberts. It’s a whole different way of displaying that image. I went through shot by shot, built a lot of Power Windows and made corrections to take advantage of the higher dynamic range without making it look like a whole different movie. The main differences would really be in contrast and some brightness and darkness. It’s not like trying to reinvent the wheel. With HDR, in terms of color fidelity, you can certainly have more vibrant colors, but that’s a personal taste. What HDR allows is much more creative freedom as a director. A David Fincher movie might look almost exactly the same in HDR, because his palette is very desaturated and not very contrasty. Just because he has more dynamic range doesn’t mean he would use it. For a particular movie, though, the colors could really pop.

“Brad and Claudio didn’t really want the crushed-blacks-and-blown-out-whites type of contrast. They wanted it to look very elegant, and very film-like.”

It’s also not the kind of color palette that Brad Bird and Claudio Miranda were trying to achieve. Tomorrowland is a colorful, exciting place everybody would want to go to, but it’s not as extreme or intense as you can get in HDR if you wanted to. It definitely looks brighter and the contrast range from deep shadow to brightly lit areas of the frame is more defined than we’re used to seeing. It’s really impressive. It’s just that we didn’t build the aesthetic of the movie around it. You could get really wild with Dolby Vision if you wanted to grade something that would be seen primarily in Dolby Vision. You’re working at 31.5 foot-lamberts so the screen can get really bright. The color space lets you have some really intense colors that you just can’t get in a normal P3 color space. Another important difference between P3 and the HDR color space is that the HDR really does have much more dynamic range. You’re talking about twice the amount of brightness in the HDR. The pictures are inherently much more robust. Even dark scenes have much more fidelity. The picture feels dark, but it’s so much brighter. So you don’t have to strain to see into shadows, or to see an actor’s performance


in a night scene. You can really see it much more, because you’re pumping more light through it. But there are very few Dolby Vision theaters right now. We colored the movie the way most people are going to see it, and then did another pass on top of that, which was designed to take advantage of what Dolby Vision can do. But it wouldn’t make sense to build a look that could only be seen in Dolby Vision and then try to approximate it in P3 color space, the way 99% of the audience will see it. Brad and Claudio didn’t really want the crushed-blacks-and-blown-out-whites type of contrast. They wanted it to look very elegant, and very film-like. So

until we hit Tomorrowland, we really didn’t use the full potential of the HDR system. It’s not that type of movie. We still wanted the movie to look filmic in the HDR space. For scenes in Tomorrowland, we definitely brightened it up, and it’s much more vibrant. We could use more of the HDR’s capability. But the main thing is that it’s a creative choice, and each movie can be crafted differently. That’s the real beauty of it. It’s not a destructive format. We can grade it so that it looks very, very similar to your theatrical grade on the monitor, but we are certainly not limited by what P3 or REC 709 has given us in the past. We have more creative freedom to have blacks that are super black and whites that are super blown-out. It just gives you a greater palette to work with.


“This movie was color corrected in 4K, which is not very common. Most films are color corrected in 2K. It’s really nice working in 4K. You certainly see more detail.“

This movie was color corrected in 4K, which is not very common. Most films are color corrected in 2K. It’s really nice working in 4K. You certainly see more detail. There’s more resolution there. You have to be a little bit more careful with the chroma keys and luminance keys and the separation tools that we use in the DI process. Because you’re actually seeing more, you’re also seeing more artifacts. You have to be much tighter with everything, and that certainly extends to the shoot and the visual effects. They also have to be tighter and more intricate with their work, because there’s more resolution and more bits that they have to deal with. With 4K, we all have to be a little bit more careful with what we do. We have to be careful in how we approach our crafts. But the color is still the color. The color doesn’t necessarily change just because you’re getting more resolution.

Part of my job is to take whatever color has been signed off on in the HDR theatrical and create the color correction that looks correct in any form, whether that is a Pulsar, or a Samsung, or whatever, so the feel of the movie is correct in every exhibition format. We create all of the deliverables for all of the formats, whether it’s for theatrical projection, 3-D theatrical, 2-D theatrical, 3-D video, 2-D video, pan and scan — all the versions. We don’t start from scratch — it’s not a regrade from the RAW files. It takes so much work to master something. The filmmakers really massage everything. In an intricate film, one shot can potentially carry 10, 15 or 20 Power Windows, and you’ll have 3,500 or 4,000 shots. It would take you months and months to start from scratch on all of these deliverables. I’ll make a look-up table, and then massage it from there to make sure the feel of the movie is correct on that device, whatever it is. I’ll make it work. It may look a little bit different, but it’s not out in left field. I’m not going to have to grade the whole movie again. You basically just have to fit your movie into each space. Certain scenes may not work, so you adjust from there. It’s more a tweak than a re-grade, and it’s a bit of an R&D thing that we do. At Company 3, we’re a color house. We know what we’re doing when it comes to color science.

The F65 is a really great high dynamic range camera. It captures lowlights very well. In general, the F65 is a very clean image. It has a lot of dynamic range. It certainly is a beautiful camera. The real difference, the key, is putting it in the hands of a good cinematographer, and Claudio Miranda did a great job. He makes sure that the image he’s putting on that camera is exposed correctly. In the old days, people thought that digital cameras were going to take creativity away from cinematographers. With film, you had to be so good, and you were doing things blind. But that’s not necessarily true. Now they have a monitor in front of them, but that doesn’t mean they can light properly. A cinematographer who was good with film is going to be good with video, and a person who’s no good shooting film is going to be no good lighting video.
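The deliverables workflow Nakamura describes (build a conversion LUT, then trim per device) can be pictured as a lookup curve plus adjustments. A minimal, purely illustrative sketch of a piecewise-linear 1D LUT; real grading LUTs are 3D and far denser:

import bisect

def apply_1d_lut(x, knots, values):
    """Piecewise-linear 1D LUT lookup, with x and knots in 0-1."""
    i = min(max(bisect.bisect_left(knots, x), 1), len(knots) - 1)
    x0, x1 = knots[i - 1], knots[i]
    t = (x - x0) / (x1 - x0)
    return values[i - 1] * (1 - t) + values[i] * t

# A hypothetical device-fit curve that lifts shadows slightly:
knots  = [0.0, 0.25, 0.5, 0.75, 1.0]
values = [0.0, 0.28, 0.52, 0.76, 1.0]
print(apply_1d_lut(0.1, knots, values))  # shadow value nudged up

The point of the structure is exactly what he says: the creative grade is fixed once, and each deliverable only needs the curve swapped and a handful of scene-level trims, not a re-grade of 3,500 shots.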

“The F65 is a really great high dynamic range camera. It captures lowlights very well.”



HDR could potentially change the way cinematographers work. They can take creative advantage of the brightness that the TV sets and the projection can have — especially the sets. The TV sets are pretty spectacular. I think HDR affects visual effects a little bit more. Visual effects will now have to do everything in 4K, and they’re dealing with high dynamic range. When they’re creating a sky or a wave or an ocean from scratch, they’re going to have to put more detail in, because they’re dealing with more resolution. The amount of time it takes to do a shot, and the level of expertise required, will be greater. Once everyone learns the nuances of what the format allows, I think we’ll see a lot more creativity with HDR. With HDR projection, you certainly have more to work with, but the colors triangulate

differently. There’s a bit of complication that is really interesting. All the digital cinema projectors out there, the Christies and the Barcos, typically work with a xenon bulb. A xenon bulb basically allows you to have colors in a spectrum that everyone sees the same, unless they’re colorblind. Laser projection has peaks and valleys in the wavelengths where people actually start seeing colors differently, because it falls out of the spectrum that the typical human eye sees. So you get a situation where someone may see a certain white as a little bit more magenta, or a little bit bluer, a little bit pinker — something they typically wouldn’t see in regular DCI P3. I think there’s still a lot of work and experimentation to be done to develop a straight-ahead workflow that the whole industry can adopt.

“HDR could potentially change the way cinematographers work. They can take creative advantage of the brightness that the TV sets and the projection can have — especially the sets.”


Looking back on Tomorrowland, I’d say that it was exciting to work on a project with Brad, who is a very forward-thinking and progressive director. With Claudio, Tom Peitzman, Rick Sayer, Craig Hammack, and all of the ILM guys — it was really cool to be on the cutting edge creatively, working with a great team to create the visuals for the movie. We all learn things together, and we all find mistakes, and we all try to correct them. I like that. I like being on the cutting edge, seeing issues and problems and trying to figure things out. That’s also true with my team here at Company 3. Everyone’s so nice and cooperative, so it was a real pleasure working on this movie.



Craig Hammack

Visual Effects Supervisor, Industrial Light & Magic

Craig Hammack is a visual effects pro who has worked with Industrial Light & Magic in a number of roles over the past two decades. He contributed in various capacities to Titanic, Pearl Harbor, Star Wars: Episode II – Attack of the Clones, and Star Trek, among others, and his credits as a visual effects supervisor include Red Tails, Battleship, and Now You See Me.

At Industrial Light & Magic, our role is to derive all the 2D and 3D elements that need to be added in post. I’m a visual effects supervisor, so on Tomorrowland, I was working on set with cinematographer Claudio Miranda, and throughout post. Brad Bird and Damon Lindelof wrote the screenplay, from a story by Lindelof and Jeff Jensen. Brad had worked with John Knoll and ILM on Mission: Impossible – Ghost Protocol. Brad is a northern


California resident, and that proximity makes the process a little more convenient. The Tomorrowland project required a pretty broad range of effects. The first part of the movie is a fun chase sequence where we go from situation to situation with the characters. It culminates in the world of Tomorrowland, which is a mostly synthetic environment that we generated. We were fortunate to find a location in Spain that had enough of the characteristics that Brad wanted in the city, and we used that for some foreground pieces and to anchor our imagery. Of course, to get the vastness and the scale of the futuristic city, we took over pretty quickly from the foreground. The spaces where you can fly jetpacks, or make camera moves that aren’t physically possible, required us to build full digital city environments, in daylight, and in 4K extended dynamic range. It became a real feat to maintain the detail level and the clarity. The city needed to be something far beyond any of the cities and buildings in our world, and it’s also supposed to look like the clearest, most perfect day in our world. If you’re in the Swiss Alps where there’s absolutely no contamination in the air, and it’s a crystal blue sky day, and you can see for miles and miles — it’s that kind of environment. The beautiful sunlight, the clear day — all those things hurt you when you’re trying to generate a realistic image of architecture, because to convey scale, you need atmosphere, and to convey detail levels, you want a certain amount of dirt and grime. So generating this ideal city to a realistic visual level was the main challenge.







“The spaces where you can fly jetpacks, or make camera moves that aren’t physically possible, required us to build full digital city environments, in daylight, and in 4K extended dynamic range. It became a real feat to maintain the detail level and the clarity.”



One example of what we did on Tomorrowland is a scene we call “The Pin Experience.” Casey, our heroine, is given a pin, and when she touches the pin, it transports her visually to Tomorrowland. It’s like a virtual reality kind of dream. At the beginning of the movie, she touches the pin, and she’s miles away from Tomorrowland. Eventually, she gets to where she touches the pin and she’s in the heart of the city. When we get to that scene, it plays as one continuous shot, so it’s like five minutes of following her through an amazing and awe-inspiring cityscape. For that shot, we filmed in three different countries at different times of the year, with all of that having to be stitched together in one seamless shot, with seamless motion and camera moves that are all handheld or Steadicam. Probably 90% of the frame at any one time is digital. In the end, 1,037 visual effects shots made it into the movie. I think 150 or so were completed but then omitted. To generate the visual effects, we use a combination of tools. We have proprietary tools for layout and some of the effects work, and we mostly work in 3D Studio Max for the environment work, and render that in V-Ray 3.0. The creature work is rendered in Pixar’s RenderMan. Lighting is done in KATANA.


In order to bring the RAW data from the F65 cameras to 16-bit OpenEXR, Joust, the data management company that helped with the workflow, used Colorfront software. They held the RAW data and rendered it out to anyone who needed it. So the F65 cameras were capturing in 8K, and that came to us rendered out to 4K. I’ve got to say I was very impressed. We were selective about which shots we maintained and worked on at 4K. We tried to find the sweet spots where it was most noticeable — no real camera movement and sharp contrast, vistas with very slow movement, or establishing shots where you really get the resolution. The rest we would work at 2K and then up-rez at the end and drop into the 4K plates. The difference in rendering load between 2K and 4K is massive. It’s literally four times the data, and there are so many calculations that get done per pixel that it’s actually more than four times the rendering time. The image we would get from the F65 was very sharp — there was really no loss of visual information at all. There’s a process to finding the right filters when you’re doing the resolution transformations, but the clarity of the image was amazing, and incredibly clean, especially when you would get to the low end. The kind of characteristic noise that you start seeing in night shots wasn’t there to the level that we had seen before in other cameras.

“The image we would get from the F65 was very sharp — there was really no loss of visual information at all.”

“One example of what we did on Tomorrowland is a scene we call ‘The Pin Experience.’ Casey, our heroine, is given a pin, and when she touches the pin, it transports her visually to Tomorrowland.”
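Hammack’s “four times the data” point is easy to verify with container math; the rasters below are standard DCI sizes, assumed here for illustration:

w2k, h2k = 2048, 1080
w4k, h4k = 4096, 2160

pixels_2k = w2k * h2k          # 2,211,840 pixels per frame
pixels_4k = w4k * h4k          # 8,847,360 pixels per frame
print(pixels_4k / pixels_2k)   # 4.0, four times the pixels per frame

# At 16-bit half-float RGB OpenEXR (uncompressed), per-frame payload:
bytes_per_px = 3 * 2           # 3 channels x 2 bytes
print(pixels_4k * bytes_per_px / 1e6)  # ~53 MB per frame before compression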



In terms of color, the differences are not so noticeable to me in the range of the gamut, but more in the trueness of the color. It’s an odd thing, but the subtleties of color roll-off and fall-off seem to stay truer than I’d seen before. In the past with other cameras, it seemed like there were always odd, very subtle levels in the mid-tones where either color cross-talk happened or there was some kind of skewing in the color capture — an inconsistent falloff. And I didn’t notice any of that on Tomorrowland.

On this project, we went out to 16-bit OpenEXR, and there were a couple of reasons for that. We work natively in 16-bit, and we were also doing a release in Dolby Vision, the extended color gamut delivery and projection system that will provide an extended dynamic range viewing environment in a few theaters. Rather than going to 10-bit or 12-bit DPX files to grade on, we worked with 16-bit OpenEXR to keep a full-range pipeline and retain the EXR latitude. I’m incredibly pleased with the final results. We’ve screened it several times in extended dynamic range 4K. It’s a pretty stunning set of work. We started down this road with Dolby, which was great to work with and very responsive to the requests and comments we had about the technology. I’ve recently read and heard that IMAX’s latest projector is purported to have similar capabilities. It’s always been a little frustrating to lose range at the end of the day. When we started, the concept was that it would bring people to theaters. Now, the reality is that the professional monitors and TVs that support high dynamic range are going to provide an equally impressive bump in viewing experience, and that is something we are very excited about.

The beautiful thing for us at ILM is that it’s a range we work in anyway. In our work, we are basically protecting the grade many stops above reference and down to absolute black, just in case somebody goes crazy with it in the grade. All our final checks are done at extreme ranges. It’s staggering the stuff that you start seeing when you start working in high dynamic range. The image becomes so sharp that flaws you would normally get away with are readily apparent. It’s a combination of holding up to 4K viewing and being conscious of the high dynamic range viewing. From 2K to 4K, you get clarity and sharpness. But HDR is a bigger jump in apparent resolution than the jump from 2K to 4K.

The project required the city to be done at a scale and a resolution that we don’t typically use for environments, but once we actually got to that level, the extended dynamic range showed even more than we expected, and so we had to go back in and up-rez even that. For example, take a façade on a building. In many cases, you can do that with 2D textures as opposed to 3D and actual geometric detail, because in the distant imagery the pixels get filtered. Your eye basically fills in the gaps and gives you the perception of depth. But in this case, the sharpness and contrast would betray that. So you have to actually go in and model the detail and do the actual 2D paint on top of that. Anywhere you would have a flat surface or a flat color, you had to go in and put in texture so that it had the right kind of light breakup and visual textural characteristics. It’s quite a bit more visual resolution than is normally needed. I think it’ll be a little eye-opening to people who follow this methodology in the future, but I’ve got to be honest, it’s great to be able to see that detail. It really helps the viewing impression and the realism of it. It’s something to be excited about, but it’s definitely something to be aware of, too, when you start planning a project.

“...but the subtleties of color roll-off and color falloff seem to stay truer than I’d seen before. In the past with other cameras, it seemed like there were always odd, very subtle levels in the mid-tones... “



F65 Anamorphic

A.I.

Award-winning cinematographer Rob Hardy, BSC discusses his latest feature film work, Alex Garland’s EX MACHINA, produced by Scott Rudin & DNA Films and distributed by A24.

Interview by Jon Fauer, ASC Story produced by Peter Crithary

Ex Machina is a 2015 British science fiction thriller written and directed by author and screenwriter Alex Garland, making his directorial debut. The film stars Domhnall Gleeson, Alicia Vikander and Oscar Isaac. A young coder at the world’s largest internet company wins a competition to spend a week at a private retreat belonging to the company’s reclusive CEO. On arrival, he learns that he must participate in a bizarre experiment that involves interacting with the world’s first true artificial intelligence, which takes the form of a beautiful female robot named Ava.

59



JON FAUER: Let’s talk about your wonderful work on Ex Machina. What cameras did you choose and why?

ROB HARDY: The key thing with every project is to find the right ingredients to implement the impressions you get when you first read the script. Every film has its own distinct personality. There are many different ways of looking at something, and it is through testing that you get the answer to what a film’s visual personality might be. Whether it’s a combination of cameras and glass or something else, I generally start with an idea in my head about what I think it will be, but sometimes new concepts crop up. Something may present itself as a surprise, and suddenly you’re looking through the camera and you can sense it. You think, “This is the right combination. This is the way the film should look.” The key to that process is choice, which is why I feel it’s a shame that people are so quick to write off film, compared to digital, these days. I’m not one to say one is better than the other. I just think it’s a shame that we’re losing our sense of choice. It is incredibly important that we still maintain choices. On a recent HBO project, we tested four different formats and three types of digital cameras. We tested 16mm, 35mm, anamorphic, Red, Alexa and F65. It’s wonderful to sit in the screening room and see the difference between these cameras and the different personalities they have. I think it would be a shame if it really just boiled down to two choices or even one. That’s just rubbish. It’s homogenization. It goes against every creative bone in my body.

The F65 ... performed in a way that I thought illustrated it to be a great companion to film. It just read the faces in a way that I felt was spot on and rendered the landscapes in the style that we wanted.


Testing is important. It determines which direction you want to go. With Ex Machina, we already knew it was going to be a digital show, and strangely we did feel the format lent itself to the subject matter in some ways. I tested other cameras. We just tested them in as fair a way as possible, with consistent lighting conditions, exteriors, interiors. We played with reflections; there are a lot of reflections in this film, as you know. And also faces. The face in itself can be like a landscape, but how can you read a face? How can you read emotion on a face? I actually think there are subtle differences between the particular cameras. The landscape of the face and, more importantly, the emotion behind the eyes are shown by micro expressions, as the character Ava would say. I actually think that the choice of camera, lens and format has a very subtle influence on those aspects. We found that some cameras seemed to negate my lens choices. In other words, they didn’t show the subtleties that I was looking for. The F65 was the opposite. It performed in a way that I thought illustrated it to be a great companion to film. It just read the faces in a way that I felt was spot on and rendered the landscapes in the style that we wanted. Everybody was very surprised when they saw the quality of the F65. But it’s all subjective. It is absolutely subjective for me. So we chose to shoot Ex Machina with the F65 camera combined with Xtal Express lenses.


JON FAUER: Are Xtal Express lenses the Joe Dunton anamorphics?

ROB HARDY: Correct. They are re-housed Cooke S2 and S3 spherical lenses — old lenses that were rehoused and anamorphized by Joe in the ’80s. I’m not entirely sure how many sets were built, but obviously each one was handmade, and within those sets, each lens is completely different. You have a very unique feel when you’re looking through these lenses. It sounds really pretentious, but I always look for a personality in a lens. So a 32 mm has a specific personality to me, whereas a 40 or a 50 would both be very different. There are reasons why I would choose a particular focal length — not simply because I want to be a bit wider or a bit closer, but because I want the feel of the 32 mm, or the feel of the 40 mm.


And the Xtal Express is slightly more at the extreme end of that concept for me, because they really do have distinct personalities. I had Jenny Patton, my first AC, essentially bring together as many sets of the Xtal Express as she could at Panavision in London. She thoroughly tested them to select the best lenses from each set, the ones she felt were technically right. So effectively, we made a custom set, if you like. It is like that scene in The Good, the Bad and the Ugly where Tuco goes into the gun shop and selects a series of guns. He custom builds his own gun by ripping apart other guns and bringing together this specific one for him. And so Jenny became known as Jenny “Tuco” Patton after that. JON FAUER: The Good, the Bad, and the not Ugly Xtal Express anamorphics.


It sounds really pretentious, but I always look for a personality in a lens.

ROB HARDY: Yes, fantastic, really amazing glass. I mean, they have their restrictions, and they’re not necessarily easy to use, but I enjoy them.

JON FAUER: Why are they not easy to use?

ROB HARDY: They vary in size. They’re not like the Panavision G-series, for example. If you use that set, they’re all the same size, all uniform, in a way that lets you shoot quickly with them. But then I learned to shoot quickly with the Xtals in the end, unhindered, as I would if I were shooting film. The close focus isn’t fantastic on some of them, but that can be a blessing sometimes. I believe there is what I would call a ‘cinema’ close-up and a ‘non-cinema’ close-up. When you’re framing someone up in that way, I think the lenses apply these slightly filmic restrictions on you, which I enjoy.


JON FAUER: That’s the nature of anamorphics, isn’t it?

ROB HARDY: It is. But I think one should celebrate the difference. What people look for now isn’t uniformity. I think what they’re looking for is a distinct uniqueness, especially with certain digital cameras that have inundated the market. They begin to look similar to each other. I think that homogenization is damaging in many ways to the creative process, and I’m not just talking about filmmaking. It’s like certain high-standing cameramen a few years ago were saying they were only going to shoot with one particular

camera. To me that was like saying, “I’m not going to do anything other than shop in Walmart from now on.” It seems absurd to me to deny yourself choice. And I think that comes right back down to lens sets. Of course, people want to know that the lens is technically reliable, but from a creative point of view, lenses are now like the new emulsion, in a way. Everybody feels that’s where their choices are.

JON FAUER: Absolutely. Because there are only a few sensors. There used to be many film stocks, and what is left is the lens. What are the focal lengths of the Xtal Express?

I think there’s something about framing with a 2.35:1 aspect ratio that gives us two things to look at: a 2.35 frame with a spherical lens feels very different to me than a 2.35 frame with an anamorphic lens.



ROB HARDY: There is an 18, 25, 32, 40, 50, 75 and 100 mm. The 18 is pretty extreme, but we used it once or twice on Ex Machina and it gave us a very specific look. The 25 mm is a great lens. I think it has a very bizarre feel to it. On several landscapes in Ex Machina this lens created a surreal look. I always thought that the exteriors on Ex Machina subconsciously could be seen as Ava’s point of view, even though they’re not. It feels like Ava’s emotional point of view, or possibly even her own imagination. We were able to distill this idea of strangeness in the landscape.

JON FAUER: Why anamorphic on Ex Machina?

ROB HARDY: I think there’s something about framing with a 2.35:1 aspect ratio that gives us two things to look at: a 2.35 frame with a spherical lens feels very different to me than a 2.35 frame with an anamorphic lens. They’re so completely different in terms of what they’re saying — emotionally and otherwise. For me, anamorphic has a great presence. It has an epic, cinematic feel to it. 1.78 feels more painterly to me, and I would frame scenes in a completely different way in that format.


JON FAUER: You had a zoom on Ex Machina?

ROB HARDY: I carried a zoom specifically for three shots where we were using it to make a physical camera move. But other than that, I wouldn’t generally use a zoom. I use primes, because I feel that those focal lengths are very specific, whereas with a zoom lens you can land on different frames that I feel are not appropriate. The longest lens I used in the actual principal photography was the 100 mm. Every part of Ex Machina has a sense of movement to it, even if it’s almost imperceptible. And those three zooms helped us do that.

JON FAUER: What rental house did you use?

ROB HARDY: We took the Xtals from Panavision. They came with a PL mount. The cameras and everything else came from Movietech. Movietech carried the F65s whereas Panavision didn’t, or rather they couldn’t source them. And Movietech are great supporters of that camera system. Panavision were really great, and Andy Cooper at Movietech has a fantastic relationship with Charlie Topperman, and they all really went out of their way to make it work. We were shooting at Pinewood Studios, so Movietech is right there. Andy Cooper was our point man and he was very supportive. I could pop in there at lunchtime and talk over a few things. It was just great to have it all so close at hand.

Every part of Ex Machina has a sense of movement to it, even if it’s almost imperceptible.

JON FAUER: It’s interesting how many big science fiction films are using the F65: Luc Besson, Claudio Miranda, you...

ROB HARDY: I think more and more people are starting to use them.


JON FAUER: What is it that you guys know that others do not?

ROB HARDY: It’s not really a secret any more, but it did feel like that at the time. I was surprised more than anything. When we screened the tests at Theatre 7 at Pinewood, even to a layman’s eye, it was a no-brainer: why isn’t everybody using this camera? It just didn’t seem to make sense to me. But the system that ARRI has is such an easy working system, so user-friendly both for camera crews and production. It’s crazy the number of times I’ve gone into productions — more in commercials than anything else — where the assumption is we’re just shooting with the Arri Alexa. It’s funny. The producers are saying, “Well, you’re going to use the Alexa, aren’t you?” I might reply, “Well, hang on. There are many other choices out there — let’s think about this for a minute.” And it’s a remarkable thing that they’ve done. I don’t want to be negative about it, because I think Arri makes great cameras. But with the testing that I did for this particular production, for Ex Machina, we chose the F65. But like I say, it’s all subjective, and I think subjectively some people may prefer the look of another camera.

JON FAUER: You may have done one of the few jobs that I know of with the F65 in anamorphic. It’s a 16x9 sensor and you’re pillaring a 1.2:1 squeezed format in there.

ROB HARDY: You have this incredible image recorded. It’s almost twice as wide when unsqueezed, but you frame accordingly and crop in post. (1.78 x 2 = 3.56:1 vs normal anamorphic 2.39:1) One of these days, I keep saying to people, it would be amazing just to do a film in that format, 3.56:1. It would have a remarkable look to it.

JON FAUER: Were you able to de-squeeze in the eyepiece?

ROB HARDY: Yes. You see the image de-squeezed in the eyepiece and on everybody’s monitors on set as well.
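The arithmetic in Hardy’s parenthetical checks out; a quick sketch, assuming a full 16:9 capture area and 2x anamorphic glass (the F65’s exact rasters may differ):

sensor_ar = 16 / 9            # 1.78:1 capture area
squeeze = 2.0                 # 2x anamorphic lenses
desqueezed = sensor_ar * squeeze
print(round(desqueezed, 2))   # 3.56:1 when fully unsqueezed

# Cropping the sides to a 2.39:1 delivery keeps this fraction of the width:
print(round(2.39 / desqueezed, 2))  # ~0.67, roughly a third is framed away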



You have this incredible image recorded. It’s almost twice as wide when unsqueezed, but you frame accordingly and crop in post. (1.78 x 2 = 3.56:1 vs normal anamorphic 2.39:1)



JON FAUER: Did you have a DIT?

ROB HARDY: I had a DIT named Jay Patel, who works at 4K London, and Pinewood Digital took care of the workflow. They’d had a show prior to that with the F65, so they had a bit of knowledge of it, and they were the only really positive people coming forward saying, “We can make this work.” There were a number of other houses who were not supporting the system because they were concerned about the workflow. Production was concerned about the workflow because it was something different. Basically, if it wasn’t the Alexa, suddenly everybody’s going crazy. They’re saying, “Wait, we don’t have a way to do it — there’s no standardization here.” And of course there is no standardization — that’s the slight issue with digital formats. But Pinewood Digital took it and ran with it and did a fantastic job.

JON FAUER: And you shot RAW and they handled the rest?

When we went to Norway ... we were literally out in the middle of nowhere with backpacks, climbing hills, waterfalls, and glaciers and things.

Rob Hardy in Norway

ROB HARDY: Correct. I had Asa Shoul, the colorist at Molinare, involved fairly early on in the pre-production stage, when we were deciding how we wanted the look of the film to be. We created a few very subtle grades with Asa, and Jay was present at those sessions. Then we implemented those whilst shooting so that I could effectively shoot untethered. There wasn’t really any grading on set. I don’t generally go for that, except maybe one or two minor adjustments at the end of the day. When we went to Norway, essentially we didn’t have any chance of that, because we were literally out in the middle of nowhere with backpacks, climbing hills, waterfalls and glaciers. Essentially Pinewood Digital was taking care of the workflow. They had a guy come out and handle it overnight whilst we all slept. But by that point, the look of the film had been established. I would ask for stills every day rather than viewing rushes compressed several times on


some Dropbox somewhere. I always go for stills directly from the camera. That way, when anybody comes to you and says, “I’ve just seen the rushes and they look like X,” you say, “Here, have a look, this is how they should be.” You’ve always got a good reference.

JON FAUER: How do you do those frame grabs? ROB HARDY: I had Jay do them. He would do screengrabs for me every day. He’d pick five to ten images that he felt were key images from what we’d shot that day and send high resolution copies to me.


JON FAUER: And you operate yourself?

ROB HARDY: I do, yes. I just couldn’t imagine not doing that. The composition is so important, and I just like the interaction it gives you with the actors. I like the fact that you can react to any situation, to whatever the actors are doing and whatever emotion they’re conveying. With Alicia Vikander, we developed a really good working relationship on Ex Machina, and then we worked together on Testament of Youth directly after that. Testament of Youth had a slightly more visceral, hand-held feel to it. Alicia had a great deal of trust in me; it helped that I’m a director of photography who also operates.

JON FAUER: Tell me a little about the lighting on Ex Machina. How did you establish it? What did you use for common reference? Did you look at pictures or stills?

ROB HARDY: Our references were few and far between. It sounds odd, but it kind of created itself. There were one or two references that we used. Saul Leiter, the New York photographer, deals with reflections a lot. His name came up, and the initial images that I sent to Alex Garland, the director, were Saul Leiter images. One of the key images in the film, which came out of the script, was the sense of looking at Ava through reflective glass. We have a mid-shot of her carved up by these strange reflections. Just seeing her eyes and sensing moments of connection between her artificial intelligence and actually feeling something — Saul Leiter was a good example of that. It was a good way of demonstrating to Alex what I felt when I read his script. Kazimir Malevich is an abstract Russian painter whose later work had a lot to do with composition — frames within frames, composition broken up by other composition. He works with lines and geometric patterns. Because Ex Machina was so much about those things, I was really into exploring this idea of frames within frames in order to convey a particular emotion at a particular moment. Because you never know who’s watching whom in this scenario. What we were really after, by subtle framing, was the suggestion of certain emotions that one character may have for another. I really enjoyed playing with the geometric compositions in order to move through those Ava sessions, because each one had to feel different. Each one had to have its own unique inner story.

73


What we were really after, by subtle framing, was the suggestion of certain emotions...

As references for the look or style of the film, we also talked about Tarkovsky’s Stalker, and John Carpenter’s The Thing. I just love that movie because you don’t know who’s who. There’s always the sense of never knowing where you are, and I think Ex Machina has a great deal of that. It has a sense of creeping dread: who’s actually watching whom, who’s playing at what. Those allegiances constantly shift and change.


JON FAUER: Working with the director, Alex Garland — how do you communicate with him?

ROB HARDY: He’s primarily a writer. He was fantastic to work with. His great skill was creating a real sense of true collaboration with all the departments — bringing together all those autonomous departments and making it feel like he trusted them. He was willing to try many things. He had a very specific way that he felt about the story, a very specific angle on what he felt Ava was about. And he gave me complete freedom to visually facilitate that, as he did with Mark Digby and Michele Day, the designer and costume designer, and also everyone else involved. I can’t praise him enough, really; working with him was wonderful. There is another thing I want to say about Alex. You’ll see certain interviews that he’s done about this film and about being a filmmaker. He doesn’t believe in the auteur theory. He thinks it’s nonsense, which is interesting, because a lot of film media wants to say that a film is the work of one single person who is responsible for it. Sure, there is one single person responsible for guiding the film, but Alex doesn’t believe in the cult of the director. And I think it’s so true. Alex really understands the process. He understands that every element is important. I think that’s a real key issue.


JON FAUER: You had VFX, and most VFX guys hate anamorphic. How did that work with you? Did they resign themselves to the fact you were using Xtal anamorphic lenses?

ROB HARDY: Andrew Whitehurst is an exception to the rule in that case, because he’s not only a genius at what he does — and again, this is Alex’s genius for bringing together such a fantastic team — he’s also a great facilitator. He was like part of the camera crew, as far as I was concerned. I’ve never worked with a VFX guy so brilliantly involved and actually so clever. He effectively said to us, “Look, you guys, you shoot the way you want to shoot.” There’s no green screen, there’s no blue screen, there are no tracking marks — nothing. We shot it as if we were shooting a drama. Ava is created as an emotional being rather than a VFX creature. I think that’s a really key point, and it’s often overlooked, because it creates a human element in the VFX. Andrew embraced that and actually loved our lens choices. He kept saying, “Wow, this is amazing.”

There’s no green screen, there’s no blue screen, there are no tracking marks — nothing. We shot it as if we were shooting a drama.



JON FAUER: Did he make you shoot grid charts and all those charts and things?

ROB HARDY: Yes, he came in and shot the grid charts himself at Movietech. And then he was overjoyed with those lenses. He couldn’t get enough of them. He was saying, “These lenses are so crazy.” He was so happy I’d chosen them. Andrew Whitehurst is at a company called Double Negative (Dneg.com), based in London. Right now, I think he’s doing Bond.

JON FAUER: Did you use filters, or were those just natural halations from the Xtal Express?

ROB HARDY: A combination of both, actually. I used a lot of soft light, mostly tungsten, so it was a combination of the gauziness of that, the gauziness of the lenses, very subtle diffusion occasionally, and also how we graded it in the color timing with Asa.

JON FAUER: One last thing: your background. How did you get started in this business? Film school?

ROB HARDY: I was into film very early on. I watched all these old Hammer horror movies when I was a kid — and had an epiphany one day and realized that they weren’t real, that somebody had actually made them. And that was it. From then on I just wanted to do that, and so I went through school writing scripts — really bad scripts, essentially dodgy rip-offs of War of the Worlds and things like that — and trying to get my friends to reenact them. I was obsessed by it all. And eventually I went to film school, one in Wales, which was like an experimental film and art school, and then I went up to Sheffield to specialize. At that point I was specializing in cinematography, and then I broke into the music video world. I was involved in a lot of art installation work — video installation. I lived in Sheffield for five years, and

eventually I moved back to London. I was doing a lot of commercials and music videos at the time, but also low-budget short films. I absolutely loved — I still love — shooting commercials; they’re good fun and you get to try a lot of things out. Then I think it was a film called Boy A that started to get attention and helped me move forward.


I still love shooting commercials; they’re good fun and you get to try a lot of things out.

Left to right: Rob Hardy, Alex Garland, Lee Walters (Gaffer)


Dheepan Cannes Palme D’Or

By Jon Fauer, ASC
Interview by Brigitte Barbier, translated from French by Alex Raiffé for the AFC (afcinema.com). Courtesy of Film and Digital Times and the Association of French Cinematographers. Photos by Niki Kannan.

Éponine Momenceau was the cinematographer on Jacques Audiard’s Dheepan. We met after the CST reception for Aaton, Angenieux, K5600 and Transvideo in Cannes. I was invited to lunch by AFFECT, Aaton and Transvideo President Jacques Delacoux, CST President Pierre-William Glenn, AFC, and Angenieux President Pierre Andurand. Pierre-William led the way to his favorite restaurant, Astoux & Brun (depuis 1953). Apparently he goes there almost every day during the festival. It’s well deserved: Pierre-William supervises all the projections and technical details of the Cannes Festival. Pierre-William, Pierre Andurand and Jacques Delacoux all advised: “Interview Éponine.” They knew something the critics at Cannes didn’t know. Dheepan would win the coveted Palme D’Or two days later — first prize at Cannes. The story unfolds like an ethics class, or like a screening of The Battle of Algiers. Your sympathies sway. Dheepan was a Tamil Tiger in Sri Lanka. The battle lost, he arrives in France, pretending to be married to a woman with a little girl. He finds work in a housing project in a tough banlieue, and one cannot overlook the fact that director Audiard’s original intent was a remake of Sam Peckinpah’s Straw Dogs. It was a total surprise to all the critics that Dheepan won — but obviously not a surprise to Glenn, Andurand and Delacoux.


Éponine Momenceau. Photo by Niki Kannan.


Here are excerpts of an interview for the AFC (afcinema.com) by Brigitte Barbier, translated from French by Alex Raiffé. Courtesy of the Association of French Cinematographers.

Cinematographer Éponine Momenceau discusses her work on Jacques Audiard’s film Dheepan

Éponine Momenceau graduated from La fémis film school in Paris four years ago. That was followed by a rather eclectic series of jobs. She did two features as camera assistant and then directed a film that won prizes and was screened in museums. That opened doors to other directing jobs, and her work attracted Audiard’s attention; she was hired as DP on Dheepan. Dheepan was shot with a Sony F55 from Panavision Paris. Éponine said, “I like the fact that it is a lightweight, comfortable, shoulder-resting camera. We did tests in advance, and I thought it had the best colors for faces, with the best color gamut and color space. We shot in spherical widescreen with Cooke S3 lenses, the Angenieux Optimo 28-76 mm zoom and occasionally the 24-290. In India it was Cooke S4, but for the majority of the shooting it was S3. In the beginning I tested with the F65, but it was too big and heavy for all the handheld we wanted to do. So we went with the F55 and its RAW recorder.”


For his seventh feature film, Dheepan, in the Official Competition of the 68th Cannes Film Festival, Jacques Audiard chose an unknown aspiring actor to play the lead role. The story is about a former rebel soldier, a young woman, and a little girl who pretend to be a family in order to be evacuated from their war-torn country. Upon arrival in France, they have to adapt to a new life in a violent housing project. It is there that this imaginary family will, over the course of time and the events they face together, learn to form a “real” family. A graduate of La fémis’ Image Department in 2011, Éponine Momenceau directed an experimental film screened at the Palais de Tokyo in 2013 and a film for Arte, Jungle, as part of the Atelier Ludwigsburg-Paris — La fémis. She was the director of photography for several short films, and then one day Jacques Audiard called. He had seen her film Waves Become Wings at the Palais de Tokyo, and he wanted to offer her the job of director of photography on his next film, because he was interested in combining cinema and contemporary art. A few days later, she was asked to do tests for the film. Éponine Momenceau felt that over the course of their conversations, Jacques Audiard reconsidered his concept of the screenplay and its visual style. Then he called to confirm that he wanted to work with her: he wanted a new direction for his movies and didn’t like the feeling that he was just sailing forward.



During the search for the location of the home, on one of the sets of the film, she filmed images that could potentially be used in the film. These images did not end up in Dheepan, but Éponine said, “It allowed us to get to know one another better.” She thinks that Jacques appreciated how she filmed her short films. Their discussions about the film’s visuals were exciting, sometimes destabilizing, and a source of many questions, because “it was a bit like getting dizzy, faced with the number of possibilities. We had discussions regarding the shape of the film, but nothing concrete really came out of it before we really started shooting.”

The heavy lifting, from preparations to shooting. “We thought we’d shoot a lot of the scenes on a tripod and we ended up shooting a lot of them from the shoulder. As we weren’t dealing with professional actors, they first had to get comfortable with the scene. I watched how they would use the space, I had a viewfinder with a zoom and I would look for the right angle. “Sometimes, when we reread the notes regarding editing we had taken during the preparatory readings of the script, we would indicate a choice of scenes or camera movements. There were lots of scenes where we moved from one place inside to the outdoors and we used a Steadicam for those scenes.”

“In the beginning I tested with the F65 but it was too big and heavy for all the handheld we wanted to do. So we went with the F55 and its RAW recorder.”



Lighting a Jacques Audiard film.

Lights.

“He often wanted to be able to use all 360° of a set with the lighting set up before shooting with as few adjustments as possible. But of course adjustments were needed between shots.”

In Sri Lanka, Éponine used the most powerful lights available. She used Molebeams, placed far in the distance, to light a jetty that was completely in the dark. She also used 18 kW lights on a crane as the only light coming in through a window, in order to obtain high contrast.

About the style. “Jacques wanted us to feel the exotic gaze of the Tamil newcomers to Paris. For example, he didn’t want the housing project to be too grey. It’s hard to make a grey housing project not look grey on the screen… How do you do it without making it look artificial? All of the indoor locations where the Indians lived were decorated in Indian style, with neons, cyans, and colors.” Éponine said she had great support from Michel Barthélémy, the production designer, who was always thinking of the lighting for each set; he would choose very appropriate wallpapers and colours for the walls. She said she also got a lot of help from her crew: Marianne Lamour, the gaffer, and Edwin Broyer, the key grip.

83

She also used a ring light, especially at night, for frontal lighting, though she never mounted it on the lens. She also used boxes fitted with Kino Flo bulbs, created by Pierre Michaud, the gaffer.

Camera. She tested the Sony F55 and found that it did a very good job rendering skin tones and other colors. It was fortunate that she chose this compact and lightweight camera, because a lot of the scenes ended up being filmed from the shoulder. Éponine opted for Cooke S3 series lenses, which she likes because of their vintage “flaws”: they have a tendency to vignette or to flare, which “breaks up the digital effect.” She often used them without matte boxes and without a filter, in order to be able to handle the camera however she pleased. She also used the Angénieux Optimo 28-76 zoom, and a Mitchell filter to reduce the sharpness.

“Sometimes, when we reread the notes regarding editing we had taken during the preparatory readings of the script, we would indicate a choice of scenes or camera movements. There were lots of scenes where we moved from one place inside to the outdoors and we used a Steadicam for those scenes.”

Regrets? “Not being able to watch the dailies together because the shooting took up all of our time. Suddenly, we discovered the film during color timing on the big screen. Then we had to

readdress the subject of the visuals, make adjustments and rebalance the colors, all while moving as fast as possible because of Cannes.” Stressful…

On shooting her first feature film with Jacques Audiard. “That’s the big first question: how do you propose another way of filming when the one he has used in his past films is extremely interesting? It means being reactive. Not setting up too much in advance (generally speaking), having good instincts and making lots of suggestions. Being receptive to unforeseen circumstances and being flexible, because Jacques Audiard doesn’t like feeling that things have been predetermined, and that the actors can’t move because there are marks all over the place. Putting as much lighting as possible off set. And learning about his method of directing the actors: he lets them make suggestions and adjusts their acting and their movements according to what they say. The set and the acting get constructed simultaneously, little by little. The actors guide the form, the sets, and the lighting, and not the other way round.”

Crew
Focus Puller: Nicolas Eveilleau
Gaffer: Marianne Lamour
Key Grip: Edwin Broyer
Postproduction: Digimage
Color Timer: Charles Fréville

Equipment
Camera, grip, lighting equipment: Panavision (Sony F55, Cooke S3 Series, Angénieux Optimo zooms) and Transpalux




Colour, Softness and

Freedom!

The hallmarks of shooting on F55

Cinematographer Claire Mathon, AFC
Interview by Pierre Souchar

Following her triumph in 2011 with Polisse, Maïwenn returns to Cannes in the Sélection Officielle category with her fourth feature film Mon Roi, a drama depicting a couple’s passionate and tumultuous relationship. Produced by Alain Attal (of production company Les Productions du Trésor), the film was shot with the F55 by Claire Mathon, a young cinematographer and long-time collaborator with Maïwenn.




Pierre Souchar: You have just finished the grading process for the film. Could you give us your first impressions of the pictures from the F55, at such a crucial stage in your work as a cinematographer?

Claire Mathon: I must say that I am rather enthusiastic about the images and pleased with the work that we have done. I have been able to revisit the photographic choices that I made, and we have created great chemistry by combining the images from the F55 with the laboratory treatment provided by Technicolor. The colour rendition of the camera and the way it works in strong light is particularly appealing. I found it pretty reassuring that I could use natural light in all its brilliance, be it in a snowy scene, on a beach or at a sunlit wedding.

PS: How did you plan the photographic work for this film?

CM: Maïwenn wanted to make a film where colour would be of great importance. She gave a great deal of attention to choosing the sets and the costumes. The strength of the F55 really lies in the shades and the richness of the colours.

“The colour rendition of the camera and the way it works in strong light is particularly appealing.”

Texture was also important. From my experience on her previous films, I knew that there would be a lot of close-ups. The film's subject matter is heavy, so I tried to keep a certain level of softness, especially for complexions. My work with the lighting also contributed to this.

PS: How did the F55 help you to achieve your vision for the film?

CM: We needed a camera that could combine rich colorimetry, softness and lightness. Mon Roi was filmed almost exclusively on two hand-held cameras, using long takes. Maïwenn likes shooting lengthy sequences without cuts, for 20 or 30 minutes at a time. We had a huge number of sets and moved around a lot. We needed a tool that we could hold by hand to allow us to move freely.

“The strength of the F55 really lies in the shades and the richness of the colours.”

PS: Which lenses did you use, and what justified your choice?

CM: Maïwenn would ideally shoot the entire film with a zoom lens, to keep the sense of freedom and speed. We shot with Angénieux Optimo zoom lenses, the 28–76 mm and the 45–120 mm. For scenes that were more critical in terms of lighting, we also used a set of Zeiss GO lenses. Creating softness and the need for compactness affected our choice of lenses. I also used filters a lot. The action in the film spans almost ten years. Rather than making the actors look younger, I worked to soften their features in line with the different periods of time. We conducted a lot of tests, because I was worried about filtering too much and losing acutance. Watching the film, I found the skin was rendered quite pleasantly.

PS: What was your workflow? Did you shoot in RAW?

CM: No, we didn't shoot in RAW, mainly because of the quantity of rushes (285 hours over 12 weeks of filming). I carried out tests before shooting so that I could clearly see the differences between the recording formats and understand what I would lose by not filming in RAW. We chose the SR 444 SQ codec. I knew I would have much less tonal range when grading: the exposure must be precise, and the colour temperature has to be as close as possible to the required result. Everything is recorded in the image.
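The numbers make that choice concrete. As a rough sketch, with data rates assumed from Sony's published specs for the SR codec family (SR-SQ at roughly 440 Mbit/s) and an order-of-magnitude figure of 1 Gbit/s for 4K RAW, neither taken from this interview, 285 hours of rushes works out as follows:

```python
# Back-of-the-envelope storage for 285 hours of rushes.
# Assumed sustained data rates (approximations, not from the interview):
#   SR-SQ 444 ~ 440 Mbit/s; 4K RAW on the order of 1000 Mbit/s.
HOURS = 285

def terabytes(mbit_per_s: float, hours: float) -> float:
    """Convert a sustained bit rate and duration into decimal terabytes."""
    return mbit_per_s * 1e6 / 8 * hours * 3600 / 1e12

for name, rate in [("SR-SQ 444 (~440 Mbit/s)", 440),
                   ("4K RAW (~1000 Mbit/s)", 1000)]:
    print(f"{name}: {terabytes(rate, HOURS):.0f} TB")
# SR-SQ 444: ~56 TB vs RAW: ~128 TB, more than double the data to move and back up.
```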

PS: Did you mix daylight with artificial light?

CM: Yes, we shot a lot of the interior daytime scenes with the lights switched on (or with candles) and used colour gels (1/4 CTO or 1/2 CTO) to reduce the contrast. I wanted to avoid ending up with complexions that were too warm and an image with uniform hues. I realised that we needed to manage the mix of daylight and artificial light on a significant number of sets. So for interiors, I often chose a colour temperature of 3200 K or 4300 K, although I would have preferred to be more precise and change the colour temperature in increments of 100 kelvin. Grading let me correct this without any regrets.

PS: Actually, do you have any regrets?

CM: No regrets, but there is a small fly in the ointment. I noticed that the camera added noise in low lighting, especially with bright colours, as if some of the colour had not been transposed.

PS: What has your career path entailed?

CM: I graduated from the École Nationale Supérieure Louis-Lumière (Louis-Lumière National College) and immediately started doing lighting for shorts. So I became a "young cinematographer" very quickly. My first film was Maïwenn's first film. And I have followed her for all the others.

So I became a “young cinematographer” very quickly. My first film was Maïwenn’s first film.


Mon Roi by Maïwenn (2015)
Sélection Officielle at the Cannes Film Festival. Release date: October 2015
Producer: Alain Attal
Actors: Vincent Cassel, Emmanuelle Bercot, Louis Garrel and Isild Le Besco
Camera assistants: Jimmy Bourcier and Lucile Colombier
Gaffer: Ernesto Giolitti
Grip: Roman de Kermadec
Laboratory: Technicolor




Earth, Wind, and Camera. By Bob Poole

Working with National Geographic and Discovery, I burned through about 1,000 miles of 16mm film with my Arri SR between 1991 and 2004. When a series I was working on in 2004 decided to go digital, I bought a Sony F900. My F900 was a beast. It took me a while to get used to it, but the camera was tough and weathered the beating it received as my career started to focus on Africa.

Photo Credit: National Geo screen grab


Water in West Africa makes life… and a hard life for cameras and people



In the Congo and Gabon, the humidity and constant rainfall were enough to drown any equipment, and the F900 did struggle with the humidity. Maintenance was constant: replacing all the moving parts required to thread the digital tape and run it over the heads. But I became my own field tech and found I could always get the camera running. The images that came out of the F900 were blowing my executive producers away, and the assignments kept rolling in.

Stuck in the mud and a camera that trained me to be a technician.


With my F900 I shot the Africa segments of National Geographic's Great Migrations, for which I won an Emmy for cinematography. These were some of the most difficult shoots of my career.


In Mali, the camera held up through the worst conditions I've ever experienced. I watched temperatures rise to 114° F daily in the peak of the dry season, and the camera kept going. I was filming desert elephants and following their migration across a 34,000 square km home range.



Photo Credit: National Geo screen grab. Desert elephants moving before the rising sun on their way to the last water point in the peak of the dry season


The heat was relentless, but when it comes to harsh conditions for the camera, nothing could compare to the sandstorm. A "tempet", as it is referred to in Mali, was the big event I had been waiting for: the change in wind direction that brings the rain and signals to the elephants that it's time to move on. But this was a "grand tempet", and nothing could have prepared me for the fury it would unleash on me and my equipment. Winds well over 100 mph blew so much sand that they blocked all light coming from the sun: total darkness, followed by an eerie red glow that lasted for hours. Equipment was lost in the darkness as I ran for cover inside my pickup. I'd covered my camera with a tailored space blanket I had designed to keep out the dust and reflect the sun, but it was completely hopeless. When the storm was over, the devastation was tremendous. Sand was in everything. I opened the side panels on the F900, blew out the sand, put it back together and kept shooting. I shot with my F900 for six years and really loved it. I knew the camera inside and out and had come to rely so heavily on it that I rarely traveled with any sort of backup camera. By that time Sony had made huge advances in solid state and digital disks that were far easier to work with than the old tape cassettes, and my producers wanted me to move on to a new camera. At that time the camera of choice was the XDCAM PDW-700.


Photo Credit: National Geo screen grab. Over 100 mph winds, and a red glow following total darkness


In a show of peace, an armed gang leaves their weapons against a tree while we gather for a meeting.



I took a brand new Sony XDCAM PDW-700 into Southern Sudan, where armed gangs and landmines were real and present dangers. After the 22-year civil war, we had film permit #1 issued by the South Sudanese government, and it took our vehicles 12 days traveling overland to reach the location I had spotted from the air and marked with a GPS. We had gone there to film the annual rut of the white-eared kob, an antelope that migrates in numbers greater than the wildebeest of the Serengeti. While the F900 was tough, the 700 took reliability to a new level. By recording digital info to RAM and then writing it to XDCAM disks, the camera also gave huge advantages over the F900. Instant-on and, even better, a pre-roll buffer are such advantages when shooting wildlife. The XDCAM disks themselves were bomb proof, and it was great peace of mind knowing the media was safely stored with no need for transfers and backups. And, for the first time, I could view my footage on my computer while on location via XDCAM Transfer, a great program that organized all my footage as thumbnails, so an entire shoot could be viewed and stored in a small file. The XDCAM 700 shot 720p at 60 fps, and at the time that was super cool. Wildlife always looks better in slow motion. My XDCAM PDW-700 worked flawlessly for years, but I retired it quite happily when I got my hands on the F5 for the first time last year. I was working on a 6-hour series for PBS and National Geographic International about the restoration of Gorongosa National Park in Mozambique. I wrote about the project and my work with the F5 in the last issue of CineAlta.


The beautiful thing about the F5 is that it can be both a cinema-style camera and an ENG camera, like my F900 and XDCAM PDW-700. The camera is so versatile. Its compact size and flat bottom make it easy to mount and balance on a Steadicam, but with a few accessories the camera can sit comfortably on the shoulder. The variety of lens choices, frame rates and formats is staggering, especially when used in conjunction with the R5 recorder. Most of the footage for our series was shot in 2K and HD. We did, however, want to shoot 4K on our aerials for archive, and also to be able to pan and scan within the frame. The F5 with the R5 recorder was the perfect solution for that. 4K RAW at 60p was impressive for sure, but it ate up our precious hard drive space quickly. Ours was a two-year shoot and we shot massive amounts of footage, so we shot mostly to SxS cards on the F5 in 2K. Sony continues to develop new accessories for the F5/55 CineAlta cameras. I was lucky to get ahold of their new FZ-PL adapter recently. It is designed for the new 2K Center Cut that came in the V3.0 firmware update. With only a 0.5 stop loss, the new adapter transformed the F5 into a fantastic wildlife camera, because I was able to use my Fuji 2/3" B4 25x super telephoto zoom, which has a range of 16.5 to 413mm in HD at f/4.2.
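That reach is easy to put in familiar terms. Using standard reference dimensions for full-frame and nominal 2/3-inch formats (our figures, not Poole's), the long end of the zoom behaves like a very long telephoto:

```python
import math

# Rough full-frame-equivalent reach of a 2/3-inch B4 zoom.
# Sensor dimensions are standard reference figures (mm), not from the article.
FULL_FRAME = (36.0, 24.0)
TWO_THIRDS = (9.6, 5.4)   # nominal 2/3-inch broadcast format

crop = math.hypot(*FULL_FRAME) / math.hypot(*TWO_THIRDS)  # ~3.9x
for f in (16.5, 413.0):
    print(f"{f:6.1f} mm -> ~{f * crop:5.0f} mm full-frame equivalent")
# The 413 mm long end works out to roughly 1600 mm of equivalent reach,
# which is why a B4 tele zoom on the 2K Center Cut suits wildlife work.
```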



Photo Credit: Off the Fence screen grab. Loving the new FZ-PL in Mozambique with the F5 and a 25x Fuji super tele lens.


Photo Credit: Gina Poole. Working in Botswana with an F5 body pieced together from bits and pieces to make a working camera.



I had to temporarily break away from the PBS/NGI series when war broke out and the fighting around me got too intense. I ran away with the most valuable items but could not take everything with me. I ended up in Botswana filming elephants for another film, but had only the F5 camera body with me. With lenses, batteries, a viewfinder and shoulder support coming from other cameras, I was able to piece together a full F5 camera package. It is a modular system, so adaptable that it can be set up in many different ways to suit the needs of the production or the resources at hand.

Photo Credit: Windfall Productions


My next production will be shot in 4K. Because the footage load from shooting RAW would be enormous and incredibly difficult to handle in the field, I am going to use an F55 to shoot 4K in the XAVC™ format directly to SxS cards. The Sony F55 is the perfect camera for the job. This film is about a family of wild elephants that I will follow for three months in Kenya. Elephants grow up in tightly knit family groups made up of babies, young males and females, mothers, grandmothers and great-grandmothers. They can live into their 70s, and the oldest female, or matriarch, leads the family, but together the elephants make decisions in order to keep the family safe. Elephants in Africa are being decimated. Hundreds of thousands of them have been poached for their ivory, and the killing continues. Films are being made about this crisis, and that is extremely important, but I want to show how wonderful wild elephants are while they are alive. It has been quite a journey following the evolution of Sony CineAlta cameras. I am excited about the new products in the lineup built around the F5/55. The first time I picked up a Sony F55, I knew right then that it was the camera for me.

Photo Credit: Sebastian Dunn. Next assignment: following a family of elephants for three months with an F55



GAME CHANGERS

A profile of two game production executives and their use of Sony's F55 cameras to augment the in-venue fan experience.



Michael Conner
Director of Videoboard & Event Production, Arizona Cardinals

Eric Long
Event Production Manager, Philadelphia Eagles
Website: www.philadelphiaeagles.com
Twitter: @EaglesShows

By Mike DesRoches
Sr. Sales Support Engineer, Sony Electronics Inc.
Twitter: @DesRochesSony


It's early Sunday afternoon… game day. As you look over the extended family enjoying the fruits of your labor, face paint drips onto the shoulder of your lucky jersey, a consequence of working over the BBQ's heat. The burn of smoke dissipates as you close your eyes to savor the last chew of juicy, cooked meat. Momentarily, time slows to a delicious crawl. Snapping back to reality, the urge for a quick phone check of your fantasy lineup for the hundredth time is interrupted by neighboring bass thumps and cackles of laughter, both partially drowning out the squawk of an AM radio touting today's matchup. A long, final pull of a cold one whisks away the salt from that last brat. It's nearly game time, so you grab a roadie, gather the family, and make your way over before kickoff… only to plop down on the couch and fire up the big game on that home theater system.

…This, my friends, is a major concern of all those involved in sporting venues, specifically event production managers. Make no mistake, there's a high-stakes battle going on at professional sports venues, and it's not just on the field between opposing teams. As Josh Beaudoin, associate principal at consulting firm WJHW, has put it, "It's become a nuclear arms race with getting the best equipment to provide that premium in-venue experience." In this context, that means these venues don't just want to house a winning team; they're very much in a fight to provide an experience that simply cannot be had at home in front of the television. For many teams, like the Arizona Cardinals and Philadelphia Eagles, that means sparing no expense, all in the name of providing unmatched entertainment value that not only keeps the fans coming back, but has them spreading the word both vocally and electronically. Simply put, the game has changed.

In the past few years, that has meant spending tens of millions super-sizing video boards, upgrading control rooms and the production equipment that feeds and supports them, increasing wireless and data connectivity, and making the facilities as glitzy as possible to win that much-sought-after fan interest and increased traffic through the turnstiles.



The initial measuring stick for this battle may have been the double-sided, 60-yard-wide mammoth video board unveiled at the opening of AT&T Stadium, home of the Dallas Cowboys, back in 2009. In the past year alone, all around the major sporting leagues, staggering venue upgrades have continued to leapfrog one another for "largest board" or "most total area of board space," but the common denominator is a fervent desire to provide unmatched entertainment value. Some may simply call it "keeping up with the Joneses," but it's much more than that. It's about investing in the future by having a strong team with a vision, committing to that vision, then doing your homework and making smart choices to realize that vision, in order to maximize that investment.

With 65" TVs now commonplace in the home, organizations have turned to these new, gigantic video boards to become the "home theater in the stands." But the reality is, as boards increase in size and resolution from SD to HD, 2K, 4K and now 8K (the Jacksonville Jaguars' board, installed prior to the 2014 season), these boards are only as good as the quality of the cameras, scaling tools and event production teams driving them.

This story profiles two top event production "game changers" responsible for making the in-venue fan experience a noteworthy one: how, during their respective organizations' impressive venue upgrades, they chose Sony F55 4K cameras, among others, to fill their gorgeous new video boards; how they leverage 4K technology to augment the fan experience; and how the gear provides enough flexibility to evolve and benefit other groups within their organizations. Oh, and perhaps a little on what keeps them up at night to ensure they keep those turnstiles turning.


MICHAEL CONNER
Arizona Cardinals

Currently employed by the Arizona Cardinals as the Director of Videoboard and Event Production, Michael C. Conner started working with the team as a freelance producer/director in 2000 and was hired to head the department in 2006, the inaugural season at University of Phoenix Stadium. Directing game presentation and production for Arizona Cardinals football games, and working on event presentation for many other events including Super Bowls XLII and XLIX, the Fiesta Bowl, BCS Championship games, NCAA basketball, the Cardinals NFL Draft Party and many concerts, he has enjoyed presenting some of the most exciting events in one of the nation's finest venues. Mike has been producing, directing, shooting and editing highly acclaimed programming for a national audience for more than 30 years. His credits range from entertainment shows for ESPN to acclaimed documentaries for the NBC family of networks. With experience ranging across the United States and as far away as Afghanistan, Iraq, Pakistan, Latin America and much of Europe, Conner knows how to get in and get the job done no matter how remote the location or difficult the challenge. In 2014 he traveled to Russia to direct the live venue production for the medals ceremonies at the Sochi Olympics.

Technical Highlights
Camera/Replay Implementation — Two PMW-F55 4K Live cameras, seven HDC-2570 cameras, one HDC-2400 and one BRC-H900 PTZ (two HDC-2400s rented for the Super Bowl). Five Evertz DreamCatcher replay systems: two for 4K zoom, two in a 6x2 configuration, one for social media.
New Video Board — New Daktronics HD13 on the south end zone, 164' wide by 54' tall (3836x1260 resolution), with 8 outputs off a Vista Systems SPYDER (four X20 frames).
Integration — In-house staff (Shane Gavin and Jamie Gillespie, event and systems engineers) and Pro Sound and Video, based in Orlando, FL
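For readers who like to check the math, the board's published size and resolution imply its pixel pitch. The short sketch below works it out; reading the Daktronics "HD13" name as a 13 mm pitch is our assumption, not something stated here.

```python
# Infer the LED board's pixel pitch from its physical size and resolution.
# Dimensions come from the sidebar above; reading "HD13" as a 13 mm pitch
# is our assumption, not a claim from the article.
MM_PER_FOOT = 304.8

def pitch_mm(size_ft: float, pixels: int) -> float:
    """Physical size divided by pixel count gives centre-to-centre LED pitch."""
    return size_ft * MM_PER_FOOT / pixels

print(f"horizontal: {pitch_mm(164, 3836):.1f} mm")  # ~13.0 mm
print(f"vertical:   {pitch_mm(54, 1260):.1f} mm")   # ~13.1 mm
```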



Mike DesRoches: Mike, you've had a pretty exceptional run last season, topped off by hosting the Fiesta Bowl, the Pro Bowl and a small event called the Super Bowl. It must have been pretty special to run the boards for that. We can get into that later, but specific to Cardinals games, what would you say is your functional "mission statement" on game day?

Michael Conner: In a nutshell, we're looking to create the most entertaining, passionate and engaging fan experience possible. And to reward the emotional — and financial — investment fans have made in choosing to spend the day with us means that what we provide has to be of the absolute highest quality. Consumers have so many great entertainment options, especially here in Arizona, and we're competing with all of them. Our job is to make sure that at the end of the day Cardinals fans have the best possible in-stadium experience and are looking forward to the next time they can come back and do it again.

“...we’re looking to create the most entertaining, passionate and engaging fan experience possible.”


MD: Heading in that direction, prior to the 2014 season there were numerous significant upgrades made to University of Phoenix Stadium, including the new Daktronics 164' wide x 54' tall board in the south end zone. But those new high-resolution boards need high quality imagery to feed them. How did the purchase decision come about, and how did you decide on the F55 and HDC cameras?

MC: Like most teams, the Cardinals are constantly evaluating the fan experience and exploring areas in which it can potentially be improved. In assessing priorities, enhancing the video board operation to take advantage of the latest advancements and technology ranked high on the list. Because the stadium is owned by the Arizona Sports & Tourism Authority, the video board renovations were a collaborative effort between the Cardinals and AZSTA. Everyone understood the benefits, not only for Cardinals fans but also in the ability to attract future events to the stadium. We worked with Pro Sound out of Orlando for some of the integration, but we have a lot of in-house technical expertise on staff, led by Gavin on the video and IT systems side and Jamie on the audio systems side. We also have a great relationship with Evertz and leaned on them heavily. When it came down to cameras, our team tested out a few manufacturers' 4K, as well as some non-4K, solutions and chose the F55 for 4K and HDC-2570 cameras for our HD hard camera needs. They seemed to be the best fit with our Evertz Dreamcatchers, and provided the same chains for camera control between the HDCs and the F55s.


“When it came down to cameras, our team tested out a few manufacturers’ 4K, as well as some non-4K, solutions and chose the F55 for 4K and HDC-2570 cameras for our HD hard camera needs.”

Arizona Cardinals Scoreboard Production team (left to right): Shane Gavin, Jamie Gillespie, Mike Conner, Amanda Flanagan, Brian Myers.



MD: What would you say is a unique ability that you're getting out of the 4K cameras in order to augment the video board experience, and how would you say it benefits fans?

MC: It was very important for us to advance into the 4K technology to enhance the fan experience. For us, the optimum use was the 4K zoom-in, or cutout. Fans are used to seeing 4K zoomed in for replays on some of the networks. You're zooming in to see if the foot is inbounds or on the line or not. Then you tilt the shot up within the zoom capability to see if he has control of the ball, and pull out to see if he's been touched. The fan is looking for that high resolution zoom to confirm the call is good. Now, with the technology in-house, when we do that, we get a big fan reaction. When you're at home, it's exciting to see that it's a catch and a touchdown for your team. But when you're at the stadium and we do that zoom, you not only see but feel 64,000 people all screaming at the top of their lungs in excitement, because they can see the play is going to be upheld or overturned in favor of our team, and that's not something you can get anywhere else. To experience that eruption live is visceral and like no other.

Another great experience was the first time the Network took our feed, using our 4K zoom. Once the challenge flag is thrown, we can't show any more of our in-house looks. All of a sudden, one of our angles came back to us through the Network feed during a review. There was a bit of a panic for a moment, because one of our staff yelled out "That's our shot!" and we can't show our shot. But because the Network took it, they showed it, and in that case the fans CAN see our camera, because the Network is also showing it to the referee as part of the decision-making process to confirm or overturn the play. Since the Network doesn't typically take our camera angles, it was great we had the expertise, from a staffing and equipment standpoint, to deliver something of such high quality that the Network would actually take it and put it out to their audience. That confirms that we're giving the patrons in the venue the very best of what's possible, and 4K with the F55 allows us to do that.
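The geometry behind that zoom is simple: a 4K frame carries roughly four times the pixels of HD, so an operator can cut a full-resolution HD window out of the frame and move it around after the fact. The sketch below is purely illustrative; the function and its parameters are ours and do not represent the DreamCatcher's actual controls.

```python
# Illustrative pan & scan: cut an HD window out of a 4K frame.
# This models the geometry only; it is not the DreamCatcher's real API.
SRC_W, SRC_H = 4096, 2160   # 4K source frame
OUT_W, OUT_H = 1920, 1080   # HD output raster

def crop_window(cx: float, cy: float, zoom: float):
    """Return (x, y, w, h) of the source crop centred on (cx, cy).

    zoom = 1.0 extracts OUT_W x OUT_H source pixels one-for-one (the
    tightest loss-free zoom); larger values crop wider and scale down.
    """
    w = min(OUT_W * zoom, SRC_W)
    h = min(OUT_H * zoom, SRC_H)
    x = min(max(cx - w / 2, 0), SRC_W - w)   # clamp inside the frame
    y = min(max(cy - h / 2, 0), SRC_H - h)
    return int(x), int(y), int(w), int(h)

# Tight on a sideline catch near the corner of the frame; the window
# clamps to the frame edge: (2176, 1080, 1920, 1080).
print(crop_window(3900, 2000, 1.0))
```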

"It was very important for us to advance into the 4K technology to enhance the fan experience."

MD: One of the common concerns of operators shooting large format cameras for sports is pulling focus. You're using 85-300mm Fujinon Cabrios in front of your F55s. Were there any growing pains for the operators using the F55?

MC: Anytime you introduce new equipment, there's going to be a learning curve. With the F55s and large sensor format cameras in particular, there's definitely a different way of approaching live action sports, especially in the NFL. The game is played on such a large field, and the players are moving very quickly. You only have so much light and aperture to work with, so there's definitely a learning curve to operating an F55 versus one of our 2/3" 2570s. As an aside, we were the first pro football venue to put in LED sports lighting, and that helped bring the overall lighting levels up to give us a bit more depth of field. Another factor is that we're using the camera for the zoom capability and the cutouts. You shoot that differently, as you have to keyframe the motion, so we limit the amount of movement that the operator makes. We can zoom electronically, so we don't want to zoom as much optically. More rapid movement by the camera operator means a LOT more keyframing, which delays the time until that clip is available to deliver to the boards. Finally, we're dealing with odd aspect ratios. We're often taking only a portion of the full raster to put up on the video boards. Previously, we were taping down our markers on the viewfinder displays, but now we're using the electronic markers in the Sony cameras. These features have been great, but have all taken some getting used to. The cameras ultimately provide numerous ways to get there; it's just a different way to go about telling the story. In conjunction with the Fuji Cabrios, which are great, we've been entirely pleased with all of the cameras.



MD: As the director of a staff that's responsible for making these advancements come to life at the stadium, and for leading the productions of not just the football games but every event hosted at the facility, I would refer to you as a "game changer" for your ability to impact the audience's overall experience. Aside from what you already said about the network using your feeds on occasion, how would you say the F55 has played the role of "game changer" in enhancing your event production abilities?

MC: One of the important aspects of the design, and of the decision to make these upgrades, was to attract events beyond football. Events that are looking for a home often come through and tour our facility. It's exciting when a major promoter comes through and sees the quality of our equipment, and it affects their decision on which venue to choose for their event. When the folks at the Super Bowl and the NCAA Championship come through and see our commitment to excellence of equipment, it goes a long way in them choosing University of Phoenix Stadium for their event. Sometimes the event promoters want to use our control room for their broadcast properties as well. Having 4K capability like the F55 for both in-stadium and broadcast properties is no longer a luxury; in many instances it is now a necessity in hosting these types of events. When we're competing against other venues and we have these capabilities, it certainly makes us a front runner.

"It's exciting when a major promoter comes through and sees the quality of our equipment and it affects their decision on what venue they choose to hold their event."

MD: Has your usage of the F55's 4K ability, in conjunction with the Dreamcatchers, evolved over the past full season? Did you utilize the toolset differently during the Super Bowl than at the beginning of the season?

MC: It was certainly a learning curve at the beginning of the season. It takes a bit of work to get the dance down between the camera and replay operators. By the time we reached the end of the season and went into the Pro Bowl and Super Bowl, they had the process down to where it became second nature. Basically, it was a lot of fine tuning and polishing that had hit a fine point by the time we got to the big game. I wouldn't say we used the tool differently; it's just that 4K is a different tool. For instance, we're ISO recording that footage, not just for the in-game coverage but also for the broadcast properties later that week. It's nice when you have a play and maybe you didn't realize there was a significant hold that was going to negate it. When you're working in HD, you tend to follow with a tighter composition and can miss that game-changing hold. Because we have the play in a 4K view, even during the game, we can go back and zoom into the hold rather than into the running back. In the end, that play was negated, and the hold was the game changer, not the long run. I think that's pretty significant!

MD: Were there any noteworthy plays over the course of the season where the new technology made an impact you wouldn't have had the previous year?

MC: Yes: the experience of showing a replay while the coach and his staff are looking at the board, when we can see in the control room that he should throw the challenge flag. With that 4K capability, we can demonstrate what the facts of the play are; we have that power. The clarity of those cameras really makes that possible, and it can affect the team in a positive way.


MD: You're one of a handful of event producers who direct games for the Olympics. How do you find that differs from working in your home base, and did you get a chance to work with 4K there?

MC: I had a great opportunity to go to Sochi and direct the medals plaza for Big Screen Network during the Olympics. It was a tremendous experience. Similar to the Super Bowl, the Olympics utilize the very best production techniques for these incredible international events. Working with Olympic Broadcasting Services (OBS), which used 4K cameras, high speed HD cameras and numerous innovative tools, gave me experience I could bring back to, in my case, the Arizona Cardinals, as opportunities for growth in game presentation and fan engagement. While the broadcasters were using 4K, I didn't have 4K for the in-venue production. It would have been a terrific asset to have, and I imagine in future Olympics 4K will become the standard. While we weren't taking the 4K feed, we were taking a feed downconverted from 4K, and the quality of the images we were able to see was impressive.

MD: Do you think having the Olympics under your belt, in a way, prepared you for delivering a better Super Bowl experience?

MC: No question! Being part of a production team for something on the scale of the Olympics helps prepare you and gives you more tools in your toolbox to draw from. It certainly hones your skills and helps you raise your game. It made the work required for the Super Bowl that much easier and more enjoyable, quite honestly. It's not the first time you're on a big stage, so you're much more prepared to bring new innovation and the highest level of production. Now we also have the comfort of knowing we have such innovative tools to draw on at University of Phoenix Stadium. The pageantry of both the Olympics and the Super Bowl is incredible and inspiring, and my experience with them will echo in our game presentations for years to come.

MD: What's your biggest fear on gameday, and has the new technology added solace or more anxiety?

MC: I think the biggest vulnerability we have on gameday is equipment failure. We have more solace now because we have multiple ways of delivering the services required. We have more redundancy, and in turn more security, than we've ever had before. We're also asking more of the system than ever before, and with those increased demands there is more that can go wrong. But if something does go wrong, we have options to recover. It's a lot of excitement, and an exciting challenge. We put on a show. We want to have the most engaged fans of any sporting venue in the world. To have these capabilities, with our imagination being the only limit to what we can do for our fans, is a great situation to be in. With that comes added responsibility. It also requires a system that's as flexible, robust and durable as the one we have, and we rely on our manufacturers to deliver mission critical equipment. Not just Sony, but our other major partners like Ross, Evertz and Daktronics as well. All of our major partners were selected because they provide equipment that is used at the highest level of sports production, and that's an absolute requirement for the "failure is not an option" philosophy we have with the Arizona Cardinals and University of Phoenix Stadium. Truly, they've all come through for us.

MD: Well, it's a pleasure to be part of that partnership. Last question… During the recent SVG Venue Tech event you hosted in late March, you mentioned exchanging some ideas with other game production directors, like Mike Bonner of the Denver Broncos. Have you been trying out new ideas for next year's season? Do you have new tricks up your sleeve you'd like to share?

MC: We have a number of initiatives in development which will be a great surprise to our fan base, including working with our technology partners to further enhance fan engagement. We're not going to tip our hand at this time, but we're putting a lot of energy into next season's gameday experience. Stay tuned, it's going to be a lot of fun.

MD: Thanks Mike! We appreciate your support. Good luck in the upcoming season.



ERIC LONG
Philadelphia Eagles

Eric Long enters his seventh season with the Philadelphia Eagles this fall. A veteran of live sports broadcasting, Eric has experience spanning a diverse range of events and control room positions. His responsibilities with the Eagles include producing the in-venue live broadcast of all NFL games, and managing production logistics and staffing for all events at Lincoln Financial Field, including Eagles football, Eagles training camp, Temple University football, the Army-Navy football game, NCAA Lacrosse Championships, international soccer, concerts, and other special events. In addition to his employment with the Eagles, Eric has held positions in the production of Major and Minor League Baseball, ECHL and NHL hockey, the NBA, MLS, and college athletics, as well as consulting for a number of organizations. Eric recently oversaw the multi-million dollar renovation of Lincoln Financial Field's control room and LED surfaces, which included the installation of over 20,000 square feet of LED surface in the building. The stadium's two main video boards are constructed of 10mm surface-mount LED, currently the highest resolution in the NFL.

Technical Highlights
Camera/Replay Implementation — 10 total PMW-F55 cameras (eight F55 4K Live systems, two F55 handheld RF). Four 8-channel Evertz DreamCatcher systems (three configured for HD 1080i I/O and one for 4K UHD I/O).
New Video Boards — Two main Panasonic 10mm SMD LED boards manufactured by Lighthouse: north end zone 192' wide by 27' tall (5856x816 resolution); south end zone 160' wide by 27' tall (4886x816 resolution). In conjunction with other in-venue ribbon boards and auxiliary displays, nearly 22,000 square feet of new display panels.
Consultant — WJHW
Integrators — LED: Panasonic; Control Room: Sony/Diversified Systems



"We were wowed by the features of the camera — the size of the sensor, the codec options, and the multiple frame rate and format possibilities that it offered."



Mike DesRoches: Your fan base is well known for being demanding about what's on the field. As the game production manager for the Philadelphia Eagles, what would you say your fans demand most from an in-venue experience?

Eric Long: We have such a diverse fan base that if you asked five different fans the question, you'd probably get five different answers. Some people might tell you that free wi-fi or fantasy stats are incredibly important, and others might tell you they only care about what's happening on the field. But I think every fan would tell you that the reason they love coming to our stadium is the overall atmosphere; they want to influence the game with their energy and ultimately help the team win. Our production is heavily focused on telling the story of the game with timely and effective replays, stats, and information, all of which underscore the storylines of the game. By providing these features and capitalizing on big moments like touchdowns or turnovers, our goal is to complement the natural fan energy that exists and amplify the drama of the game, hopefully resulting in a competitive advantage. For our production staff, we're motivated by the pursuit of creating a fan experience that is worthy of such intensely passionate fans in a great football city.

MD: We've been on a Sports Video Group Venue Tech panel together discussing 4K and its uses, but your team's story is a unique one that I think many would like to hear. So, how did the Eagles get to the point of pulling the trigger on 4K and going with these cameras?

EL: Although it ultimately rolled in with the LED and control room upgrade, the 4K journey began for us in late 2012. We were exploring options for ENG cameras when we came upon the newly released F55. We were wowed by the features of the camera: the size of the sensor, the codec options, and the multiple frame rate and format possibilities it offered. We had gotten into the DSLR revolution several years earlier, and cinema cameras were the next natural step in that evolution for us. Around that same time we were separately evaluating build-up cameras for our live in-venue broadcast, leaning towards standards like the HDC-2500. Our internal discussions progressed to where we found ourselves asking, "Why can't we use the F55s in the broadcast environment, too?" Early on, the appeal was being able to use a cinema camera with a 2/3" lens for live production on Sundays and then having the bodies available to use with cinema lenses the other six days of the week for television and web production. We felt the F55's versatility made it the best investment for something we could truly use year-round, and after seeing it in full build-up form at NAB in 2013, we knew it was a viable option. As we followed the development of the camera throughout that year, we also talked to teams like the 49ers, who had begun the process of using a few F55s for production (Click here to see why the 49ers chose the F55), which was very helpful. The last domino to fall was running SMPTE fiber to our camera positions, which had previously been triax. Once we made that decision, it sealed the deal on going with the F55. By the time we were going through our renovation in the spring of 2014, there was much more information and support available for 4K use of the F55 in a broadcast environment, so we latched onto that and planned for all of our hard-wired F55s to hit our router in both 1080i and 4K. Coupled with what Evertz was doing with their DreamCatcher replay system, we were able to hit the ground running with full 4K pan & scan replay capabilities when our control room was commissioned. Technology changes so quickly that by the time you commission a new control room some aspect is already out of date, but we wanted our build to give us a shot at growing with the changes over the coming years. The F55 turned out to be a huge part of that for us.


Game Changers

MD: You mention that it was rolled into another purchase. Was anyone else involved in funding this effort, or was it all internally funded?

EL: The Eagles organization decided to make the investment as part of a larger $125 million stadium revitalization that took place over two years and included the addition of seats, two bridges to improve accessibility within the stadium, and redesigned club spaces. The whole project was based on improvements that would enable us to better deliver a great fan experience at Lincoln Financial Field. For our production team it was a great process, because we were able to determine which equipment we wanted to use and how we wanted to use it. In conjunction with our new LED boards and our control room, we added so much new technology that it's made a huge difference in what we can offer to fans on gameday.



MD: What integrator was used for the control room, and what was the full package of cameras purchased?

EL: We were very fortunate to have a great team of minds on our project who worked together to think through the best way to accomplish what we were looking to do. Our consultant Chris Williams from WJHW and our integrators Tom Sullivan and TJ Beardsmore from Diversified brought a lot of knowledge and experience to the project and were great to work with. Everyone put a lot of effort into answering the questions of how to most effectively take advantage of the 4K functionality of the F55s, and how best to integrate it with our DreamCatchers. Evertz and Sony were great partners in giving us the information we needed, and went above and beyond to make sure the integration was seamless, which was important because this type of installation hadn't been executed in any other stadium at that point. Our package is made up of eight F55 CA4000 4K Live cameras performing build-up kit roles, and two F55s as RF handhelds. Everything hits our Evertz EQX router, and we can simultaneously route the 4K and 1080i feeds of any of our eight hard-wired F55 cameras into our four 8-channel DreamCatchers. We chose Canon for all of our lensing, and purchased one 100x, four 80x, two 40x and one 22x for our build-ups. Our handhelds are both outfitted with 22x lenses. We also supplemented our manned cameras with seven Sony BRC series PTZ cameras and two Canon PTZs.

"...we can simultaneously route the 4K and 1080i feeds of any of our eight F55 cameras into our four 8-channel DreamCatchers"


MD: Certainly no small undertaking, as that's a ton of new equipment. Once you had it all in place, how did the usage take with the operators, and how did the ability to "tell the story" evolve?

EL: The first couple of events we produced were definitely baby steps. By the time we got into regular season Eagles games, our DreamCatcher operators were getting comfortable with zooming, panning, and scanning, and we were able to really take advantage of that feature. Consequently, it really changed the mentality of how we covered the game with our cameras. We ended up with a lot of situational-based assignments, which isn't all that radical in itself, but the difference was we were telling our photographers to shoot wider for ISOs so there was more of the frame to use, which was backwards from what they were used to. It was a whole process where we were forcing ourselves to shift away from trying to do all of the work within the lens, and instead asking our cameras to stay wider and our replay ops to zoom in the DreamCatcher world. It has evolved into a powerful tool for us, as the replay looks we're showing in-venue focus on the most important information in the frame, which is a tremendous asset in telling a clear story.

MD: I imagine that took some practice, but it's interesting how the tools change the approach. Continuing on that theme, how else has the 4K capability added creativity to your role, or caused you to do things differently in a productive way?

EL: It's difficult to give a complete answer to this question, as we're still in the beginning stages of harnessing what our equipment can do. It's certainly opened so many doors for us and allowed us not to be governed by the limitations of our tools, which is where we were before this upgrade. My hope is that we're able to display pre-produced content in 4K that capitalizes on the resolution of our video boards, but we're still waiting for some other equipment in our control room to catch up. The awesome part is that we're able to acquire 4K footage of game action in the meantime, so we have a nice library of assets ready to go when everything catches up.
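A quick bit of arithmetic shows why 4K acquisition matters for a board this size. Assuming the north board's published 5856-pixel width, the upscale factor needed to fill it drops sharply as the source format grows; the figures below are our arithmetic, not the Eagles' workflow documentation.

```python
# Horizontal upscale factor needed to fill the 5856-pixel-wide north board
# from common acquisition formats. Our arithmetic, not the team's workflow.
BOARD_W = 5856

for name, src_w in [("HD 1080p", 1920), ("UHD", 3840), ("4K DCI", 4096)]:
    print(f"{name:8s}: {BOARD_W / src_w:.2f}x upscale needed")
# HD 1080p: 3.05x   UHD: 1.52x   4K DCI: 1.43x
```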



"Looking at the bigger picture I think the F55s have opened our eyes to things we can implement in a venue that maybe are a little more challenging in the traditional broadcast world."


MD: After the first full season, what would you say has become the biggest impact of your F55s?

EL: To date, the biggest impact in our situation has been in the live environment and what the cameras bring to the replay world. Having the ability to take advantage of the full frame is not something we take for granted. Looking at the bigger picture, I think the F55s have opened our eyes to things we can implement in a venue that maybe are a little more challenging in the traditional broadcast world. We're producing a show that is displayed on an ultra high-resolution canvas, and we control every step between acquisition and transmission. In the broadcast world there are so many factors out of your control between creating the content and how it looks by the time the viewer sees it, which is not something that we have to worry about. It's exciting, because we can act fairly quickly as the technology grows and improves, and be at the forefront of using these tools effectively.

MD: Have there been any times when the technology, and the way you employ it, affected the course of a game? Anything you'd like to share in that area?

EL: Our in-house camera feeds aren't used for Official Replay Review, so we can't take credit for truly influencing a decision, but we are able to present some pretty decisive angles on close plays very quickly on our video boards, which has absolutely influenced our crowd's reaction. Our fans are the target audience, but if in the process we can give our coaching staff on the sideline a chance to see something that may assist them in their decision to challenge (or not challenge) a referee's call, that's obviously a benefit of being the home team.

MD: You quickly touched on actual use for production. You also mentioned that ENG work was how you were originally looking at the F55s; has it turned out that way?

EL: Yes, we've used the F55s for a number of ENG applications, especially for things that are shot specifically for our stadium video boards. Having the tools in the F55 to shoot that content in 4K and then work with it for the finished piece is a huge advantage for us. Rather than scaling up, we're starting with something that's closer to our boards' native resolution. The source is purpose-built. Last year we switched our post-production workflow to the Adobe Creative Cloud platform on Macs, and in Adobe Premiere it's very simple and easy to work with the content.

MD: I know it's only a full season removed from all these high-end upgrades, but with the 'arms race' of venue improvements going on, do you see more upgrades coming, or perhaps utilizing your equipment in a different way to further take advantage of what it can do?

EL: We try not to get caught up in what everyone else is doing from a competitive standpoint, but it is helpful to learn from the experiences of our colleagues in other venues. We'll definitely be looking at cinema lensing in the near future to take full advantage of the F55's sensor. We've been closely following Evertz's continued development of the DreamCatcher platform and the expanded ability to record multiple 4K sources on a single box. As 4K production switching becomes more available, we'll be doing our homework, since we have the infrastructure in place to capitalize. Of course, all of these things are purely possibilities at this point, but it's very exciting to see more and more ways to fully use our F55s, and I feel like we're in a really good spot.

MD: Thanks for your time and support, Eric!

EL: It's been a pleasure. We appreciate Sony's efforts in continuing to develop tools and technology that enable us to focus on telling great stories without concerns about our equipment holding us back.



How the Anaheim Ducks went with 55, 7 and 700 to Raise 8

By Peter Uvalle
Producer, Anaheim Ducks

Gabe Suarez Associate Producer, Anaheim Ducks

http://ducks.nhl.com/ @anaheimducks

Article Produced by Mike DesRoches Sr. Sales Support Engineer, Sony Electronics Inc @DesRochesSony


One more year. This was the annual heartfelt ask from Anaheim Ducks fans everywhere to their beloved number 8, and it went on for nearly a decade. The Finnish Flash, Teemu Selanne, had flirted with retirement ever since winning the Stanley Cup in 2007. In the last few years of his career, his decision became more difficult and even more entertaining (Click here for the story). 2014, however, would mark the end of a magical career that spanned more than 20 years and over 1,450 regular season games, and ended in a Ducks sweater. As the horn sounded on his final game, everyone on and off the ice at the Honda Center paid homage to Anaheim's favorite son with a standing ovation. He was, and is, a true international ambassador for the game of hockey. We were honored to have worked with him for so many years, and took our role in retiring his number 8 to the rafters very seriously.


The Anaheim Ducks decided in the summer of 2014 to raise Teemu's number to the rafters, making Selanne's jersey the first in franchise history to be retired. In fitting fashion, number 8 was set to be retired on January 11, 2015, when Anaheim would host the Winnipeg Jets, Teemu's first NHL team. The Finnish Flash burst onto the hockey scene with 76 goals in his rookie season and was named Rookie of the Year in 1993 as a member of the Jets. The retirement of Teemu's number was the single largest event the Anaheim organization had ever produced. The pressure was on. We knew this was coming, but were we ready? Telling the story of Teemu's career and his dedication to the game of hockey and to the fans had to be done right. After all, this was going to be an international event.



"Shooting in its XAVC codec gave us the ability to do 4K recording, and to over-crank up to 60p was a huge positive."



Protecting the Legacy — Choosing the Right Tool to Tell the Story

Storytelling is our forte, and this is a story we were eager to tell. We had previously used the FS700 quite a bit during post-season runs, and added the A7s for its low-light performance. Since using the FS700 and A7s in unison, the overall look of our video features had improved significantly. This led us to look for a 4K solution for Selanne's tribute, and the Sony F55 was the next product we were eager to try. Sony put us in touch with a local F55 owner/operator, Neal Williams of Par Fore Productions. Although we were convinced the F55 would give us the impact we wanted, initially we were very concerned about working in 4K. Shooting in its XAVC™ codec gave us the ability to record 4K, and being able to over-crank up to 60p was a huge positive. But the daunting task of mixing 4K with video formats that spanned Selanne's 20-plus-year career made flexibility and format support a real concern. Adobe Premiere afforded us this flexibility, and finding that it could handle native footage from the F55 without transcoding the 4K XAVC proved encouraging. Even the brand new XAVC-S format of the A7s was supported. This allowed us to focus on telling an engaging story, while not cutting corners.


Shooting Day(s)

When the day of the shoot arrived, we had everything planned out — or so we thought. It was a learning experience for all. We'd planned to shoot over a couple of days in multiple locations, but wound up having Teemu for only one day and (due to weather and time constraints) were forced to shoot everything indoors at Honda Center. The high sensitivity of the F55 really helped out in this regard. From a file efficiency standpoint, the concentration of shooting could have been disastrous had we been working with other formats, but the XAVC codec was very efficient. And Neal from Par Fore was a true professional, bringing in a second F55 to maximize coverage. At the end of the day, even with the condensed shoot, we were extremely happy with the quality of footage we walked away with. When we first set out, the expectations were incredibly high. We knew we had to raise our level of production and provide a candid, organic feel. The F55 delivered.

Aside from the Selanne F55 shoot, we had a number of side productions that were needed to support this piece. We wanted to show the impact that Selanne had on the community, and how the younger SoCal generation is now choosing hockey as a sport to pursue in the sun belt. The Sony A7s wound up being the perfect companion camera to capture this portion of number 8's story. These productions were much smaller and more "run and gun". The A7s was easily rigged on rails, sticks and mini-jibs. We were able to get some fantastic complementary footage in a short amount of time that augmented the overall tribute nicely. We found the A7s so efficient that we wound up shooting all our "making of" elements with this versatile camera.



We wanted to show the impact that Selanne had on the community, and how the younger So-Cal generation was now choosing hockey as a sport to pursue in the sun belt. The Sony A7s wound up being the perfect companion camera to capture this portion of number 8’s story.


As we had hoped, the entire retirement ceremony was a success and the video tributes turned out exactly as we had planned. Anaheim fans and fans of Teemu worldwide experienced a retirement ceremony worthy of a player of Selanne’s caliber. Our production team was pleased, management was pleased, and so were the fans. But more importantly, Teemu was impressed after seeing what we produced, and that is the one fan we wanted to please most.

Click here for the Teemu Tribute



Anaheim Ducks Entertainment department staff: Gabe Suarez, Paul Janicki, Peter Uvalle, Sarah Moews, Davin Maske, Doug Keiser.



Saluting the military

PSAs from LINK Technologies offer "things you don't know, but should."

By Glenn Estersohn
Behind the scenes photography by Allyna Roman

Community Matters is an award-winning series of public service announcements created by LINK Technologies. The 45-second spots appear on public television and feature out-of-the-way community affairs stories, often with a connection to the client. LINK Technologies has just completed the fourth season of Community Matters announcements, shot in RAW Lite on the Sony F65. We spoke with the cinematographer, DIT, editor and colorist — after checking in with the director, Brandon Jameson.



“Tomb Guard” takes us to the Tomb of the Unknowns in Arlington National Cemetery.

Director Brandon Jameson tells the story

LINK Technologies (linktechllc.com) director Brandon Jameson’s recent projects have received accolades and best-in-show nods from the governing bodies of CINE, Clio, Telly, Communitas and the AVA Awards.

Q: So how did Community Matters come about?

Jameson: The client saw some work we had done for another client and contacted us directly. They pride themselves on who they are for the communities that they serve. It was just a perfect match. Their message was a clear one, and from that we developed the Community Matters concept, which has won just about every award you can win for public service spots. In previous seasons, we were dealing with different forms of diversity in the community. We were asked to research stories that would make you say, “I didn’t know that, but I should.” Diversity in the community can be LGBT, women, people of color. We created stories that really had integrity and reflected sources of pride. That was a mandate from our clients.




Q: How did you approach the new season?

Jameson: The client said that this year we want to honor the military, portraying unknown facets. The award-winning screenwriter for the series, Shell Danielson, helped us come up with three very solid stories. “Panama Canal” describes how Army Major Walter Reed demonstrated that yellow fever was transmitted by mosquitos, helping to save lives as the United States completed the Canal. “Tomb Guard” tells the story of the elite detail from the Army’s 3rd U.S. Infantry Regiment that guards the Tomb of the Unknowns in Arlington National Cemetery. And “Operation Christmas Drop” describes an annual Air Force humanitarian mission in the western Pacific. I’m delighted that the client will be contributing to Operation Christmas Drop.

According to Brandon Jameson, “These stories have a great deal to do with diversity, and therefore a great deal to do with different skin tones.”



Q: What else is new this year?

Jameson: We made a decision early on to always keep the camera moving. They were one‑day shoots each. Although it would have been nice to go from Steadicam to Movi to dolly to sticks, our great cinematographer, Flor Collins, and I decided very early on that it would probably be in our best interest to just keep it on a dolly and keep it moving. Between that and the extraordinary work of Flor’s gaffer, we were able to pre-light and just roll into different locations with absolutely no wait time. It was amazing. We just went from scene to scene to scene.

“When we drop the S-Log3 onto the RAW footage now, it needs little to no color correction. This is a huge advance.”

— Brandon Jameson



Q: Where did you shoot?

Jameson: Tucked in an industrial park in Anaheim is an amazing place, Silver Dream Factory. They provided us with this little backlot and also the hospital set and the flower shop. So we were able to move directly from the spokesperson’s walk & talk, which opens every spot, right into the hospital and the flower shop!

Q: “Operation Christmas Drop” includes some compelling shots of Air Force cargo planes. Did you shoot that?

Jameson: No! In order to stop the constant requests for footage, each branch of the armed services has its own producers and its own cinematographers. And that footage is available free of charge. You say, “We need this.” And they say, “Well, you can’t come here, but we can show you on the web what you can get free.” It’s unbelievably cool. It works. It’s amazing.

Q: I heard that you shot this with your own Sony F65, nicknamed “Birtha.” Why would an agency own a high-end camera?

Jameson: LINK Technologies is different from 99.9% of classic agencies or production companies in that we’re both. Because of that, we have a modicum of pride and integrity in knowing that what we own is the best we can do for our clients. We differ from a classic creative agency that always hires a production company. We keep it all in house, which makes it a lot easier. And while we have sets of lenses that we like, we do not impose our camera or lenses on the directors of photography we hire. It’s something we can offer them that works very cost-effectively in the way that we budget a full job.

The Production Crew on set.

Q: How did you choose the F65?

Jameson: We have a history of adopting technology fast. We tested the DALSA camera. We had two Vipers before anyone else did and were fortunate enough to lend them out on Benjamin Button. It behooved us to determine the absolute best camera for us, and we jumped on the F65. We were lucky enough to be working with some really stellar support people at both AbelCine in New York and Band Pro in LA. At Sony, we had experts on both coasts as well. In New York we worked with Ian Cook, and in LA, Simon Marsh.

“The rendition of skin of all different shades and textures, all different ages of actors is really quite beautiful and complimentary.” — Brandon Jameson


Q: You must have seen the F65 evolve over time. What were the biggest changes?

Jameson: S-Log3 is one breakthrough. If I were Sony, I’d be shouting from the rooftops about this. When we take the RAW footage in, colorists will commonly drop a LUT on. And when we drop the S-Log3 onto the RAW footage now, it needs little to no color correction. This is a huge advance, especially for the Community Matters campaign. These stories have a great deal to do with diversity, and therefore a great deal to do with different skin tones. We love the F65 because you can see that the rendition of skin of all different shades and textures, all different ages of actors is really quite beautiful and complimentary.

Q: What else?

Jameson: RAW Lite. When we needed something to go to broadcast very quickly, we used to shoot 1080 HD. Now we find we can use RAW Lite pretty much the same way but utilize the full resolution of F65 capture. That is lovely! And we can also offer our clients stills. It becomes not only a cinema camera but also a useful way to capture frames that can be blown up for print and even billboards. We’ve been really, really happy with that.
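For readers who want to know what “dropping the S-Log3 on” means numerically: Sony publishes the S-Log3 transfer function in its technical documentation. Here is a minimal Python sketch of that published curve (code values normalized to 0–1); a real grading pipeline applies it inside a LUT together with the S-Gamut3 color-space transform, not as code like this:

```python
# Sony's published S-Log3 curve and its inverse, values normalized to 0..1.
# For illustration only; real pipelines apply this inside a LUT along with
# the S-Gamut3 color-space transform.
import math

def slog3_encode(x: float) -> float:
    """Scene-linear reflection -> S-Log3 code value."""
    if x >= 0.01125000:
        return (420.0 + math.log10((x + 0.01) / (0.18 + 0.01)) * 261.5) / 1023.0
    return (x * (171.2102946929 - 95.0) / 0.01125000 + 95.0) / 1023.0

def slog3_decode(y: float) -> float:
    """S-Log3 code value -> scene-linear reflection (inverse of the above)."""
    if y >= 171.2102946929 / 1023.0:
        return (10.0 ** ((y * 1023.0 - 420.0) / 261.5)) * (0.18 + 0.01) - 0.01
    return (y * 1023.0 - 95.0) * 0.01125000 / (171.2102946929 - 95.0)

# 18% grey encodes to code value 420/1023, roughly 0.41:
print(round(slog3_encode(0.18), 4))  # 0.4106
```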

Q: You mentioned lenses. What do you own?

Jameson: We have two sets of lenses, modified to PL mounts. The first is a set of Cooke Speed Panchros, which we love because it gives us a more painterly feel. The other is a set of Zeiss Contax lenses that have the T coating instead of the MC coating. That tends to give us a sharper image. But these modified lenses can be a challenge for focus pullers and ACs. So on jobs like this, we encourage DPs to rent lenses of their choosing. In fact, DPs are using lenses the way they used to think of emulsion. You would pick a film stock for what it could offer you in terms of grain and ASA and color rendition. I’m finding DPs less concerned about the camera and more concerned about the glass they want to use. It’s the digital form of emulsion, of expression. On this job, Flor chose a set of Zeiss Super Speeds.

“We can also offer our clients stills. It becomes not only a cinema camera but also a useful way to capture frames that can be blown up for print and even billboards.” — Brandon Jameson

“Operation Christmas Drop” included a warehouse set, plus stock footage of Air Force cargo planes.




DP Flor Collins: Keeping it moving

Born in Ireland, director of photography Flor Collins (florcollins.com) now finds himself in Southern California. His credits include commercials for Nike, T-Mobile, BMW and Toyota.

Q: How did you prep?

Collins: I had a meeting with Brandon, who showed me his storyboards. He had some reference stills for the kind of look he was going for. Style-wise, he wanted everything to be on the move slightly. So we talked about that and about composition and shot design. We had one scout day and we were off and running.

Q: What was different for this season?

Collins: I had seen the earlier seasons, and each had a slightly different look. Brandon wanted to keep it fresh. Part of that was the camera movement: subtle but interesting camera moves on each shot. Not distracting, but the person walking, the camera moving, slight drift left-to-right, in or out. It was a tight shooting schedule. You try to maximize your time to make each set as interesting as possible. But again, there’s only so much time for art direction, lighting and everything.

Q: Why did you choose Zeiss Super Speeds?

Collins: Because digital cameras are just so high-resolution these days, sometimes you can get a little too cold and clinical. The old Zeiss Super Speeds give a nice balance with high-end cameras, kind of softening the image. Ultra-modern cameras with somewhat old-school lenses.

Q: What kind of crew did you have?

Collins: Since they were PSAs, it wasn’t a normal commercial crew size. I had a gaffer, a grip and two extra guys. Four guys altogether. But we didn’t have a lot of time or crew or money, considering how much we had to do. There were things I had only limited control over. So I was very pleased with the way it came out.


DP Flor Collins and crew in the backlot at Silver Dream Factory.

“It was getting hotter in the background and there was nothing I could do. Fortunately, the F65 handled it very well.” — Flor Collins

Q: How do you mean?

Collins: For example, the intro to each spot is the spokesperson walking towards camera. That was the shot I was most worried about. Normally for that shot, you’d pick the best time of day and be out of there within an hour. But there were wardrobe changes, there was dialog. So it took over half a day. And because it was a very small backlot set, I could only shoot in one direction. I couldn’t shoot in the ideal direction for sunlight. For a big-budget commercial, I would have put a silk overhead and covered the whole backlot for diffuse sunlight all day long. In this case, I had some diffusion overhead for the spokesperson. But it was getting hotter in the background and there was nothing I could do. I had to live with it. Fortunately, the F65 handled the hot background very well. The highlights rolled off really nicely. That was very confidence-inspiring.

Q: Was that a surprise?

Collins: Yes. It was my first proper job with the F65, and I was really pleased with the natural look and the way the highlights rolled off. With some cameras these days, you have to do a lot of work to get them roughly where you want. But the F65 had a very pleasing image from the get-go. And then in the coloring session it needed so little tweaking. It went really fast because we didn’t have to do an awful lot of work.

“With some modern cameras, I spend more time with the DIT tweaking the skin tones than I would like. This one was effortless.” — Flor Collins

Q: How did you find the color?

Collins: Really pleasant again. With some modern cameras, I spend more time with the DIT tweaking the skin tones than I would like. This one was effortless. Tony Salgado, our DIT, was doing subtle tweaks on-set. But again, it was nothing that slowed me down. So the image I was working with was very pleasing from the get-go.

Q: What did you think of the camera’s 20-megapixel sensor?

Collins: Actually, I’m baffled by the whole resolution thing. You can get nice-looking images without going 4K, 6K. It’s not important to me, unless there’s a specific demand. Post sometimes requires maximum resolution because they want to move the frame around or punch in. In that case, I would listen to post. But for an average job where you’re not doing a lot of punching in afterwards, the resolution is not important to me. I’m more interested in how the colors roll off and what the dynamic range is. I would prioritize that over resolution on any camera.





Q: How did you record?

Collins: We shot RAW Lite. I liked it because it didn’t slow us down at all. Sometimes when you shoot the higher-resolution stuff, you get bogged down with the workflow. I didn’t even notice it. It was invisible to me, which is all I care about. You’re so busy on-set, shooting. So I got the best quality and there was no penalty in terms of my shooting style.

Q: Were there any surprises?

Collins: The electronic viewfinder. For some strange reason, the eyepiece on some cameras tends to cook your eye. Because your eye is jammed in there, it tends to get hot and dry out. Whether subconsciously or not, I found the F65 very easy on my eye. I wasn’t baked by the end of the day.

Q: How would you summarize the F65?

Collins: In the old film days, you never thought about the camera. Just the film stock. Today, you sometimes shoot and you’re more aware of the camera than you’d like to be. The F65 is not one of those cameras. It was invisible. I wasn’t tweaking the camera. I wasn’t hunting for menu items. So I could concentrate on my job: lighting, operating and composition. The camera is part of what makes the image on the screen, but for me, the lighting and the composition are the main things.

“The F65 had a very pleasing image from the get-go.” — Flor Collins



“The eyepiece on some cameras tends to cook your eye,” says DP Flor Collins. “The F65 was very easy on my eye. I wasn’t baked by the end of the day.”




Making the work flow: DIT Tony Salgado

Tony Salgado, Digital Imaging Technician, IA 600, has previously worked as an online editor, video controller, engineer-in-charge and still photographer.

Q: What does a DIT do on a project like this?

Salgado: I was responsible for interfacing between the DP, the production company and editorial to ensure a solid handoff of the technical logistics and media deliverables, as well as maintaining the creative intent of the DP and director.

Q: Was there anything that particularly surprised you about the F65?

Salgado: I’ve worked with practically all the Sony cameras and I’m very familiar with the F65. But I did appreciate the iPad app, which provides remote access to the F65 menus.

Q: And what was your setup?

Salgado: The spots were recorded in F65 RAW Lite onto the SR-R4 and Sony 512 GB cards. The video village was simple on this shoot: one monitor for the director, another monitor for the DP and 1st AC, and finally the color-critical monitor at my DIT cart. I used a 24" monitor and a small overhead tent for on-set color grading, which incorporated the S-Log3 LUT.

Q: How did you back up the memory cards?

Salgado: I downloaded the camera original footage a total of three times: two hard drives (one for editorial, the second for the production company archive) and a third copy onto my workstation RAID for transcoding.
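Offloads like this are normally run through dedicated DIT software (Silverstack, ShotPut Pro and similar), which adds checksum manifests and reporting. Purely to illustrate the copy-and-verify idea behind those tools, and not Salgado’s actual toolchain, here is a minimal Python sketch with hypothetical volume paths:

```python
# Copy-and-verify sketch for a three-destination offload (editorial drive,
# archive drive, transcoding RAID), not Salgado's actual toolchain. Real DIT
# software (Silverstack, ShotPut Pro, etc.) adds MHL manifests and reporting.
import hashlib
import shutil
from pathlib import Path

def sha256(path: Path, chunk: int = 1 << 20) -> str:
    """Hash a file in 1 MB chunks so large camera files don't fill RAM."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def offload(card: Path, destinations: list[Path]) -> None:
    for src in sorted(p for p in card.rglob("*") if p.is_file()):
        ref = sha256(src)                       # hash the camera original once
        for root in destinations:
            dst = root / src.relative_to(card)
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)              # copy, preserving timestamps
            if sha256(dst) != ref:              # re-read and verify every copy
                raise IOError(f"checksum mismatch on {dst}")
        print(f"{src.name}: verified on {len(destinations)} destinations")

# Hypothetical volume names, for illustration:
# offload(Path("/Volumes/SRMEMORY"),
#         [Path("/Volumes/EDITORIAL"), Path("/Volumes/ARCHIVE"),
#          Path("/Volumes/DIT_RAID")])
```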

Q: You did on-set grading?

Salgado: Yes. We used DaVinci Resolve on set and in post production. I chose Resolve based on ease of use, especially the seamless incorporation of the S-Log3 LUTs for the F65. And DaVinci Resolve is my primary choice for on-set transcoding, as it handles many different camera models, resolutions and output codecs.

Q: How did you communicate your on-set corrections to post?

Salgado: I used Pomfort Livegrade to create Color Decision Lists (CDLs), which I then imported into Resolve for final on-set grading and transcoding. I did apply secondary color correction to fine-tune certain shots. I delivered all transcodes with the on-set color grading burned in. I delivered ProRes LT transcodes to Lance, the offline editor.
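A CDL is compact: ten numbers per shot (slope, offset and power for each of R, G and B, plus a single saturation value). Below is a minimal sketch of the standard ASC CDL math that tools like Livegrade and Resolve exchange; the grade values in the example are made up:

```python
# Minimal sketch of the standard ASC CDL transform that tools like Livegrade
# and Resolve exchange: per-channel slope/offset/power, then saturation
# computed around Rec.709 luma. Example values below are made up.
import numpy as np

REC709_LUMA = np.array([0.2126, 0.7152, 0.0722])

def apply_cdl(rgb, slope, offset, power, saturation):
    """rgb: float array shaped (..., 3), nominally in the 0..1 range."""
    rgb = np.asarray(rgb, dtype=np.float64)
    out = rgb * slope + offset                  # slope and offset ...
    out = np.clip(out, 0.0, None) ** power      # ... then power (clamp first)
    luma = (out * REC709_LUMA).sum(axis=-1, keepdims=True)
    return luma + saturation * (out - luma)     # global saturation

# Warm the shot slightly and desaturate a touch (arbitrary demo numbers):
print(apply_cdl([[0.40, 0.42, 0.45]],
                slope=np.array([1.05, 1.00, 0.95]),
                offset=np.array([0.01, 0.00, -0.01]),
                power=np.array([1.00, 1.00, 1.02]),
                saturation=0.9))
```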

“I appreciated the iPad app, which provides remote access to the F65 menus.” — Tony Salgado

Q: Did you follow the job into post?

Salgado: As a former online editor, I always consult with post production to ensure we’re all on the same page. It’s essential to discuss deliverable codec and resolution preferences to avoid any issues.

Q: How did the F65 RAW Lite transfer times compare to your experiences with other top-end digital cinema cameras?

Salgado: Shooting RAW Lite did increase the transfer time slightly compared to shooting Apple ProRes with an ALEXA. But I maintained an on-set protocol of never filling memory cards beyond 50%, to allow a constant stream of ingesting and simultaneous transcoding. In addition, my GPU and Mac Pro are well set up to deal with F65 RAW Lite. So it was all very fast. I also opted to record an audio scratch track directly to camera to speed up the transcode times, a workflow I have used in the past for documentaries with ENG-style camcorders.

Q: How did the camera do?

Salgado: I believe the F65 outperforms other RAW recording cameras. I was so happy to work with a camera whose color reproduction is accurate to the eye. Some other cameras cannot faithfully record the color palette on set. The camera’s S-Gamut will always be my choice for color space.





Quick and easy for editor Lance Tracy

Editor and director Lance Tracy (lancetracy.net) has done powerful, engaging work for the NBC Olympics, Fox Sports and Ace Hardware, in addition to documentaries and narrative films.

Q: The F65 is a top-end camera. How did that affect the edit?

Tracy: Not at all. It was just like any other job. We used Dropbox to bring the material in. Tony Salgado gave me the proxies in Apple ProRes, exactly as I wanted, and I cut in Final Cut Pro 7. We delivered XML files. Pretty standard.

Q: The “Panama Canal” spot featured some Ken Burns-like slow pans across sepia-tone archival photos from 100 years ago.

Tracy: Yes. I did that on my system. Again, very easy.

Q: Wasn’t anything on this job a challenge for you?

Tracy: Not really. Maybe syncing the sound.






Hitting the highlights with colorist Jake Blackstone

Editor and colorist Jake Blackstone (jake-blackstone.com) of MOD Color has done groundbreaking work in remote grading. His credits include music videos and commercials for Yves Saint Laurent, Snickers and Gulf Oil.

Q: Could you describe your workflow?

Blackstone: All the materials came in on a hard drive. I was given an XML, so I imported everything into Resolve, conformed in Resolve, and after that I was able to trim just the pieces used to my hard drive. Even though F65 RAW Lite is compressed, it’s still quite a bit for a regular hard drive to play back in real time. So I trimmed it down, laid it off to my fast drives and then replaced the trimmed pieces with the original pieces. That way I was able to work off the RAW material in Resolve, but just the trimmed pieces.

Q: Who attended the session?

Blackstone: It was a supervised session, a normal grading session with both Brandon (the director) and Flor (the DP). Mostly the director of photography was running the session, saying, “Let’s do this, let’s do that.”

Q: How was working with F65 RAW Lite?

Blackstone: It’s a wonderful combination of RAW and compressed, because the files are still a manageable size. It just amazes me that this camera makes such beautiful images. Really amazing. I would love to work more with the F65. Going in, I was worried about performance. F65 RAW needs to be decompressed and de-Bayered. Decompression is really only done on the CPU; with other RAW formats, if you don’t have many CPU cores, you can’t decompress in real time. De-Bayering is done on GPUs, and I have a mid-level system with a couple of 680s. With other RAW formats, if you want top-end performance, you need two Titan GPUs, which means that if I want to work in real time, I’m forced to work in half- or even quarter-resolution. But F65 RAW Lite was so easy it was amazing. The moment I started using it, it was literally like I was using ALEXA ProRes. Everything was real time. There was absolutely no need for caching anything.
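A quick gloss on de-Bayering for readers outside post: a RAW camera records one color sample per photosite, and reconstructing full RGB at every pixel is the GPU-heavy step Blackstone describes. The F65’s photosite layout is Sony’s own design rather than a standard Bayer mosaic, so the following Python sketch of simple bilinear demosaicing on an ordinary RGGB pattern is a generic illustration only, nothing like Resolve’s actual de-Bayer:

```python
# Generic bilinear de-Bayer of an RGGB mosaic, for illustration only. The
# F65 uses Sony's own non-Bayer photosite layout, and Resolve's GPU
# de-Bayer is far more sophisticated than this normalized convolution.
import numpy as np
from scipy.ndimage import convolve

def bilinear_demosaic(raw: np.ndarray) -> np.ndarray:
    """raw: 2-D float mosaic in RGGB order; returns an (H, W, 3) RGB image."""
    h, w = raw.shape
    masks = np.zeros((3, h, w))
    masks[0, 0::2, 0::2] = 1                    # R photosites
    masks[1, 0::2, 1::2] = 1                    # G photosites (two per block)
    masks[1, 1::2, 0::2] = 1
    masks[2, 1::2, 1::2] = 1                    # B photosites
    kernel = np.array([[1., 2., 1.],
                       [2., 4., 2.],
                       [1., 2., 1.]])
    rgb = np.empty((h, w, 3))
    for c in range(3):                          # fill gaps by weighted average
        rgb[..., c] = (convolve(raw * masks[c], kernel, mode="mirror")
                       / convolve(masks[c], kernel, mode="mirror"))
    return rgb

# demo = bilinear_demosaic(np.random.rand(8, 8))   # stand-in sensor data
```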

“I was worried about performance, but F65 RAW Lite was so easy it was amazing. Everything was real time. There was absolutely no need for caching anything.” — Jake Blackstone

Q: Doesn’t the RED Rocket card overcome these issues?

Blackstone: Sure. But it’s $7,500, something I don’t have to spend on other cameras. It’s a problem for me when you have to buy hardware that’s specifically tied to one camera and that takes up slots. And frankly, I’m out of slots on my computer. Even if I had the $7,500, I’d have no place to stick it.


Q: How did you find grading the material?

Blackstone: It literally surprised me. Dynamic range and highlight protection are always an issue with electronic cameras. There were a lot of shots outside, and there is always concern with the highlights. I was very pleasantly surprised how well the F65 image held up. I could have done more, but sometimes the DP said to let the highlights fall where they are without bringing them back in. It was just really beautiful. The highlight protection had so much latitude. There was absolutely no issue with highlights blowing out. It was just wonderful. The way it retains the highlights reminds me of film. In the old days of film, you could pretty much never blow the highlights out. You just dug in and there they were! The F65 was like that, almost film-like.

Q: Some of the shots were stock. How did you get it to match?

Blackstone: The stock footage was video gamma versus Sony’s S-Log. Normally, if everything were S-Log, I would just apply the S-Log transform to the whole timeline. Because this was mixed-gamma material, I had to apply it to each Sony shot individually rather than globally. It didn’t take that long. It’s not that big a deal.

“In the old days of film you could pretty much never blow the highlights out. You just dug in and there they were! The F65 was like that, almost film-like.” — Jake Blackstone

Q: What was your biggest challenge?

Blackstone: Maybe some metadata issues with Resolve. To be honest, it was a really easy job. It was beautifully photographed. It’s always interesting: on the higher-end jobs, you get material that is so beautifully shot that it’s 90% there. Sometimes I just adjust a little bit of white balance and contrast. It’s like, “Wow, I get paid for something that doesn’t need that much work.” This was definitely one of those jobs.

The crew went on location to the USS Iowa in Long Beach.



sony.com/35mm

©2015 Sony Electronics Inc. All rights reserved. Reproduction in whole or in part without written permission is prohibited. Features, design, and specifications are subject to change without notice. The values for mass and dimension are approximate. Sony, CineAlta, SxS, XAVC, XDCAM, and the Sony logo are trademarks of Sony Corporation. All other trademarks are the property of their respective owners.


