Camera & Lens Repair in Los Angeles
Industry Leading Facilities
Class 100 Clean Room
2 Full Frame Lens Test Projection Rooms
MYT Works Opti-Glide System + PAT Charts
T-Stop Evaluation System
Lens Collimator
MTF Machine
New & Vintage Models
Sony Authorized Warranty Service Center
Angénieux Factory Authorized Service Center
Zeiss, Cooke, Leitz, Sigma, and more
TAILORED FOR BOUNDLESS CREATIVITY
The new EOS C400 is the ideal fusion of functionality and form, seamlessly integrating into almost any production with its robust feature set and compact design.
Innovation, Versatility & Imaging Excellence
• 6K Full-Frame Back-illuminated stacked CMOS sensor
• Triple-base ISO (800, 3200, 12,800)
• Improved autofocus with Dual Pixel CMOS AF II
• Variety of interfaces, including Timecode and Genlock
• Built-in Mechanical ND Filters
PRESIDENT’S LETTER
BUILT TO LAST
Last Thursday, one week before I started writing this letter, Local 600 – along with all the other IATSE craft unions signed onto the Hollywood Basic Agreement and the Area Standards Agreement – voted overwhelmingly to ratify the 2024-2027 contracts with the Alliance of Motion Picture and Television Producers (AMPTP), the trade group representing producers, studios and streaming services.
The numbers were beyond impressive: more than 88 percent of ICG’s voting members cast their votes in favor of ratifying the Hollywood Basic and Videotape Agreements (with 71.3 percent of eligible Guild members voting), while 85.9 percent of the voting members of the other IATSE West Coast Studio Locals voted to ratify the Hollywood Basic, and 87.2 percent of IATSE’s voting members approved the Area Standards Agreement. These are historically high figures, and the contracts represent tremendous gains in many areas, including wage increases and quality-of-life enhancements, both of which are key to sustaining a long and healthy career in this industry.
Personally, I want to highlight new language in the Basic Agreement that requires productions that work beyond a 14-hour day to ask crewmembers if they want a “ride or a room,” and then to pay for it. This is one of several “guard rails” (as former ICG President John Lindley, ASC, described them) for our members working unsafe hours. Another “guard rail” this new contract provides is triple-time payment after 15 hours, which we hope producers won’t need to put into practice, both because of the obvious safety hazards and the cost to their bottom lines. I urge all members to read the Memorandum of Agreement (MOA) for the Hollywood Basic and Videotape Agreement, as well as the Comparison of Gains in both agreements. They can be found on your MY600/Negotiation Page.
This 2024-2027 contract also saw historic gains for our publicist members (Local 600 Publicists Agreement MOA), who added pension and health-care coverage in ten more states, bringing the total to more than 20 – hopefully, in future contracts, the entire country will be covered for our publicist members. Speaking of the future, it’s important to emphasize that the gains in this contract were built upon the victories in contracts past. That means the achievements of 2024 will be built upon even further in 2027, a contract negotiation for which this union (and the entire IATSE) is already beginning to strategize. As IATSE International President Matthew D. Loeb announced: “The gains secured in these contracts mark a significant step forward for America’s film and TV industry and its workers. This result shows our members agree, and now we must build on what these negotiations achieved.”
Building blocks of financial wellness, health and welfare, safety on set, and quality of life don’t just happen. They take an incredible team of union professionals working with complete dedication for a very long time. That includes Lead Negotiator and ICG National Executive Director Alex Tonisson; Local 600’s Bargaining/Negotiation teams and ICG’s incredible staff, who met more than 25 times over 16 months; the ICG Communications department, which went above and beyond to keep this membership informed and connected before, during and after the negotiation and ratification process; and, finally, our Contract Action Team (CAT).
With this recent contract ratification, I can say that I am more proud than ever to call myself a member of a union that is – most definitely – “built to last.”
Now and forever.
PUBLISHER
Teresa Muñoz
EXECUTIVE EDITOR
David Geffner
ART DIRECTOR
Wes Driver
STAFF WRITER
Pauline Rogers
COMMUNICATIONS COORDINATOR
Tyler Bourdeau
COPY EDITORS
Peter Bonilla
Maureen Kingsley
CONTRIBUTORS
Ted Elrick
Margot Lester
David Geffner
Derek Stettler
COMMUNICATIONS COMMITTEE
John Lindley, ASC, Co-Chair
Chris Silano, Co-Chair
CIRCULATION OFFICE
7755 Sunset Boulevard
Hollywood, CA 90046
Tel: (323) 876-0160
Fax: (323) 878-1180
Email: circulation@icgmagazine.com
ADVERTISING REPRESENTATIVES
WEST COAST & CANADA
Rombeau, Inc.
Sharon Rombeau
Tel: (818) 762-6020
Fax: (818) 760-0860
Email: sharonrombeau@gmail.com
EAST COAST, EUROPE, & ASIA
Alan Braden, Inc.
Alan Braden
Tel: (818) 850-9398
Instagram/Twitter/Facebook: @theicgmag
IATSE Local 600
NATIONAL PRESIDENT
Baird B. Steptoe
VICE PRESIDENT
Chris Silano
1ST NATIONAL VICE PRESIDENT
Deborah Lipman
2ND NATIONAL VICE PRESIDENT
Mark H. Weingartner
NATIONAL SECRETARY-TREASURER
Stephen Wong
NATIONAL ASSISTANT SECRETARY-TREASURER
Jamie Silverstein
NATIONAL SERGEANT-AT-ARMS
Betsy Peoples
NATIONAL EXECUTIVE DIRECTOR
Alex Tonisson
ADVERTISING POLICY: Readers should not assume that any products or services advertised in International Cinematographers Guild Magazine are endorsed by the International Cinematographers Guild. Although the Editorial staff adheres to standard industry practices in requiring advertisers to be “truthful and forthright,” there has been no extensive screening process by either International Cinematographers Guild Magazine or the International Cinematographers Guild.
EDITORIAL POLICY: The International Cinematographers Guild neither implicitly nor explicitly endorses opinions or political statements expressed in International Cinematographers Guild Magazine. ICG Magazine considers unsolicited material via email only, provided all submissions are within current Contributor Guideline standards. All published material is subject to editing for length, style and content, with inclusion at the discretion of the Executive Editor and Art Director. Local 600, International Cinematographers Guild, retains all ancillary and expressed rights of content and photos published in ICG Magazine and icgmagazine.com, subject to any negotiated prior arrangement. ICG Magazine regrets that it cannot publish letters to the editor.
ICG (ISSN 1527-6007)
Ten issues published annually by The International Cinematographers Guild, 7755 Sunset Boulevard, Hollywood, CA 90046, U.S.A. Periodical postage paid at Los Angeles, California.
POSTMASTER: Send address changes to ICG, 7755 Sunset Boulevard, Hollywood, California 90046
Copyright 2024, by Local 600, International Alliance of Theatrical Stage Employes, Moving Picture Technicians, Artists and Allied Crafts of the United States and Canada. Entered as Periodical matter, September 30, 1930, at the Post Office at Los Angeles, California, under the act of March 3, 1879. Subscriptions: $88.00 of each International Cinematographers Guild member’s annual dues is allocated for an annual subscription to International Cinematographers Guild Magazine. Non-members may purchase an annual subscription for $48.00 (U.S.), $82.00 (Foreign and Canada) surface mail and $117.00 air mail per year. Single Copy: $4.95
Email: alanbradenmedia@gmail.com
August 2024 vol. 95 no. 06
The International Cinematographers Guild Magazine has been published monthly since 1929. International Cinematographers Guild Magazine is a registered trademark. www.icgmagazine.com www.icg600.com
OUTSTANDING DRAMA SERIES
CINEMATOGRAPHY FOR A SERIES (ONE HOUR)
Adriano Goldman, ASC, BSC, ABC “Sleep, Dearie Sleep”
Sophia Olsson, FSF “Ritz”
“THE GOLD STANDARD FOR HISTORICAL DRAMA. It will be hard for another series to stand in its shadow.”
OBSERVER
WIDE ANGLE
As the editor of this magazine, I can hardly offer an unbiased opinion – basically, every issue we do is fantastic! But I have to confess to some added love for our annual Interview issue, mostly because we highlight an amazing range of voices across the entertainment industry. This year was no exception, as I had the pleasure of “Zooming in” with all ten subjects, who shared their insights around the theme of “Future Tech.” The section starts with a trio of directors of photography from three different generations, each of whom has done ground-breaking work in their respective journeys, and all of whom share a love for combining traditional filmmaking with the latest in leading-edge tech in ways that will surprise many readers.
Let’s start with Greig Fraser, ASC, ACS. Few would dispute that the Oscar-winning Aussie ushered in Hollywood’s current era of LED Volume storytelling with his Emmy-winning work on Season 1 of The Mandalorian (and even before that with Rogue One: A Star Wars Story). And even though many felt Fraser took the Volume to greater heights on The Batman, when I asked about Dune: Part Two, the highest-grossing live-action film of 2024 – so far – Fraser said he put the Volume’s game-engine technology into service for a much more old-school aspect of filmmaking.
“What we could do with Unreal [Engine] was to scan the rock formations in Jordan with a drone, make them into a 3D model, put them in the right place on the planet, and then type in the day we wanted to shoot,” Fraser told me. “What we got back was the exact time a location is in shade and/or sun.” Translation: Fraser used the most leading-edge digital tech to facilitate a make-or-break physical schedule that included an opening scene with 14 different real-world locations.
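The sun/shade lookup Fraser describes is, at bottom, solar-position math: given a latitude, longitude, and date, compute where the sun sits in the sky, then test that direction against the 3D terrain. As a rough illustration only – this is not Fraser’s actual Unreal pipeline, and the function name and low-precision formulas are our own simplified choices – the elevation calculation might be sketched like this:

```python
import math
from datetime import datetime, timezone

def solar_elevation(lat_deg, lon_deg, when_utc):
    """Approximate solar elevation (degrees) for a location and UTC time.

    Uses low-precision formulas: Cooper's declination approximation, and
    solar time taken as UTC plus the longitude offset (equation of time
    ignored). Accurate to roughly a degree -- enough to tell shade from sun.
    """
    doy = when_utc.timetuple().tm_yday
    # Solar declination swings +/-23.45 degrees over the year
    decl = math.radians(23.45) * math.sin(math.radians(360.0 * (284 + doy) / 365.0))
    # Hour angle: 15 degrees per hour away from local solar noon
    solar_hours = when_utc.hour + when_utc.minute / 60.0 + lon_deg / 15.0
    hour_angle = math.radians(15.0 * (solar_hours - 12.0))
    lat = math.radians(lat_deg)
    sin_elev = (math.sin(lat) * math.sin(decl)
                + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    return math.degrees(math.asin(sin_elev))

# Sanity check: on the equator at the equinox, the sun is nearly overhead
# at solar noon and nearly at the nadir at midnight.
noon = solar_elevation(0.0, 0.0, datetime(2024, 3, 21, 12, 0, tzinfo=timezone.utc))
midnight = solar_elevation(0.0, 0.0, datetime(2024, 3, 21, 0, 0, tzinfo=timezone.utc))
```

In a production tool, this sun direction would then be ray-traced against the scanned rock geometry to flag which locations fall into shadow at a given hour – the part Unreal’s renderer handles automatically.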
Alice Brooks, ASC, is a decade-plus behind Fraser in experience, but she’s no less resourceful when it comes to putting technology in the service of a story. For the upcoming feature Wicked (whose trailer alone would win an Oscar, if there were one), Brooks went searching for a way to make Director Jon M. Chu’s Oz different from any
other. She found it courtesy of Panavision lens alchemist Dan Sasaki, who built, in Brooks’ words, “new/old anamorphic lenses, used once on our film, and that’s it.”
Sasaki’s one-of-a-kind glass ultimately did morph into Panavision’s Ultra Panatar IIs (which Brooks has since used on another project), but they’re markedly different from the custom lenses used on Wicked. Brooks’ love for combining old/new also extended to her approach to lighting. “We used a lot of gelled Tungsten on Wicked,” she adds. “But because the sets were insanely big and complex, we used Unreal Engine to figure out what type of light would fit where, before everything was built.”
Speaking of virtual world-building, when it comes to an Interview section themed around “Future Tech,” the acronym AI can’t help but come up. Thankfully, according to two of the industry’s most respected technologists – SMPTE President Renard Jenkins and Mariana Acuña Acosta, senior vice president of Global Virtual Production and On-Set Services for Technicolor and a woman who’s now on her fourth start-up company – computers won’t be replacing humans any time soon. Acosta reminded me that machine learning and artificial intelligence have been embedded in the VFX world for more than a decade. “There’s a feeling that AI is a brand-new technology,” she said, “but nothing that’s being done today would be possible without what came before. Just like the invention of the camera, the ATM, or the Internet of Things, where your fridge reminds you to go buy strawberries – none of these technologies has replaced human creativity and neither will AI.”
Jenkins, a longtime “Trekkie” with a soft spot for great cinematography, was even more bullish on humans as evergreen creators. “It may be easy – with the various generative AI tools available now,” the D.C.-based technologist told me, “to create still images that are fantasy based. But as far as producing narrative content, with real people, in the real world, what the cinematographer and his or her camera crew do? There’s no way that’s going away.”
And just to heap on one more (very biased) opinion: “We at ICG Magazine heartily agree!”
David Geffner Executive Editor Email: david@icgmagazine.com
FILM & TV SPORTING
Masterpiece’s Entertainment Division understands the complex logistics of the industry, including seemingly impossible deadlines, unconventional expedited routings, and show personnel who must be on hand. Masterpiece has the depth of resources to help you make sure everything comes together in time, every time.
Our range of services includes:
• Customs Brokers
• Freight Forwarders
• ATA Carnet Prep & Delivery
• Border Crossings
• Domestic Truck Transport
• International Transport via Air or Ocean
• Expedited Shipping Services
• 24/7 Service Available
• Cargo & Transit Insurance
• Custom Packing & Crating
FESTIVALS
$12,838
“What makes the DJI Ronin 4D special begins with having remote wireless control of nine stops of ND plus all the other camera parameters from kilometers away –and that all of these features fit inside a camera head smaller than two clenched fists,” notes Rodney Charters, ASC. “Its compact size allows for working in tight spaces and up close with actors – all in luscious 8K.” Launched last December, the all-in-one camera integrates DJI’s revolutionary four-axis stabilization. It delivers excellent image quality in complex lighting environments – as seen in recent projects including Civil War, Incision, and Wicked City – via full-frame 8K/60-fps and 4K/120-fps capabilities and DJI Cinema Color Science. The camera includes DL/E/L/PL/M interchangeable lens mounts, autofocus on manual lenses, and automated manual focus (AMF) with a LiDAR focusing system. As Charters adds: “Not since my first film with my father’s 16-millimeter Bolex and a 10-millimeter Switar have I had so much fun operating!”
“ONE OF THE BEST-LOOKING SHOWS EVER MADE. ”
LAOWA RANGER FULL FRAME & S35 SERIES
$2,999-$3,499 RANGER FF LENS
$1,999-$2,499 RANGER S35
WWW.LAOWACINE.COM
Get a more extensive total zoom ratio with a larger aperture. The Ranger FF Compact Cine Zoom comes in 16-30-mm T2.9, 28-75-mm T2.9 and 75-180-mm T2.9 lenses with more than 11× zoom ratio in total. The lightweight and compact lenses are available in PL and wireless mounts. They deliver excellent neutral color rendition on skin tone and high image sharpness at both ends of the spectrum. They also have a built-in back-focus-adjustment system. The 3-lens Ranger S35 Compact Cine Zoom Series (11-18-mm T2.9, 17-50-mm T2.9 and 50-130-mm T2.9) delivers a lot of focal range at featherlight weight (750-850 g per standard lens and 650 g for the Lite lenses). “The Ranger T2.9 full-frame zoom lenses were indispensable during the filming of Wong Kar-Wai’s TV series Blossoms Shanghai,” says Peter Pau Tak-Hei, BBS, cinematographer, director, and Oscar-winning DP of Crouching Tiger, Hidden Dragon. “The 28-75-millimeter lens, with its exceptional detail, minimal breathing, and balanced contrast, quickly became my top choice for zoom lenses!”
It’s time to start using your head! This helmet cam is purpose-built for extreme, mobile video capture and real-time streaming. The patented base is helmet-contoured for a solid fit, and the housing is low-profile. The compact, lightweight combo reduces snag hazards and increases mobility. You can mount it on multiple helmet types or deploy it as a remote video camera in vehicles, on robotics, and elsewhere. The product maximizes data throughput for uninterrupted footage but is easy to operate. Its intuitive web interface empowers you to customize everything from image quality to camera settings to video capture parameters. The new generation’s onboard sensors deliver 20× improved low-light performance over their predecessors, with customizable video output streams (2× H.264; 1× MJPEG). MOHOC and Silvus designed it to be plug-and-play-compatible with StreamCaster 4000-series MANET radios.
TECH. TOOLS. PRODUCTS. PEOPLE.
Join thousands in broadcast, media and entertainment at NAB Show New York, where infinite product discovery, networking and knowledge are within your reach. Ask questions. Get answers. Make connections. The latest innovations await on a show floor full of new-to-market tech and time-saving digital tools. Plus, access to amazing people and the most pivotal trends and topics you need to be in on now… AI, the creator economy, sports, photography, virtual production, FAST and more!
Invest in yourself through these incredibly in-depth conferences:
• Local TV Strategies
• Post|Production World New York
• Radio + Podcasting Interactive Forum
EXHIBITS: OCTOBER 9–10, 2024
EDUCATION: OCTOBER 8–10
JAVITS CENTER | NEW YORK, NY
The Middle MAX™ Menace extends your booming capabilities. Based on the same design principles as the Academy Technical Award-winning MAX Menace Arm, the Middle MAX has a maximum height of 18 ft/5.49 m and can go to 6 ft/1.83 m below grade. With a payload capacity of 150 lb/68 kg (80 lb/36 kg when fully extended), it’s perfect for supporting lights, reflectors, cameras or set dressing. Its self-leveling head with baby pin and junior receiver features three lock-off or fixed positions. The rig rides on heavy-duty 10-in. × 3-in. puncture-proof tires. Fully assembled, this easy-to-use support even fits inside a standard cargo van. “I am amazed by the new design – the size and ease of movement make Middle MAX a must-have item for every production,” says Local 80 Key Grip David Donoho. “When time is limited on set, you really appreciate the value of high-quality equipment.”
KEVIN COSTNER
DIRECTOR / PRODUCER / ACTOR
HORIZON: AN AMERICAN SAGA
BY PAULINE ROGERS
SANDY MORRIS / APPLE TV+
It feels like symmetry that Oscar-, Emmy-, and Golden Globe-winning filmmaker Kevin Costner’s most ambitious project to date – the four-part Horizon: An American Saga, with Chapters 1 and 2 releasing this summer from Warner Bros. – is a Western. After all, it was nearly 40 years ago that Costner first made his mark in Hollywood in the Lawrence Kasdan-directed Silverado, a film that many viewed as an attempt to revive a beloved cinematic genre that had, after its TV heyday in the 1960s, ridden off into the sunset.
Much like earlier Costner projects (most notably Dances With Wolves and Open Range), Horizon is as much an ode to the strength and toughness of the American West as it is to the men and women who settled there. It’s also yet another massive gamble by a filmmaker willing to put everything on the line to bring a story to audiences he feels compelled to tell, irrespective of its inherent commerciality.
Horizon’s box office fate aside, few can question Costner’s zeal for American history. His portrayal of the real-life Civil War veteran William Anderson “Devil Anse” Hatfield in the 2012 miniseries Hatfields & McCoys broke cable TV viewership records and earned Costner a television trifecta – Emmy, Golden Globe and SAG awards. Six years later, the actor/writer/director/producer kicked off the long-running Paramount Network franchise Yellowstone, starring as John Dutton III, the owner of Montana’s largest ranch. While Yellowstone centers on the family drama within the wealthy ranching clan, as well as the chipping away of Western and Native American lands by developers, Horizon, shot by James Muro (who operated Steadicam on past Costner hits Dances With Wolves and Field of Dreams), is about a specific period in the American West – pre- and post-Civil War – and the (often violent) struggles the emerging pioneers faced.
ICG Magazine: There was a lot to overcome in bringing Horizon to life, and it’s inspiring to see this story materialized. What was the genesis of this project? Kevin Costner: I commissioned a screenplay in 1988. That was another century if you think about it. I liked it very much. It was original, but still a recognizable Western story and I wanted to do it. But ultimately, when I tried to get it going in 2003, the studio wouldn’t do it for the budget we needed. I was baffled by that because I had just made a movie called Open Range that had done quite well for them. But at the end of the day, they didn’t want to do it. So, I went on to do other movies. And yet, I always loved the story. So, after some time, I thought maybe I would do it differently than what was originally intended. Maybe I would make a movie about how towns begin and show the drama of that, because all Westerns start with people already having a town, automatically. But we never see the history of a town that was fought for, that people died over; these towns didn’t just magically appear. That was something I thought was missing in [other Western] stories. With my writing partner, Jon Baird, we decided to
“I THINK THE IMAGERY JIMMY [MURO] GAVE US IN THESE MOVIES WILL LIVE FOREVER.”
develop that idea, which you see in the opening of Chapter 1 – somebody putting their stake in the ground. But even after I finished those four screenplays, which took about four years, I finally had to look to myself financially to get them made, which I’ve had to do before. And that hasn’t always been the most comfortable way to go. But, to serve the vision I’ve had in my head, I’m willing to do it.
Putting your own money on a project, betting so much on it, is such a strong investment in your vision. Does it feel risky? I take the risk because I believe in an audience’s ability to see this as something highly original, something that goes on – there are three more coming – and something they can look forward to. That’s why I was willing to risk this. I believe that people want to see fresh things that mean something. That matter. That have empathy. That have danger. That have excitement. That have love and tragedy.
Given the state of television and the way it’s been going, what was behind the decision to release the story as multiple films in theaters, rather than as a television series? I just believe that some movies have a real life and that when we see them on the big screen, we’ll never forget them. And this story felt like one of those.
What is it about the Western genre that appeals so much to you, personally? Why do you think it’s captured the audience’s imagination more than any other era? I think it starts when you realize that the West was not just a land in Disneyland – it was real. It was a 300-year struggle. People from Europe were crossing an ocean, and they found that if they just kept going, they could find new lands. And if they were tough enough, or if they were mean
enough, they could control those lands. And America was formed on that basis, that if you were strong enough, you could take what you found and hold on to what you had. Not so much in Europe, where you were a second-class citizen, you lived under monarchies. America was like a promise, a promise that these pioneers could become kings themselves. And, ultimately, what we do know is that there were already people who had settled the land, the Native Americans, thousands of years before, and who paid a terrible price for our national appetite. So there’s this march across America, which was very much like the Garden of Eden before people came. The land was wholly natural, filled with animals and Native peoples, who all lived very lightly on the ground.
As an actor yourself, what qualities do you look for in the cast to evoke or match the setting and era of the story? Sometimes I’ll work with first-time actors. Sometimes I want a more seasoned actor because I feel like it’s going to require that. So I tend to mix my casts up. I’m very proud of who we had working on Horizon. There are some actresses in there that are going to be around for a long time. You probably noticed how heavy this Western leans on women. Entire plots revolve around women, so they’re not just tossed into cliché roles. Most Westerns feel like clichés. There are good guys and bad guys, but I know that the West was made up of all types of people. And those that were bad weren’t always stupid and dumb. They were conniving and mean. And that’s why the West was so incredibly dangerous.
Sounds like perfect fodder for complex villains. There was no law, at all, and that’s hard for people to conceive. It’s the mistake we make about Westerns. With no structured form of law, there was no one to tell you
what to do. If someone wanted what you had, there’s no one there to stop that. Imagine that you were my character in the scene on the hill and you realize this guy is going to challenge you. He’s either going to take what you have, he’s going to kill you, or he’s going to humiliate you. He’s going to do whatever he wants. And there was no police, no sheriff, no military around to stop him. To me, that’s very interesting to see on-screen.
How do you feel Horizon contributes to or diverges from traditional Western imagery? What are you proud of when you think of the film’s visuals? We tried to go to places that no one’s ever seen before; we tried to go to authentic places where great issues were solved. For example, crossing a river was really important 200 years ago! It’s not
so important now, you can ride a bike across these tremendous bridges that we’ve made, so we don’t apply our sensibilities to what life was like 200 years ago in the West. But we tried to show how it really was. One of the approaches we took was to keep the camera at horseback level, for the most part. I tried to keep the camerawork grounded. There are not a lot of straight overhead shots of something happening down underneath you, and not a lot of drone shots.
You go pretty far back with Horizon’s director of photography, Jimmy Muro. We absolutely do. It took me 106 days to shoot Dances with Wolves, and I shot Horizon: Chapter 1 in 52 days with Jimmy, who was a Steadicam operator on Dances with Wolves and Field of Dreams. I got Jimmy because I knew we wouldn’t be able to
wait for the [natural] light all the time, and we wouldn’t have all the bells and whistles, and that he would deliver despite all that. I’m sure there were days Jimmy ached to wait for the light to be at a certain place. But I said the story is the star, and asked him to bring as much magic as he could to it. What he managed to do, I thought was magnificent. Trying to make things interesting in midday light was one of our biggest visual challenges. And although it was unfair to Jimmy – or any cinematographer with that kind of canvas to work in – he pulled it off. I loved how we accomplished the battle. I was willing to let that battle go on for 35 minutes, but it was a living hell. And everybody went with me. The story is the star. And Jimmy embraced that and still managed to dazzle. I think his work in these movies, the imagery he gave us, will live forever.
IN THE
Dan Mindel, ASC, BSC, SASC, flies into a weather vortex for Universal’s disaster-flick update, Twisters.
BY KEVIN H. MARTIN
PHOTOS BY MELINDA SUE GORDON, SMPSP / UNIVERSAL PICTURES
“Everybody talks about the weather but nobody ever does anything about it.”
Roughly a century after Charles Dudley Warner, American essayist and novelist (and writing partner of Mark Twain), first voiced this pearl of wisdom, Hollywood provided the answer in the form of the 1996 disaster film Twister. In that fun popcorn flick, a group of tornado chasers manages to survive various encounters with nature’s most dangerous cyclones. The thrill-seekers release an array of weather sensor instruments into the core of an F-5 tornado, amassing a wealth of data on the 300-mph phenomenon that will presumably aid scientific understanding and the devising of lifesaving early-warning systems.
Even with Twister’s big theatrical grosses and a healthy tenure on home video (it was the first feature ever issued on DVD), the film’s only official follow-up was a long-running thrill ride at Universal Studios. But in recent years, filmmaker Joseph Kosinski developed a new storyline that combined the requisite thrills with doses of science and pop culture – the latter in the form of a social media icon [Glen Powell] who documents his storm-chasing online.
Director Lee Isaac Chung, having scored a critical success with the 2020 Sundance hit Minari [ICG Magazine April 2020], decided to return to his Middle American roots to shoot Twisters on celluloid. “I learned on film in school,” Chung notes. “My first feature was shot on Super 16 millimeter, my second on 35 millimeter. So my choice of Dan Mindel as DP was specifically with this in mind, as Dan’s full commitment to shooting film helped with selling the studio. Also, we connected well together on our visual approach. Even though it’s a big action movie, this isn’t set in another galaxy and doesn’t feature superheroes. As much as was possible, we wanted to go into the real world to make this picture.”
For Dan Mindel, ASC, BSC, SASC, the director’s in-camera approach was a major selling point. “Between you, me, and the whole of Southern California, being outside from dawn to dusk is probably what I like most about my job,” Mindel shares. “One thing that enamored me right off was Isaac’s idea of shooting on location in Oklahoma and doing it in a nuts-and-bolts fashion, working from camera, grip and lighting trucks on the road.”
A-Camera Operator Geoff Haley was equally impressed with the old-school approach. “I am often pitched projects with the idea that we’re going to do things practically, without relying heavily on blue screen,” Haley muses. “But it doesn’t always turn out that way. So, I went into the project with some doubts. However, once I got to the location after prep, it was clear that everybody, from Isaac and the producers on down, was totally supportive of getting things done in-camera.”
Haley adds that there was a level of excitement over doing a movie out in the real elements, especially within the camera department. “Of course, as an operator,” he’s quick to add, “you wonder what horrific environments you’ll be placed in – I had visions of 100-mile-per-hour fans and ice machines pelting my face. I prepared myself
for that aspect plus the excitement of getting to shoot on film again at a time when 80 to 90 percent of my work is digital. We’ve shot film for a century, but the industry has a bit of amnesia about the medium. Some of the crew had never shot film at all. The cadence, rhythm and momentum associated with shooting film are different from what goes on when shooting digitally. One thing I love about film is that when you roll, everybody has focus and knows to pay attention because film is expensive.”
Mindel had suggested a vintage photojournalism approach to shooting on location. “Dan and I were trying to create painterly images that hearken back to vintage pictures we liked showing the life and culture in Oklahoma,” relays Chung. “There’s an element of nostalgia in what we were going for visually, and I think Dan and [Chief Lighting Technician] John Vecchio did a great job to achieve those scenes impressionistically. They use the strength of celluloid; Dan truly paints with light, and it isn’t just for the actors, but the light that hits within the frame and the highlights that roll off.”
As was the case on the original film, Twisters required a careful mix of visual and practical effects. Heading up the latter was SFX Supervisor Scott R. Fisher, who says he studied real tornado footage, “and the level of carnage is unbelievable. When you work on a disaster film, you rely on reference as a starting point, but then bring as many tools to the table as you can, because it is crazy what you’re trying to replicate. You need immense scale while making everything as visceral as possible in the foreground up to 40 feet away.”
“You can’t just say up front that ten fans are going to be enough,” he continues. “Our particulates were very high-velocity, sometimes propelled by jet engines that helped us achieve a 300-mile-per-hour look. It was a layered approach, with the area nearest the actors using our big fans from fifteen feet away, while further back we relied on the jet engines to cover the expanse.”
Another filmmaker instrumental in delivering the visual calamity was ILM VFX Supervisor Ben Snow, who worked as a digital artist on the original film. “Back then it was early days for computer graphics, and we didn’t have the greatest toolset to work with,” Snow recalls. “A few years later, I wished we could have done Twister again to take advantage of all the advances in CGI.
So to have several decades of progress to support this film was a dream come true.”
Snow says the original film used a particle system, “which is when P-render [an early volumetric approach for CGI] was developed,” he explains. “My role on that film was developing surface-based tornadoes and cloudscapes that were essentially geometry meshes with fractal shaders applied. Now we can use high-resolution fluid-dynamics simulations – plus terabytes of space! And one nice thing with this approach is that sims do sometimes generate something unexpected but useful – the digital equivalent of a happy accident on set.”
During prep, producers sent out a video crew and stills team to document real weather events in “Tornado Alley.” That yielded high-resolution skies to use for plate photography “and footage of some phenomenal cloud patterns for our library,” Snow marvels. “Oklahoma skies are so dynamic that they influenced our approach. One sequence was intended to be gloomy and overcast, but the plate shoot got us these amazing, blue-dappled skies. Leaning into that, we could maximize our use of the real environment in the live action.”
Chung himself drove much of the look, having conceived a clear idea for depicting the film’s various nemeses conjured by Mother Nature. “I knew each tornado needed its own look and personality,” he recalls. “Otherwise, it could end up feeling repetitive, seeing the same things happen six or seven times. So, each time our actors encountered a tornado, there would be a different context. That meant figuring out what each tornado did story-wise, with each action beat delivering a different portrait.”
Camera prep took place at Panavision Woodland Hills. First AC Sergius Tariq X Nafa reports that “anamorphic supplies are in such a critical state right now worldwide that getting matched sets doesn’t have the same meaning as it did back in the 1990s and 2000s. It’s more important that we get focal lengths that can produce a quality image regardless of series. We generally roll with the large and lavish Primo AO anamorphics [set for film]. For scenes that were shot inside vehicles, handheld and Steadicam, we opted for T Series and occasionally the custom C Series retro Panatars, a Mindel concept brought to life by Dan Sasaki back in 2014. Zoom-wise, the ATZ2 and the 11:1 Anamorphic Primo handled the telephoto shots.
“Aside from the skillset necessary to pull focus on film,” Nafa adds, “my main
teammate, B-Camera 1st AC Andrae Crawford, and I were acutely aware of the elevated role protecting cameras and equipment would play in our duties to the show. We tested rain deflectors from Schultz, the Eliminator, and the Prodigy air blower from Bright Tangerine. Ultimately, each system had instances in which it was the preferred method of keeping our cameras functioning smoothly. Speaking of which, we predominantly used Panavision’s Millennium XL2s [including the all-black Death Star and Millennium Falcon variants used by Mindel on The Force Awakens] as well as a Panavised 435 for high speed.”
During location scouting, Production found a basketball arena (the former home of the NBA’s Oklahoma City Thunder) to serve as a makeshift stage in case of unsatisfactory weather and for process work. “That was large enough for our needs,” Mindel reveals, “because we needed a very long throw for our traditional lights. We also found an unused carpark nearby that worked out pretty well.”
Chief Lighting Technician Vecchio participated in the scout and had sidebar meetings with Mindel. As Vecchio recounts:
“We were armed with a large stack of computer-generated drawings from the art department to plot things out. I’d then turn the drawings with my notes over to my board operator so they could be put into Vectorworks. Even with plots noting what we’d need on any given shoot, I kept a 40-foot truck full of lights in reserve, as you never know what might be needed when dealing with the unexpected. [Executive producer] Tom Hayslip and all of those guys at Universal were great at giving us whatever we needed.”
LED elements used included two- and four-foot Astera tubes to embellish a variety of building exteriors, along with many older HMI and Tungsten units to deliver a look Mindel felt newer systems could not achieve. “It was important to Dan that when lighting the principal performers, we use traditional tungsten or HMI units with a Fresnel,” Vecchio states. “Molebeams, 10K’s and Maxibrutes are old-school tech that Dan and I both love. When we were towing the lead actors in these trucks, we’d have gelled 10Ks and Minibrutes lighting them from outside. The way you place the light has to do with the shadow and fall-off, and Dan can perceive the special quality of a tungsten
light going through a Fresnel onto a sheet of diffusion before it falls onto the actor’s face. It’s the difference between illumination and lighting. But, it’s also a commitment that requires heavy generators. When you see the previous day’s footage in dailies, you see clearly what a difference that choice makes on the actors’ faces.”
“I am very reticent to light movie stars with LED’s,” Mindel adds. “My rule of thumb is that you use movie-star lighting on stars when shooting close-ups and dialog with them, as the balance of color can impact how a person looks. Using available light might sound like a sexy option, but it is not what you do with movie stars.”
Scenes following the heroes in vehicles as they pursued tornadoes were often captured using an Allan Padelford Camera Cars MTV insert truck, equipped with a 30-foot Technocrane. Traveling ahead of the car carrying the actors was an SFX flatbed, which Fisher equipped with a wind/rain/weather rig that allowed him to blast the windshield and sides of the vehicle.
“We had a lot to deliver while coordinating with other departments and being careful not to complicate things for VFX,” Fisher shares. “We destroyed so much
of the foreground with our rubber debris that it was hard seeing through to the background in some cases. Testing early helped us nail things down; when [Chung] saw where we were going, he realized there would be these huge value gains for enhancing actor performance, as when they would have to lean in against the wind and react to being hit by stuff in the air.”
Haley, who was sometimes attached to the outside of a speeding vehicle, recalls seeing the SFX people “essentially getting ready for battle,” he smiles. “They’ve got their howitzers and catapults that are going to assault the operators and the actors, which can be daunting! It was always safe, but that didn’t mean it was without pain, because you get pelted with a lot of stuff. It’s an adrenaline rush, and it makes your operating feel more real, not unlike what the war cameramen did with handheld 16 millimeter in Vietnam. The actors, by the way, were troupers, always ready to go again.”
Nafa says he’s had experience pulling focus with film. “We had upgraded HD taps on the cameras,” he describes, “which work quite reasonably, plus LightRanger is a huge
aid. We also had a first AD and operators who were veteran filmmakers, so if I or [B-camera 1st] Andrae Crawford needed time to get positions, they’d give that to us.”
One tornado event is preceded by a rodeo sequence, which was an elaborate night exterior shoot. “We had 13 or 14 Condors loaded with Lightning machines,” Vecchio describes, “plus some 20Ks and 12-light Maxis. We had Super Troupers and other lights festooned over all the merchant vending areas, plus neon signs, plus more lighting for the people in the stands.
“[Chung] came up with a wonderful shot of leaves dropping down into the grandstand,” Vecchio continues. “Our stars look up and realize something bad is about to happen – and then everything goes dark. Per [Chung], the rest of the scene was to be illuminated only by lightning. We used traditional Lightning Strikes machines, plus Creamsource Vortexes – which can strobe and flash – that also pulled double duty for certain night exteriors. The new Vortexes and other LED’s have a higher IP rating, which means they have a higher resistance to rain. But the older lamps needed old-school
solutions, with our grip brothers, led by Key Grip Joseph Macaluso, putting Celotex over them as protection against the elements.”
Haley says operating on Twisters was a primal experience. “With digital, the operator has the worst seat in the house, because he’s either looking at a crummy little monitor or through an eyepiece with crappy resolution,” he relates. “But with film, I’ve got the best view of the action through the eyepiece, while everyone else relies on an HD video tap.”
When the weather changed during shooting and the sky darkened, Haley says he could see storm clouds forming in the distance. “I thought: ‘This looks amazing.’ But then I realized lightning would be following the clouds, and we’d only have a few minutes – with everything looking so dark and perfect – before we’d have to follow the industry-wide regulation for a mandatory 30-minute shutdown [when lightning strikes are within six miles]. That’s the biggest irony of this movie about inclement weather! By the third week, whenever I got excited by what I saw through the eyepiece, I knew we’d only have those ten minutes before production would get shut down. So, we might wait 25 minutes and be just about ready to go again when there’d be more lightning, triggering another automatic delay.”
Given the need for visual effects in so many shots, Chung relied on previsualization to prepare the various departments. “We had some previs on scenes that demanded technical choreography,” the director shares. “But when out on roads in nature, we tried to capture things in as natural a way as possible. Ben Snow was a real supporter of doing things practically and participated in every conversation I had with Dan on set about where the tornado was going to be and how we’d be framing the action.”
One sequence utilizing previs involved a sustained oner.
“Having done 104 movies, when I’m asked how to make the next oner ‘the best anyone has ever seen,’ I can sometimes feel stumped,” Haley laughs. “I saw the first iteration of previs, which has the heroes take cover from a night tornado in a drained pool.
Previs can’t really consider all the practical realities of shooting. So, even though the data at the bottom of the previs frame tells you what the taking lens will be, the actual move might not consider the physical size of the camera. How can a 35-millimeter camera with a magazine move between pipes that are eight inches apart? And how do you make the camera leap up over a table, then accelerate to 25 feet per second inside three seconds? On certain shows, everybody just loves seeing previs where you go zooming up into a helicopter view before suddenly entering the quantum realm, but sometimes you have to put on the brakes and consider reality.”
To that end, Haley and Snow went to the location with Chung to work through the scene before shooting commenced. “We took the previs with us to the pool so Isaac could review it in context,” Haley adds. “We said, this looks cool, but what if we did this instead? Using whip pans, careful lighting, and CG stitching, you can create a oner made from six or seven different shots and multiple elements.” The new plan included a crane move, Steadicam and handheld that
Haley did with a [Walter Klassen] SlingShot rig attached to a gimbal.
For a proof-of-concept, Haley and Snow used stand-ins and shot each pass on iPhones. “Doing this physically, almost like stunt vis, was the best way to work out details,” Snow reports. “We had a useful reference for how these seven or eight plates could be made to work as one, with lighting covering the handoffs or digitally stitching them together.”
“The advantage of having a world-class operator like Geoff Haley,” Mindel muses, “is that I can leave him to choreograph the move, while I go build a lighting scenario that will support the move and help with the transitions. I’ve seen the scene a few times in the last couple of weeks while coloring the movie, and I feel it was a particularly good use of that technique. I’ve learned that visual effects require careful treatment to give them what they need to make us look good. The payoff from working that way is not just great effects, but also that I can ask for a favor when I’m in trouble, like needing to remove a giant yellow crane from frame in the middle of the night. It’s a symbiotic relationship.”
Snow concurs. “One of the most important parts of having the VFX Supervisor on set is reminding everyone there about this giant actor at the back of the shot that we’ll be putting in. Camera has to understand because they must allow enough space to frame for something that isn’t there. Intuitively, they want to frame for what they see, so communicating with images helped get everybody on the same page.”
Those post-added “monsters” benefited from the input of ILM animators. “Since the twisters have character, our animators essentially created their performances,” Snow continues. “That would be ingested into our sim engines and used as a basis for each tornado. Each storm required a bespoke solution, so it was not dissimilar to how Dan might use his lights, silks and flags to customize the light and optimize the camera image.”
Part of the VFX game plan revolved around what could be handled at ILM versus what could be addressed in the DI. “We knew that with all the sunny days, changing to overcast looks was going to be a big part of our end,” Snow adds. “That meant sometimes having to rotoscope every blade of grass or stalk of corn in an attempt to get rid of highlights created by sunlight. In the opening sequence, they are driving in incredibly thick fog, hail and rain, but a lot of that was done in sunshine. Since we had a perfect CG match for the car, we could replace all or part of the vehicle and avoid showing those hot glints of sunlight that would have been inappropriate for the
conditions of the scene.”
More VFX matching came about as a result of the SAG-AFTRA strike, which created a gap in shooting. “The seasons had changed, so to get things to match we’d be changing green grass to yellow or vice versa,” shares Snow. “The whole postproduction workflow benefited from the use of ACES. ILM had had a lot of involvement with the team that created ACES, and for this show, Company 3 was totally on board with the idea of using it as a basis for their color pipeline. As a result, this was probably the smoothest DI I’ve been involved with.”
Mindel, having worked with Company 3 Colorist/Executive Producer Stefan Sonnenfeld since their doors first opened, says: “I’m aware of what can and cannot be achieved in the DI. Stefan knows me well enough to have thrown his look into the mix in a way that I will appreciate when I turn up. For me, that has become the most fun part of the process – the discovery element, finding out whether our changes made the story better. At the same time, I know that for some other projects, Stefan’s input is the most important part of the finishing process, often dictating how the final result looks and feels, which marks a significant change in filmmaking.”
The DI, for Mindel, also made clear the differences between film and digital. “I was timing a digital movie a month before starting to grade Twisters,” he reports. “The space that a digital movie lives in is beautiful and sleek, with high contrast values that make it quite enticing. But it’s very much its own thing. When you see an analog movie, one shot on location, your
brain stops comparing it with the way other things look, and you get immersed in the story. I think that’s because of the subliminal nature of film emulsion and the way intangible atmospherics appear within a scene – smoke, darkness, a shaft of light, wind, rain – become texturized. That sense of texture is missing from the digital space. But when you’ve got nothing to compare it to, you don’t know what you’re missing. That was clear seeing these projects back-to-back. The payoff shooting film was huge.”
Chung says working at Skywalker Sound with Supervising Sound Editor Al Nelson and his team on Dolby Atmos and 7.1 mixes left him impressed with the dimensionality of the soundscape. “It was like we were inside the tornado,” the director enthuses. “Al’s team conveyed a feeling of rotation within the storm, not just the sound of wind but also of debris hitting. That created an immersion and emphasized the danger of what our characters faced. Equally wonderful and surprising was how they were able to clean up our production sound, eliminating a lot of wind-machine noise so the on-set dialogue tracks could often be used.”
The making of Twisters reflects a “right tool for the right job” approach that has sometimes been neglected in the current love affair with digital technology. As Mindel concludes: “Being the oldest guy in the room is not an impediment. I’ve seen multiple times how best the various departments can work together, and that’s an advantage. If there’s an issue of doing something that nobody else is familiar with, I can sometimes raise my hand and say, ‘Hey, I know how to do that.’”
SFX SUPERVISOR SCOTT R. FISHER SAYS THAT WHEN WORKING ON A DISASTER FILM, YOU RELY ON REFERENCE AS A STARTING POINT AND BRING AS MANY TOOLS TO THE TABLE AS YOU CAN. “IT’S CRAZY WHAT YOU’RE TRYING TO REPLICATE.”
Dariusz Wolski, ASC, takes one career giant leap in the new romantic comedy, Fly Me To The Moon.
BY TED ELRICK
Although Director of Photography Dariusz Wolski, ASC, is best known for his many large-budget action features (The Martian, Prometheus, Napoleon, Pirates of the Caribbean: The Curse of the Black Pearl) for directors like Ridley Scott and Gore Verbinski, he jumped at the chance to work on the new Sony Pictures romantic comedy Fly Me to the Moon, directed by longtime TV Producer/Writer/Director Greg Berlanti. The period story, set in Florida in the late 1960s, centers on a marketing hotshot named Kelly Jones (Scarlett Johansson), who wreaks havoc on NASA Launch Director Cole Davis’s (Channing Tatum’s) already difficult task of putting a man on the moon. When the White House deems the mission too important to fail, Jones is directed to stage a fake moon landing as backup. “It’s completely different from the movies I did with Ridley,” Wolski explains. “It’s a light romantic comedy, something I haven’t done, and was a bit of a challenge. Although it deals with space and the mission to the moon, there’s a bit of a twist.”
That twist included Berlanti casting Wolski as the cinematographer in the story who is tasked with faking the moon landing, shades of Capricorn One. “Dariusz was a natural, as so many of our conversations in prep were about how we were going to shoot a fake 1969 moon landing as though it was done with the technical equipment of 1969,” Berlanti shares. “When are we going to be on real cameras, in camera; and when are we going to be on our cameras? There were a thousand logistical conversations.
“Many of the conversations they would have had if they were going to fake the moon landing were probably similar to what we were talking about,” Berlanti continues. “I had to also audition people for the role that Dariusz ended up playing, and it just seemed apparent to me that he should do both, because he was so much more real. It was so much fun to have him handle the onscreen stuff as well as shooting the movie.”
DIT Ryan Nguyen explains that when Wolski was acting, it was his job, along with A-Camera/Steadicam Operator James
Goldman and Chief Lighting Technician Josh Stern, to ensure Wolski was happy with what was being photographed.
“We worked together to cover Dariusz,” Nguyen says. “I think that may be the proudest thing for me, being there for him and making sure that he was comfortable in front of the camera and behind the camera during his two weeks of acting.”
Of his turn in front of the camera, Wolski laughs, noting, “I was completely conned into it by [Berlanti]. He killed me with kindness and put me in a place where I had no choice. It was fun, but also a little nerve-wracking, as my first take, I was looking straight in the camera to make sure [Goldman] operated it correctly. James was laughing. He couldn’t help it. The most amateur acting is looking straight at the lens. I did have to join SAG, but I don’t think I’m going to pursue it as a career goal. [Smiles.] It was fun while it lasted.”
Wolski, who shot with the ARRI ALEXA Mini LF and his own set of Angénieux lenses, says a lot of time was spent analyzing
period footage from NASA. “They restored a lot of 70-millimeter footage,” he describes. “And it was beautiful, almost too good. We analyzed it all, and when you look at period stock footage or documentary footage, it’s been shot in every possible format with all the different exposures and different film stocks. Once it’s been digitized, you have a lot of flexibility. You can change contrast, color representation, add grain or take away grain. We used quite a bit of the period footage, blending it with our own.”
While there was no IMAX footage from the 1960s, Wolski says many of the big helicopter flying shots over the launch pad were available in 70 millimeter. “With the large format [70mm] footage,” he continues, “you’re just trying to make it all consistent. Our terrific costume designer [Mary Zophres] and Production Designer [Shane Valentino] helped smooth the transitions from period to current footage as well.”
Unlike other films centered on NASA, Fly Me to the Moon was shot extensively at the Cape Canaveral/Kennedy
Space Center in Florida (as opposed to the Johnson Space Center in Texas, where other projects have lensed). Cape Canaveral is where NASA prepares to launch the mission, and Johnson takes over once the mission has launched. Berlanti’s film is set at Canaveral, and the production was given unprecedented access to launch sites.
Aerial Director of Photography David B. Nowell, ASC, whose credits go back to Apocalypse Now, had previously shot at Canaveral for Transformers: Revenge of the Fallen and for the zero-gravity sequences of Apollo 13. What was different this time, Nowell shares, was the unlimited access his team had. “We had a guy from NASA, John Graves, come on board the helicopter,” Nowell recounts. “We thought we were going to have to carry him with us every time to prevent us from doing something wrong. But he was great. It was basically a history lesson. You know this is where the Redstone launched for the first two flights
to outer space with Alan Shepard and Gus Grissom. You know here’s where they did the launch for John Glenn’s historic flight. You know this is the pad where Apollo 11 launched, and it’s now being leased by SpaceX.”
Nowell says SpaceX would not allow his team within a mile-and-a-half of the pad “because they had an active rocket there that was going to launch people up to the ISS,” he states. “So, we ended up using another pad, 39B, that was used for other things besides Apollo. We shot plate after plate after plate because everything was pretty much the way it was back in 1969.”
The aerial team filmed in and around the 39B pad because, as Nowell explains, “they were going to add in the Saturn V rocket and everything that would have happened during Apollo. The only restriction was staying the mile and a half from around the SpaceX pad and one side of the VAB [vehicle assembly building], as there is something there that is top secret. We did the best we could to cover that up.” Nowell also
shot in Atlanta for sequences with a P-51 Mustang where Tatum’s character is flying Johansson’s PR maven around. “It was a lot of beautiful shots with the P-51 around all the lakes,” he says.
Nowell used the Shotover K1 stabilization system from Team 5. “Then I had my tech, Peter Graf, build Team 5’s special array platform called the Trident,” Nowell describes. “We had three Mini LF’s and three 21-millimeter Zeiss compact lenses, a huge array system. The cameras are turned on their side so they’re in portrait [orientation] rather than landscape; overlapped and stitched together it’s, like, 12K resolution.
“We did a lot of plates for that, both primary stuff for production and plate stuff for visual effects,” Nowell adds. “I had a list of what to get from Dariusz and sat with Sean Devereaux, the visual effects supervisor, who didn’t want to fly. He’d tell us what they needed, and then we’d go up, shoot it all, and come back and show it to him. Sean would then approve it or say, ‘We need to do something different.’”
While there were storyboards, Wolski says it was mostly him and Berlanti talking about the scenes and then having the actors come in to figure out the blocking. “There was a full-on read and some rehearsals,” Wolski recalls. “The actors would look at the scene, try this, try that – Scarlett was just amazing. She was so right on.” Berlanti, who studied theater in college before becoming a TV writer, screenwriter and producer/showrunner (and also finding time to direct three features), says he’s a big believer in rehearsals and remembers Sidney Lumet’s comments about the value of rehearsing before shooting. “Before we get to set, we read through,” he explains. “It allows the actors to play with certain things that save you a lot of time on the day. We did a week’s worth of rehearsals with the main cast.”
Operator Goldman has worked with Wolski many times and notes how smooth the show was because he’s such a well-thought-out cinematographer. “He’s so
prepared on the day you show up,” Goldman describes. “Very thorough in his prep, which makes my job so easy. Dariusz is very clear with what he wants to accomplish, with each scene and each day.”
Goldman goes on to note that Wolski “is a forward-thinking filmmaker, as far as when you arrive on set. He likes to scout; he likes to pre-light; he likes to be in the moment so that he’s able to move with speed and efficiency. And Greg Berlanti is just a wonderful human being. He set the tone across the entire board. It was a delightful film to work on, a straightforward, old-fashioned romantic comedy.”
Several crewmembers of Fly Me to the Moon noted how Johansson’s character is all about raising awareness for the space program by marketing things like Tang, the orange drink of astronauts, and Omega watches. Others pointed out how they watched space movies before filming, such as Philip Kaufman’s The Right Stuff, shot by
Caleb Deschanel, ASC.
“Going to NASA was my favorite part of working on the job,” Goldman continues. “We saw two SpaceX rockets launch; one literally felt like it was just across the parking lot from where we were shooting. We shot at the Apollo Memorial, which is for those who died on the Apollo 1 mission before take-off. To see a space launch at such close range was a once-in-a-lifetime event.”
While many other space films are set in Houston, Fly Me to the Moon stands alone for its unique setting. “We’re at Canaveral, where nobody claps,” Wolski explains. “It’s just not done until the ship is safely in orbit, and that happens in Houston. There have been so many bad experiences, nobody claps until the ship is in orbit.” The DP adds that he was very young during the moon landing. “I was watching it on Polish television, but I do remember it.”
As for recreating a fake moon landing, Wolski says lighting, in particular, presented many challenges. “We didn’t really use arc lights – just one for the moon landing, and even then it was the arc-light housing with an HMI inside. So it was kind of fun to look for all the old lights,” he recalls.
With the fake moon landing shot in Atlanta on a large stage, Chief Lighting Technician Stern says they needed to figure out what the light source would have been. “Other movies have used 50K SoftSuns because they’re a very wide, single-shadow source,” Stern shares. “The problem was that we needed to do shots of people sitting at video village looking at the astronauts, who are on wires above this lunar landing set. It has cables holding them up so they can float, and we also wanted to be able to turn around and look at the light on the platform.”
What lighting was available to filmmakers in the 1960s?
“The most obvious thing we could think of was a Brute Arc Fresnel,” Stern continues. “Fresnels are beautiful, but they don’t have a wide enough spread and they don’t really work as a shadow caster, so we tested clear lenses. I knew the K5600 Alpha HMI’s have
a clear lens function with a small reflector and nearly 180-degree beam angle, and it turned out to be the perfect single-shadow source on our stage. Any serious lighting nerd would probably say, ‘Well, the shadows on the moon’s surface should not get wider because of the sun’s proximity,’ but the clear lens was convincing on camera.”
Stern says he called Sam Leu at Warner Brothers, “who’s in charge of all the studio’s vintage fixtures. I asked if they had any Brute Arcs, and Sam said they have four,” Stern adds. “I asked if they could be modified to have HMI guts so we could use them on a stage. He modified two of their carbon arcs to have an 18K globe inside so we could photograph that and use it on stage, but not have smoke billowing out the top. Our stage in Atlanta didn’t have perms, let alone the evac systems that all of the old stages in Hollywood have.”
Leu’s modifications were key to the fake moon-landing shoot.
“We put one up on a 30-foot-tall steel truss tower riding on an old Molevator stand, the ones that have a drill motor,” Stern notes. “I also was able to get a lot of period fixtures from Cinelease, who let us
take their branding off and repaint so we could put them all in the background. We lit the entire moon set with the modified carbon arc or the Alpha, depending on the shot requirements.”
Berlanti is quick to point out just how important Wolski was to the entire project. “I don’t think I’ve learned more in the last decade in this business from anybody than I did watching Dariusz on set,” the two-time Emmy nominee shares. “It was one of the highlights of my life to work beside someone of his caliber and to watch him create on the day. To see where his inspiration came from will always be one of my favorite experiences. He also has a wonderful sense of humor, which is key when making a comedy. If you’re not laughing on the set together, the audience won’t be laughing later.”
As Wolski concludes: “The challenge in this film was to be true to the historical elements. The other challenge, for me, was to shoot a romantic comedy, which I’ve never done. Many of the films I work on are so dark, and this was light. It was a departure from my normal style of working, and what I found is that shooting a comedy is all about the directors and actors.”
LOCAL 600 CREW
Director of Photography
Dariusz Wolski, ASC
A-Camera Operator
James Goldman
A-Camera 1st AC
Wilfredo Estrada
A-Camera 2nd AC
Trevor Carroll-Coe
B-Camera Operator
Jesse Roth
B-Camera 1st AC
Trevor Rios
B-Camera 2nd AC
Kevin Wilson
C-Camera Operator
Ramon Engle
C-Camera 1st ACs
Andy Hoehn
Daniel Guadalupe
C-Camera 2nd AC
Katrienne Soulagnet
DIT
Ryan Nguyen
Digital Utility
Breona Jones
Still Photographer
Daniel McFadden, SMPSP
Unit Publicist
Denise Godoy Gregarek
DIRECTOR GREG BERLANTI NOTES THAT, “I DON’T THINK I’VE LEARNED MORE IN THE LAST DECADE IN THIS BUSINESS FROM ANYBODY THAN I DID WATCHING DARIUSZ [ABOVE AT CAMERA] ON SET. IT WAS ONE OF THE HIGHLIGHTS OF MY [WORKING] LIFE.”
STAND YOUR GROUND
Director of Photography James M. Muro rides into the often treacherous breach of the American West for Kevin Costner’s sprawling cinematic anthology.
Thirty-six years in the making, Horizon: An American Saga is a sprawling chronicle of the Old West too expansive to be contained in a single film – and too cinematic for its creator, Kevin Costner, to conceive as an episodic series. The sweeping narrative is split into four films (two of which are yet to be shot) and spans more than a decade before and after America’s Civil War. The anthology aims to capture the complex story of western expansion and settlement with vivid, immersive detail and a palpable sense of place. Yet another ambitious passion project from the two-time-Oscar-winning filmmaker (remember Dances with Wolves and Waterworld?), Horizon, which Costner co-wrote, directed, and stars in, debuted with Chapter 1 this past June and paves the way for Chapter 2 (release date still to be determined).
Director of Photography James Michael “Jimmy” Muro’s partnership with Costner dates back to Dances with Wolves (1990), when Muro served as Steadicam operator. Muro’s journey from camera operator to director of photography culminated in his first feature Director of Photography credit on Costner’s Open Range (2003). The collaboration aims to hit new heights with Horizon, visualizing the American West in a raw, honest way not typically associated with a genre rife with visual and narrative clichés.
Muro’s approach to lighting emphasized natural, organic illumination, skillfully manipulating sunlight – often with 20×20 mirror boards to redirect light into interiors – while employing negative fill to shape and control contrast. To address the unique challenges of the vast Utah landscapes where the film was made, particularly in night scenes, Muro
worked closely with 2nd Unit Director of Photography Rob Legato, ASC, to develop a day-for-night technique that creates a distinct look characterized by edge lighting and deep blue-black skies. (This visual approach is thrillingly showcased in a dramatic horseback escape sequence.) For some night interiors, real gas lanterns were utilized as both practicals and light sources for the actors. For daytime exteriors, a shower-curtain material was the go-to choice to soften the harsh sunlight without it becoming too diffused.
Costner’s commitment to realism required creative problem-solving. A prime example is an early scene that sets the story in motion – an Apache attack on the home of the Kittredge family forces Frances Kittredge (Sienna Miller) to lead her daughter to safety through a narrow tunnel. Safety concerns precluded the use of real gas lanterns in the
confined space. Muro’s unique solution was to use small LED cube lights purchased at Best Buy, which fit perfectly into the lanterns; flame flickers were added in postproduction to complete the effect. The scene also exemplifies the dedication to practical effects. During the ensuing gunfight, as the Kittredge house burns, real embers were blown onto the set from above instead of relying on visual effects. “If this was studio-financed, they would have shut us down a long time ago,” Muro quips, highlighting the unique freedom afforded by the film’s independent nature.
Horizon stands as one of the most ambitious independent films ever made, partly funded by Costner’s personal $38 million investment. The rare degree of independence allowed the filmmakers unprecedented creative control. Despite the epic scale, which involved large numbers
of extras, wagons and horses in remote locations, the schedule for Chapter 1 and Chapter 2 was a remarkably efficient 50 to 60 days, with zero reshoots or pickups. “It was such a privilege to be there,” Muro adds. “We’d pinch ourselves daily as a reminder [that] this was one of the biggest independent projects ever made, and that’s where our energy came from.”
Production Designer Derek Hill’s work with Costner dates back to Oliver Stone’s JFK (1991), shot by Robert Richardson, ASC. Hill oversaw the creation of meticulous period-accurate sets, including a mining town called Watt’s Parish that was constructed at an elevation exceeding 8,000 feet in Utah’s La Sal mountains. Perhaps the most ambitious undertaking was the construction of an artificial river within a valley when a suitable natural location couldn’t be found. “There’s something new on every project, but only Hollywood or the Pentagon would take on the task of building a river,” Hill remarks.
This artificial river, conceived as a 320-by-80-foot pool, was built using techniques typically employed for resort pools. Concrete and rebar were laid along an extended trench, with water pumped in from the nearby Colorado River, as it was built on a farmer’s land with water usage rights. The titular town of Horizon was then built on its banks, setting the stage for the spectacular Apache attack sequence that culminates in the town’s fiery destruction, which was shot over five nights. Although the town of Horizon was destroyed, the river remains in place and will be used for filming the remaining two chapters in the Horizon saga.
While New Mexico was considered for cost reasons, Utah’s visual grandeur ultimately proved ideal for Horizon’s epic scope. Costner even made an impassioned plea before the Utah senate, advocating for the value of shooting in the state, which helped to secure a crucial film tax rebate. The Horizon saga now stands as the largest film production in Utah’s history.
The impact extends beyond the film
itself. Costner is currently developing Territory Film Studios, a 150,000-square-foot facility in Utah that will house sound stages, production warehouses and office space. This investment promises to further bolster Utah’s growing film industry and solidify Horizon’s legacy as a true catalyst for economic growth.
To better capture the towering buttes that are so prevalent in Utah’s landscape, Muro opted to break free from the widescreen Western tradition of Super 35mm cropped to 2.40:1 (as he did on Open Range) in favor of a taller aspect ratio. A-Camera Operator Alan Jacoby notes that, “The landscape and the locations are characters in this movie. Having the top and bottom of 1.85 just gives us more of that expansive feel for the western expansion story that we’re trying to tell.” To support that goal, Horizon: Chapter 1 was shot in 6K, 16:9, utilizing a Super 35 area of the RED V-RAPTOR’s large-format sensor and framed for a 1.85:1 extraction.
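The framing math behind that choice is easy to check. Below is a minimal sketch of the extraction described above; the 6144×3456 pixel dimensions are an assumed 6K 16:9 recording size for illustration, not a confirmed camera spec:

```python
def extraction(frame_w, frame_h, target_ratio):
    # Largest centered crop of a frame that fits the target aspect ratio.
    if frame_w / frame_h > target_ratio:
        return round(frame_h * target_ratio), frame_h   # trim the sides
    return frame_w, round(frame_w / target_ratio)       # trim top/bottom

full = (6144, 3456)                    # assumed 6K 16:9 Super 35 recording
print(extraction(*full, 1.85))         # 1.85:1 theatrical extraction
print(extraction(*full, 2.40))         # classic widescreen, for comparison
```

Against those assumed dimensions, the 1.85:1 extraction keeps roughly 760 more rows of vertical image than a 2.40:1 crop would – the “top and bottom” Jacoby refers to.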
Muro’s selection of the V-RAPTOR (RED’s V-RAPTOR XL was used for Chapter 2) came
after testing he had done on the Paramount+ episodic drama SEAL Team. “I was shocked by how good the Raptor was,” Muro recalls. “It was light-years better than what I’d been using. I was so happy with that sensor.” The RED’s compact nature also appealed to Muro, who appreciates “the project box nature of RED’s many camera systems.” After selecting the V-RAPTOR, Muro consulted with 1st AC Alex Scott (working with Eric Messerschmidt, ASC and David Fincher at the time), who recommended pairing it with Leitz Summilux-C lenses. This combination proved crucial in achieving the film’s distinctive look. “Alex’s recommendation was great and I think we hit the jackpot when we found that combo,” Muro notes. “The look of Horizon is unlike many of the films in the last 20 years.”
Breaking away from the prevailing trend of shallow focus for daytime exteriors, Muro also opted to keep the backgrounds in focus with stops between 5.6 and 11. This approach lets the exquisite locations create depth naturally. As Jacoby asserts: “We want the viewer to understand that we’re not shooting on a soundstage, and it’s not with a backdrop, CG or AI. These are real locations.”
The Local 600 camera team employed a three-camera setup for nearly every scene, with Muro always operating C-Camera. This allowed for capturing spontaneous moments and unique angles, affectionately referred to as “Jimmy shots.” “It gives me great joy to just sneak in there with the third camera and then all of a sudden it’s like the primary shot that the editor chose,” the longtime Steadicam-operator-turned-director-of-photography reflects.
Costner’s vision called for a return to classic filmmaking techniques in other areas.
“Kevin is not a handheld filmmaker,” Muro explains. “So, we decided we’d go back to the classic John Ford approach. When the camera moved, it would always mean something; it would never be gratuitous.” As such, the team primarily relied on sliders, Steadicam and a jib, using tools like the Technocrane sparingly. For the majority of the shots, Jacoby operated A-camera on a Lambda head at the end of a 10-foot jib arm attached to an electric four-wheeler, providing extraordinary flexibility and efficiency across a variety of terrain.
The straightforward approach to gear and camera movement allowed for quick adaptations, which were important given
the lack of formal rehearsals. As Jacoby explains: “Kevin is an actor’s director, first and foremost. So, we would shoot what would normally be considered rehearsals, see what we captured, and then just refine it. Spontaneity was the key.” Minimal takes, multiple cameras, unfussy equipment and letting the natural beauty of the locations, the skill of the cast and the power of the script take center stage – these were the guiding principles on Horizon.
The purity of Costner’s intent also extended to the film’s treatment of history. In establishing the reality of the time and place, extraordinary attention to detail was paid to the props and costumes. Western wagons were custom-built by the Amish community in Pennsylvania to specifications suitable for filming, while remaining authentic.
Costume designer Lisa Lovaas based all her designs on reference images from the era, going so far as to photograph original patterns of women’s clothing to ensure period-accurate prints on dresses. “Kevin wants people to experience something,” Muro details. “So the telling of this part of America’s history is not about changing history. It’s not about manipulating history; it’s about showing how difficult it was in those days and creating the reality of the
OPPOSITE PAGE/ ABOVE: UTAH’S VISUAL GRANDEUR PROVED IDEAL FOR HORIZON’S EPIC SCOPE. DIRECTOR COSTNER MADE AN IMPASSIONED PLEA BEFORE THE UTAH SENATE FOR SHOOTING IN THE STATE, WHICH HELPED TO SECURE A CRUCIAL FILM TAX REBATE.
period so people can feel like they go there when they watch this movie.” As Lovaas adds, “This is the Wild West, so Kevin is not sugarcoating the situation. What he wants is for people to absorb what it must have been like back then.”
From the very beginning, Costner’s focus was on making the Horizon saga a theatrical experience, with the grandeur of the visuals matching that of the story, and evoking the feeling of the films from his childhood. To achieve the desired cinematic look, a hybrid digital-film grading process was employed: in an unexpected postproduction discovery, FotoKem suggested its SHIFT Analog Intermediate process, which involves printing the digitally acquired and color-graded images to celluloid film and scanning them back to digital.
“Many of the great Westerns in cinema history were originally experienced on a film print,” explains FotoKem Senior Colorist Phil Beckner. “So, while this movie was acquired
digitally, we knew we wanted to incorporate a strong filmic look in the final product to bring that same experience to modern moviegoers.”
For Horizon: Chapter 1, approximately 50 percent of the scenes underwent an additional printing to an interpositive film stock before re-scanning. This technique was applied to all night scenes and those at Watt’s Parish, based on aesthetic preferences. The strategy shifted for Chapter 2, exclusively utilizing a 50 ASA negative film-out process for all scenes. Muro says he’s delighted with the results: “I was already thrilled with the look from the V-RAPTOR, but the film-out gives the footage a very unique color structure, and I just couldn’t be more pleased with it,” he notes.
Muro says his approach was deeply rooted in serving the story and Costner’s vision. “We want to get the best light we can. The best performances we can. And we were lucky. But we try to control our luck,” he
continues. “I get a text from Kevin in post, and he says, ‘Every time I go through the footage to make sure the scene is right, or I’m struggling with anything, I find something that you’ve offered that makes it all work, and I can’t thank you enough.’ With Kevin, I’m never just a pair of hands. I’m there to be creative.”
The 33-year working relationship between Muro and Costner has fostered a shorthand and collaborative synergy that other department heads noticed. “Jimmy and Kevin’s collaboration is one of the most beautiful and seamless I’ve ever witnessed,” shares Lovaas. “The way they talk to each other, their common voice and taste that they share is quite beautiful and I think contributed to harmony on the set.” The sentiment is echoed by Muro himself: “I’m super proud to be working with an American treasure in Kevin Costner,” he concludes. “Our collaboration is what I always aspired to have in my career in camera.”
COSTUME DESIGNER LISA LOVAAS DESCRIBES THE WORKING PARTNERSHIP OF COSTNER (L) AND MURO (R) AS “ONE OF THE MOST BEAUTIFUL AND SEAMLESS I’VE EVER SEEN. THE WAY THEY TALK TO EACH OTHER, THEIR COMMON VOICE AND TASTE THAT THEY SHARE IS QUITE BEAUTIFUL AND CONTRIBUTED TO HARMONY ON THE SET.”
LOCAL 600 CREW
Director of Photography
J. Michael Muro
A-Camera Operator
Alan Jacoby
A-Camera 1st AC
Roger Spain
A-Camera 2nd ACs
Blake Hooks
Cameron Keidel
Ted Buerkle
B-Camera Operator/Steadicam
Michael Alba, SOC
B-Camera 1st AC
Stephanie Saathoff
B-Camera 2nd AC
Noah Muro
C-Camera 1st AC
Seth Peschansky
C-Camera 2nd AC
Austin Swenson
DIT
Tim Balcomb
Loader
Zac Prange
Utility
Marco Muro
Still Photographer
Richard Foreman
Unit Publicist
Diane Slattery
2ND UNIT
Directors of Photography
Robert Legato, ASC
Eric Leach
A-Camera Operator
Thomas Tieche
A-Camera 1st AC
James Apted
A-Camera 2nd ACs
Max DeLeo
Jeremiah Kent
B-Camera 1st AC
David Rhineer
B-Camera 2nd ACs
Grant Williams
Courtney Miller
DIT
Tyler Higgins
Loader
Ethan Hamme
Where exactly is technology leading us? We talked to 10 professionals, from an Oscar-winning DP to the current president of SMPTE, to find out how people and machines mesh at this historic inflection point.
INTERVIEWS BY DAVID GEFFNER
Oscar-winning Director of Photography
Greig Fraser, ASC, ACS, has been pushing the boundaries of production technology forever – but not necessarily for the reasons you might think. Yes, Fraser was the first cinematographer to shoot an entire feature (Rogue One: A Star Wars Story) on the ALEXA 65, deftly combining that camera’s massive digital sensor with vintage Panavision lenses from the 1970s. And, yes, Fraser also pioneered the use of an LED Volume on Season 1 of The Mandalorian (for which he won an Emmy). And, yes, no less a cinematic icon than Roger Deakins, ASC, BSC, called Fraser’s work on The Batman (where he took the use of the LED Volume to new places in feature filmmaking) “extraordinary.” But as my conversation reveals, Fraser is mostly using future tech to give filmmakers more time to do what they do best. Whether that’s employing game engines to pinpoint light and shadow for Dune 2’s Jordanian locations or creating dusk in the Volume for The Batman, for Fraser, it’s all about making the hours in a day as creative as humanly possible.
What did you do differently on Dune 2 from the first Dune, based on how technology had advanced in that window of time? Greig Fraser: It’s a good question. We used Unreal Engine a lot more to plan the placement of sets and time of day on the backlot. We used it to work out where and exactly when shadows would land in a way that was much more precise. If you need an actor to be half in shadow and half in sun, the old-fashioned method is to get an Artemis or Sun Seeker app and figure out that “around four or five there’ll be a shadow here.” [Laughs] What we were able to do with Unreal was to scan the rock formations in Jordan with a drone, make them into a 3D model, put them in the right place on the planet, and then type in the day we wanted to shoot. What we got back was the exact time a location is in shade and/or sun. Then we worked closely with the first AD to schedule – not approximately, mind you, but exactly – when to shoot the scene. It’ll be 9:33 a.m. and we’ll be done by 10:45 a.m. Right, let’s go.
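The kind of sun-position arithmetic Fraser describes can be sketched in a few lines. This is a rough textbook approximation, not Unreal Engine’s solver, and the latitude and day used below for the Jordanian locations are invented for the example:

```python
import math

def solar_declination(day_of_year):
    # Approximate solar declination in degrees (swings ±23.44° over the year).
    return -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))

def solar_elevation(lat_deg, day_of_year, solar_hour):
    # Sun's angle above the horizon for a latitude, day of year, and
    # local solar time (12.0 = solar noon).
    decl = math.radians(solar_declination(day_of_year))
    lat = math.radians(lat_deg)
    hour_angle = math.radians(15.0 * (solar_hour - 12.0))
    sin_elev = (math.sin(lat) * math.sin(decl)
                + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    return math.degrees(math.asin(sin_elev))

def in_shadow(sun_elev_deg, ridge_height_m, distance_m):
    # A mark is shaded by a rock face of the given height at the given
    # horizontal distance when the face subtends a larger angle than
    # the sun's elevation.
    return sun_elev_deg < math.degrees(math.atan2(ridge_height_m, distance_m))

# Assumed example values: roughly Wadi Rum's latitude, near the
# March equinox (day 79), mid-morning.
elev = solar_elevation(29.6, 79, 9.5)
print(round(elev, 1))                  # sun elevation in degrees
print(in_shadow(elev, 120.0, 80.0))    # 120 m rock face, 80 m away
```

Scanning real terrain, as the production did, replaces the single `in_shadow` ridge with a full 3D model, but the scheduling question it answers is the same: at what clock time does a given mark fall into shade?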
So for something like the long opening scene in Dune 2, this tech was a game-changer? Exactly right. That scene was very complex and took place over 12 to 14 different locations, so the continuity of the sun and shadows was very important. Every day was a hard out, so we could never afford to just say, “Oh, we missed that one, we’ll come back tomorrow.” The schedule on that film was a house of cards, and if we didn’t get something, we were in dire straits.
You pioneered shooting in an LED Volume for The Mandalorian and used it again on The
Batman. Did you use it on Dune 2, and if so, how has it advanced? We actually didn’t use it on either Dune film. The second Dune has a lot of sunrises and sunsets, and dawns and dusks, and those scenes were short enough that we could be in real locations, with a scaled-down crew, to get what we needed. The Batman had a lot of long scenes, in mainly one location, so the Volume was perfect.
That’s interesting – the conventional thinking would be that the control of the Volume – for things like big exteriors, sunrises and sunsets – would be more desirable. You would think. But sometimes control can be a bit of a handcuff. And this is what I loved about having done The Mandalorian. After that experience, I could say to directors and producers, “Here is
where the Volume would be great, and here is where the real world would be much better.” As filmmakers, we have always given up a certain amount of control to the elements. Call it whatever you like, but there is something magical, almost undefinable, about a crew racing to get a shot at Magic Hour and everyone being hyper-focused. In fact, I think one of the misconceptions after The Mandalorian was everyone saying: “Why would you do it for real if you can use the Volume?” Just like when blue screen came in and everyone was like, “Why build it? We’ll just use blue screen!” The answer then, and now with the Volume, is that if you can still do it for real – you probably should. [Laughs] At the end of the day, it’s horses for courses and whatever serves the story – and budget/schedule – best.
And yet you did some of the best Volume work on The Batman ever done in a feature film. That goes back to the subject material. Long dialogue scenes in a recurring location? Check. Let’s build the Volume around that set. Also, we were using light levels – dusk into night – that would rarely be that consistent in the real world – check again. The setting was an urban environment where lots of things in the real world – cloud cover, helicopters overhead, traffic noise – would mess us up. So all of that made sense for us to build a Volume, which, remember, is still relatively expensive. For The Batman, or an episodic like The Mandalorian, the Volume can comprise a significant number of pages, and it can be amortized over the length of a project. That was not the case for either of the Dune films.
As you were talking about using Unreal to pinpoint light and shadow, my mind went back to a film like Days of Heaven, and DPs like Néstor Almendros, who had such an affinity for natural light. Is that a skill we may be losing with new technology like the Volume? I don’t think so. We still need to understand natural light, in the real world, to recreate it in the Volume. If we’re worried about one day everything being shot in a Volume – history has proven with other new technologies – that won’t be the case. We saw how much production was rushed into the Volume after The Mandalorian, and, clearly, not all of it was successful – with both audiences and the filmmakers who were using it. Just like shooting in the real world, the lighting in the Volume, and the content on the LED screens, needs to be carefully designed.
Final thoughts? With every film I’m doing, the technology is progressing at a rate of massive change. The project I’m on now is a huge jump forward, technologically, from even the way we did Dune . I’m not talking, necessarily, about what audiences will see on the screen. Hopefully all the tech is completely invisible in that respect. I’m talking about how to make the preproduction and production incredibly efficient, without sacrificing the magic filmmakers bring to the table. I’m talking about pre-scouting and pre-lighting sets, and rehearsing action, all in Unreal Engine, effectively gaming out as much of the movie as you can in a virtual environment, before you ever lay the foundations of a set. It’s all about maximizing efficiency, so the quality we can achieve – in the limited hours we have to shoot – is increased.
OUTSIDE THE BOX
PHOTOS BY TROY HARVEY
brooks alice
Alice Brooks, ASC’s creative relationship with director Jon M. Chu dates back a quarter century to when the pair were both in film school at USC. For their newest collaboration, Wicked, Brooks began sourcing lenses more than three years before the movie was set to come out. Brooks and Chu’s past partnerships include the film adaptation of Lin-Manuel Miranda’s Broadway hit In The Heights [ICG Magazine June 2021]; Tick…Tick…Boom!; Jem and the Holograms [ICG Magazine October 2015]; and Apple TV+’s 2020 crime series Home Before Dark. For Wicked, Chu and Brooks wanted to create an Oz that no one had experienced before. Lensing was only the starting point; Brooks also delved deeply into the color language of L. Frank Baum’s original Wizard of Oz books, mixing old-school gelled Tungsten lights with game-engine technology. The result, if the trailer is any indicator (Wicked comes out in theaters later this year), is beyond breathtaking, setting a bar for visual storytelling that is truly somewhere…“over the rainbow.”
I’m guessing there were a lot of firsts for you on this movie. Alice Brooks: [Laughs]. You could say that. It all started with a call to Dan Sasaki [Vice President, Optical Engineering, Panavision] right after I got the job, in April 2021, to see what he had that might give us a unique look for Oz. At that time, we were planning to shoot on the ALEXA Mini, but we ended up on the ALEXA 65, which is where the story gets interesting. Dan said he was developing new/old anamorphic lenses and wanted me to test them when they were ready. He had three lenses – none of which had any markings – which we lined up to test at Panavision with every possible camera body. Dan had developed the lenses to cover the Mini’s sensor, but we threw them on the 65 as well.
What happened next? Jon and I went to Company 3 to watch all the test footage, and when we saw the lenses on the ALEXA 65, we both said: “That’s what we’re using!” I called Dan to see if he could develop the lenses for the 65, and he said: “Not in your time frame.” Then, in October 2021, they said production was being paused and we’d be moving to London. I called Dan back and said, “What about now?” He said, “Okay, let’s go for it.” They were designed just for our film, around questions like: ‘What color flare did we need?’ We landed on this beautiful amber look that would take the green of Emerald City to a whole new level. The lenses Dan designed are now called Ultra Panatar IIs, which I have since used. The production UPIIs are beautiful but very different from the ones we used on Wicked.
What words would you use to describe the feeling these lenses gave you? Soft, magical, romantic, they had this effervescent, bubbly
look…Jon asked about my goals for the movie, and I said it would be the greatest love story ever told about two friends, Elphaba and Glinda. Every choice I made was to ensure that intent was fulfilled. It’s a grand, epic film; but at its heart, it’s the relationship between these two women. And it’s all in the close-ups. Cynthia Erivo looked stunning on the 65-mil Dan made, and Ariana Grande’s lens was the 75-mil.
The backlot sets were of a scale you hadn’t attempted. Three of our backlot sets were the size of four American football fields each. They were so big, there wasn’t any studio with enough land for us to build on. So, we ended up on a turf farm north of London, where we could place the backlot sets based on where the sun would be. Using Unreal Engine and sun maps, we could preview exactly where the sun would be on any given shoot day, and where shadows would be cast relative to the height of the sets. We would change things six or seven degrees to find the exact optimum light. Pablo Helman [ILM] was our VFX supervisor, and Nathan Crowley was our production designer, and very early on we all agreed to capture our Oz as much as we could in camera.
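The sort of check Brooks describes – whether rotating a set a few degrees keeps a facade in sun, and how far its shadows run relative to set height – reduces to basic trigonometry once Unreal supplies the sun angles. A minimal sketch; all numeric values below are invented for illustration:

```python
import math

def shadow_length(height_m, sun_elevation_deg):
    # Horizontal length of the shadow cast by a vertical wall or set piece.
    return height_m / math.tan(math.radians(sun_elevation_deg))

def facade_is_lit(facade_bearing_deg, sun_azimuth_deg):
    # A facade (given by its outward-facing compass bearing) receives
    # direct sun when the sun's azimuth is within 90 degrees of it.
    diff = abs((facade_bearing_deg - sun_azimuth_deg + 180.0) % 360.0 - 180.0)
    return diff < 90.0

# Invented numbers: a 12 m set wall under a sun 35 degrees up.
print(round(shadow_length(12.0, 35.0), 1))
# A seven-degree rotation can swing a facade into the late-day sun:
print(facade_is_lit(187.0, 270.0), facade_is_lit(180.0, 270.0))
```

Checks like these, run across every shoot day in the schedule, are what let the crew nudge a backlot set “six or seven degrees to find the exact optimum light” before a single wall went up.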
Did you use Unreal to previsualize for camera before Nathan built his sets? We did, because the sets were so insanely complex. We’d go into Unreal and they’d have our lights loaded – a Dino, 12Ks, SkyPanels, you name it. All the departments worked closely together –for example, our library set had nine skylight domes, and we tried different lights in Unreal to see what would give us the result we were looking for. We found the light that worked but realized it wouldn’t fit with the mullions over the opening. So we sat down – me, Nathan, Pablo and our gaffer, David Smith – and decided it would be better for the light if we made the dome opening slightly larger and removed the mullions, as Pablo could always put them back in VFX.
Any other use of Unreal to previs your workflow? We had a big musical number in the Emerald City that takes us from day to sundown. This set was at the end of our schedule and there were construction cranes everywhere when we were doing our final prep on this part of the film. Jon and I had this idea for a long oner where the Steadicam operator would step onto a camera crane, but with all the construction going to finish building the set, it was impossible to walk the camera path to see if it would work. So, we went into Unreal, flew the camera around, and realized it was a cool shot, but it did not serve the story. Thanks to what we saw in Unreal, we completely redesigned the sequence before construction was complete.
I would think new-school LEDs that are programmable, dimmable, and can easily change color would be preferable on a film like this, but that wasn’t the case. We used a lot of gelled Tungsten – a lot! We used LEDs, of course. But it was a mix of old and new. For the pink sunrise in the song “Popular,” we used a 20K Molebeam. During prep, we went through so many different gel tests to get all the colors dialed in, including the right pink. In the end, the Molebeam was dimmed to 45 percent with 1 1/2. It was truly magical.
Color, like lensing, was your best friend. Every paragraph in L. Frank Baum’s book has a color description, and I wanted to make sure that was part of the film’s visual language. Each scene has a specific color intent, and we used every color in the rainbow to light this movie. It was such an incredible collaboration across all the departments to create the film’s color palette, because it is so much a part of telling this story. I’m actually feeling a bit sad now because I can’t imagine having a better creative experience with a better group of people.
Speaking of color, tell me about the flowers. Nathan wanted to build color into this film through nature, which is an important part of the original stories. Munchkinland was set in the middle of tulip fields. So two years ago, right at this time, we planted nine million tulip bulbs and picked the colors very specifically. I was able to have the farmer plant the tulips in the orientation that was best for the sunlight and that would position Munchkinland so Glinda would always be backlit in her pink bubble. It was quite stressful after planting the tulip field nine months before, because the tulips bloomed two weeks early, in the spring of 2023. It was like having a baby. The farmer kept calling us saying: “The tulips are about to bloom!” But we also needed the perfect sun conditions when they bloomed.
For a film this leading edge, how important is an ally like Jon M. Chu to help safeguard your vision and intent? My relationship with Jon is everything. He knows how invested I am in a film from the moment I take a job, and how protective I am of the storytelling, the final image, and seeing the film through to the end. It’s a unique and special relationship that’s spanned 25 years. We’ve grown together, learned together, and made some huge mistakes together. We’ve watched our kids grow up together. I have a picture of our two daughters hanging out on the dolly in Emerald City. I recently saw the movie, which is almost picture-locked, and I gave Jon 15 more notes, minor but very specific. Some he will take and some he won’t. But I know they were all heard.
OUTSIDE THE BOX
PHOTOS BY ELIZABETH FISHER
murphy fred ASC
When I talked to New York City’s most experienced (and beloved) episodic Director of Photography, Fred Murphy, ASC, he was in a hotel in Burbank, CA, remastering a horror film called Stir of Echoes for its 25th anniversary. “It wasn’t a high-profile movie,” Murphy smiled, “but it was a very good one.” Even working from the original IP, Murphy says the process was challenging – and emblematic of how much the world has changed since the film was made. “To convert the original to Dolby Vision is tricky because the lighting was fairly intense – hot sources, highlights, backlit streets – all of which don’t do well in HDR,” he shared. “On the other hand, the detail coming out is amazing! I haven’t shot film in a long time and had forgotten how interesting it is.” Anyone who’s a fan of Murphy’s work – most notably The Good Wife TV franchise, which includes The Good Fight and Evil – hasn’t forgotten how interesting Murphy is with his use of new film and TV technology, prominently on display in Season 4 of Evil, streaming on Paramount Plus.
You’re remastering a project from 1999, shot on film. It feels like film is having a moment. Fred Murphy: It seems like it. Four of the five Academy Award nominees this year were shot on film. It says something about the quality of those productions. The only one that was digital was Ed Lachman’s film [El Conde], and that had very little means.
Can you chart any kind of tech evolution from when you first shot The Good Wife [in 2009] to what you did this season on Evil? There’s been a huge difference. We went from old-fashioned filmmaking on The Good Wife, all Tungsten lighting with minimal control, shot mostly on stage. It required a huge amount of rigging and cable, even though it was not a large-budget show. Just to light the backdrops we had rows of strip lights on the top and the bottom. In the beginning, I was using a Sony camera [CineAlta F35] which they said was 400 ASA, but it was more like 300. [Laughs.] Put it this way: it was not fast. It was the same camera they had developed with Panavision. The magazine for the media alone was the most complicated piece of machinery I’d seen at that point. So we had a camera that was slow, mostly Tungsten lights, and we were using gels. If you wanted to change the color of the backdrop and make it dawn, you had to change dozens of lights. Even on that simple level, it feels primitive to what we’re doing now.
When did a change happen? It was kind of out of left field. But the first time we used LED’s was just to light the backdrop. We wanted to eliminate the millions of cables we were using, along with all the heat and power requirements. Those LED’s didn’t work out very well because they were not powerful enough. But then things started to accelerate a bit and the LiteMats appeared, followed by ARRI’s SkyPanel. With
those, we could remotely control all the light on the backdrop, or coming in through the window, and it felt like a monumental change. Our gaffer, Tim Guinness, could stand there with his iPad and make it lighter, darker, colder, warmer, bluer, greener – whatever we wished. It wasn’t a super-fast change because it required a great deal of taking Production by the arm and twisting. This was probably about 10 years ago. Between the SkyPanel and changing over to the ALEXA, ARRI kind of took us into the future and helped create many of the workflows we use today.
You mentioned using the LiteMats early on. They were revolutionary because they took the place of us having to bounce heavy units – mostly Blondes into muslin with diffusion in front of it. All of a sudden that complexity, heat, cabling and space is taken up by what was essentially a self-illuminated white card. It gave us so much more room on the set, and the Kings [Showrunners Michelle and Robert King] specialize in using small sets! [Laughs.] It’s part of their ethos to build something like the real location where their characters exist. The cameras got infinitely better, obviously, but that didn’t change things as much – for me – as when LED’s came in.
What are you using on Evil that’s been new to your workflow? The Astera LED tubes are probably the biggest change for that show. You can clip them to the wall, hide them in the set for fill light, or rig them in groups on the ceiling. They’re battery-operated, so they require no auxiliary power, and they allow us to work so much faster, and in many ways, better.
You come from the film world, where hard light was a common choice. What happened to that creative option once LED’s came in? Mole-Richardson came up with an 18K, with a Fresnel, that worked fairly well. But it’s nowhere near as powerful as a real 18K or 20K, either HMI or Tungsten. To get that sort of light, we still need the sources – either a 20K or several 10Ks. I love the MoleBeams – Tungsten or Daylite – which are beam projectors. But those are only one color, need to be gelled, are large and are not efficient to move around.
What’s changed for you with lensing? I still use Leica lenses with Panavision’s DXL2 camera, which is a RED chip and is really good. With lensing, if anything, we’re going back in time. In search of very wide-angle lenses, I’m using the old Zeiss Ultra Primes, 10 and eight millimeters. There’s no modern eight-millimeter equivalent that I know of, or at least one that’s easily available. I use them because we’re shooting at 5K, so Postproduction has the option to zoom in, which I’m not a big fan of. But the [Ultra Primes] were designed to cover more than the basic 35mm frame and will easily cover 5K. I own a set of very old Series 2 Cooke Speed Panchros
from the 1950s, which have a very specific look and are not appropriate for [Evil]. Once upon a time, the Cookes and Baltars were pretty much all we had. I remember when I started in commercials, we were using the big Mitchell Reflex cameras, and the Super Baltar wide-angle lenses were incredibly dull. Not sharp – at all! Now people love to use those lenses to cut back on all that sharpness of the digital sensor. Everything old is new again.
VFX in episodics has evolved, and Evil is a good example. Yes, we do a considerable amount of VFX. They’re mostly concerned with demons, who are real people in demon suits with FX makeup and/or prosthetic makeup. And we’re always expanding our sets with digital extensions. In the first show in this fourth season we built a super-collider, like the one at CERN near Geneva, Switzerland, and a great deal of that is VFX. We built 100 feet, but it goes off into infinity. Our VFX Supervisor, Mitch Suskin, is on set all the time talking with me. I think more future tech comes into my work with our production designer, Ray Kluga. We use SketchUp, which is 3D-modeling software that lets you create and manipulate 3D models way before anything is built. It’s great for figuring out where the practical lights can go, and even how I may want to move the camera through Ray’s builds. He’s really good with SketchUp, and I’m marginal but learning. [Laughs.] You don’t need VR goggles or anything. You just do it on your iPad.
Do you have an example from Evil? There was a show last season where someone puts on a headset and sees into his mind, and there was going to be a set built for the journey he takes. Initially, I couldn’t figure out how to move the camera through the character’s perspective because there was no floor. It was all this spongey stuff. I finally figured out how to do it with a crane and then had Ray show me what the move would look like with SketchUp – this was all before we ever built anything. It saves money, time and guesswork and gives the director a firm idea of what it’s going to look like. Most people can’t read the blueprint of a set, something that’s 2D on paper. But show them an animatic on an iPad? No problem.
Final thoughts? The other thing I use all the time now is stabilized heads. I’m in love with this M7 Evo Head from Chapman. We can go on location and there’s no need to lay track. You put on this stabilized head and it just bumps across the floor smooth as can be. It will rotate in all directions and it saves so much time. We did an episode in a hotel, like The Shining, with all these endless corridors we had to track down. It’s better than a Steadicam because you can easily adjust the height. I think my big ask for future tech would just be for lenses to get smaller and focus closer. [Laughs.] That would be great!
eguia eduardo
COLOR & GLASS
DIT Eduardo Eguia’s on-set relationship with advanced technology goes back to the Mauro Fiore, ASC-shot robot flick Real Steel [ICG Magazine October 2011], which utilized motion capture of human performers, 8-foot-tall CGI robots, and a virtual art department. Born in San Luis Potosi, Mexico, Eguia moved to Mexico City in 1995 to work for Latin
America’s biggest TV network, Televisa, working in postproduction before moving to Los Angeles in 1998. Working with reference cameras as a Video Assist on Real Steel (Eguia is a member of Locals 600 and 695), he gained familiarity with motion capture. Joining Glenn Derry at Video Hawks and Technoprops, Eguia solidified his passion
for future tech, working with game-engine technology before moving into the DIT’s craft. Since 2018, he’s spent most of his time in the Volume, working for Disney/Lucasfilm on their Star Wars episodic spin-offs, namely all three seasons of The Mandalorian, Book of Boba Fett, Skeleton Crew and the upcoming Mandalorian feature.
It feels like you’ve seen the entire evolution of virtual production, having started in mo-cap and now spending so much time in the Volume. Eduardo Eguia: Yes, everything kind of merged. When I did the first season of Mando, with Greig Fraser [ASC, ACS], many of the guys in the Volume were the same ones who did the mock-ups for the technology for Avatar and Real Steel, so I’ve been involved with the same people for 13, 14 years.
What are the main challenges for a DIT in the Volume? There are a lot of technical challenges with an LED screen: flicker, frequency, genlock, and moiré. I work very closely with ILM to ensure the cameras and the screens work in harmony. It all starts with the selection of the cameras – choosing, from a pool of bodies, the ones whose sensors capture images as closely matched as possible. Then ILM measures every camera body and lens to ensure the renderings on the screens are consistent with what the cameras capture. After five seasons of shows, we've standardized this process with great results. Another challenge is making the image look as real as possible. That takes collaboration with all the departments: ILM controls the rendered images and manipulates the virtual elements to match the real ones from the Art Department, the lighting team provides the additional lighting that brings realism to the set, and Special Effects adds that extra touch. The overall color correction that I do glues everything together and maintains the creative concept set by the DP.
What about the dynamic range capabilities of an LED wall? It’s challenging, as it can be hard to get the kind of deep blacks and highlights you get from the real world. Sometimes, I will slightly overexpose the scene and then bring it all down to achieve richer blacks, especially when there is atmosphere or real lights illuminating the tiles. ILM has also developed great virtual tools that enhance the realism of the scene, such as virtual depth of field, virtual backgrounds, and props that virtually interact with the real-time renderings – torches, explosions, etcetera.
The color spectrum of the LED wall is also different. It’s not a nice smooth curve like you might have with a prearranged LUT; the RGB spectrum is spikey and uneven, which can create this “lobster” effect on skin tones. In the beginning, we thought we needed to add green light, but that created more issues because wherever there wasn’t more magenta, that area went more green. What we came up with was to add more magenta on specific areas, either with real lighting or virtual cards, which sounds weird. But that allowed me to offset everything back to normal colors. Now with the introduction of RGBW panels, this
problem has been diminished.
Are you working with LUT’s in the Volume? We started with a LUT Greig Fraser created, and we have made subtle changes to it over the years. We are all ACES-based, so when we bring in another camera, we just bring that into our ACES color pipeline. We’re still using ACES 1 because the workflow is proven so well and the color pipeline is well established. We were talking about moving to ACES 2, but it was decided to stay with ACES 1 for continuity with previous seasons.
Have you added in HDR? You can’t get HDR from the Volume – the luminance of the wall is limited, and there’s nothing like a sun source you’d have in the real world. But we can set up how we want HDR to look later on. I’ve had DP’s who don’t want an HDR pipeline on set because they feel it messes up the look. They’ve carefully created a film look, and suddenly there’s all this detail in the highlights they didn’t want to see. It’s fine for sports or live events, but many DP’s I’ve worked with don’t appreciate what HDR brings to a narrative story. So what happens when the studio requires an HDR deliverable, as is becoming more common? We say, “Fine. But we’ll do it our way.” So we’re not using the whole potential of HDR in the Volume to create our looks. When we go to a location or other sets outside the Volume, we’ll get the value of HDR and just bring that back to match the overall look we did in the Volume.
What about resolution? What’s optimum for the LED backgrounds? We’re doing HD and 4K at the same time, and I will do the main color correction on the HD stream and then apply the same corrections to the 4K feed. All the monitors are OLED and precisely calibrated. We’re fortunate we have the Sony X300s for the 4K feed. It’s sent to ILM’s on-set Brain Bar, and with the X300s they get a very pristine image so they can do their magic and match the virtual background to the real set. It’s quite amazing what they can do, as many times you can’t see where the set ends and the wall begins.
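Grades that transfer identically between an HD stream and a 4K stream are commonly exchanged as ASC CDL values, whose math is resolution-independent: it operates per pixel, per channel, so the same numbers apply to any raster. As a hedged illustration of that point – a generic sketch of the published CDL slope/offset/power function, with function names invented here, not Eguia’s actual pipeline:

```python
# Sketch of the ASC CDL per-channel transfer function:
#   out = clamp(in * slope + offset) ** power
# Resolution never enters the math, which is why the same grade
# can be applied to an HD feed and a 4K feed of the same shot.

def apply_cdl_channel(value, slope, offset, power):
    """Apply CDL slope/offset/power to one normalized (0-1) value."""
    v = value * slope + offset
    v = min(max(v, 0.0), 1.0)  # clamp before the power function
    return v ** power

def apply_cdl_pixel(rgb, slopes, offsets, powers):
    """Apply per-channel CDL values to one RGB pixel."""
    return tuple(
        apply_cdl_channel(c, s, o, p)
        for c, s, o, p in zip(rgb, slopes, offsets, powers)
    )

# A warm-up grade applied to one pixel, same numbers for HD and 4K:
graded = apply_cdl_pixel((0.5, 0.4, 0.3),
                         (1.2, 1.2, 1.2),   # slope
                         (0.05, 0.05, 0.05),  # offset
                         (0.9, 0.9, 0.9))   # power
print(graded)
```

In practice the correction would be carried as a CDL or similar metadata alongside both feeds rather than re-derived per stream, which matches the workflow of grading the HD monitor path and pushing the identical values to the 4K path.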
Avatar 2 and 3 were started before COVID and then completed with a remote workflow during the pandemic. What was your role in that endeavor? I was part of the Video Assist team that handled the playback of the mo-cap material. We were also managing the video assist material throughout different stages so Jim [Cameron] could oversee them all at once. I also worked on the integration of the video equipment, including what was set up on mobile carts that were shipped to New Zealand. Since the first Avatar, the technology has really evolved: renders are way faster, realism has increased, cameras are more accurate and they have much higher resolution.
Ryan Champney, who is the virtual production supervisor – and a true genius – designed and built that automated pipeline. Once Jim called cut, everything was processed immediately, transcoded and sent to editorial and video assist in a matter of seconds, compiling at the same time the metadata from different stations. Avatar 2 was my last shoot just before COVID shut everything down.
What’s evolved for the DIT in on-set color management? On-set color management has expanded from the traditional one-light pass, to HDR, to wireless color correction, done virtually with on-set devices, and external systems such as LED screens and projectors. I’ve been using [Pomfort’s] Livegrade Pro since Season 1 of Mando. Before that, when I was with Technoprops and Video Hawks, Glenn Derry and Alex Arango developed their own proprietary color management system. Nowadays, with Virtual Production, I have the ability to color correct screens as well, very useful on commercials and some VFX shots. In the case of ILM, I don't control any of their systems. There is a whole team controlling every element of the screens and their systems are proprietary. They do an amazing job!
Your key role as a DIT is to support the director of photography, and you’ve worked with some really good ones – Greig Fraser, ASC; Baz Idoine, ASC; Matt Jensen, ASC; David Klein, ASC; Dean Cundey, ASC. They all had to learn the Volume from scratch, with Greig leading the way. Have you changed your workflow to fit each DP? First of all, I have to say that I am so blessed to have the opportunity to work with so many amazing DPs. Not only are they great at what they do, they are all wonderful human beings – very approachable with no egos and totally open to suggestions. When Greig started [Season 1, The Mandalorian], everything was new, not only for me, but for many departments, and discovering the potential of this technology was exciting and challenging. We learned a lot about what we could – and couldn’t – do, and Greig was amazing leading us into this new world. When Baz continued with Season 2, he had gained experience and successfully continued on this journey. Later, he raised the bar on a bigger Volume, pushing the limits even more with amazing results. Now with David Klein [DK], technology and capabilities have evolved, and not only has DK mastered every aspect of virtual production, but he’s also led the production crew into new frontiers. In terms of my workflow, it has remained pretty similar, while still adapting to new technologies and capabilities of the virtual production under the leadership of such talented cinematographers. After years of doing these shows, we have become a true family – every department helps each other out to put the very best product on the screen. And the results speak for themselves.
COLOR & GLASS
mainl dominik
Focus Puller Dominik Mainl has been as key a component of Disney’s Star Wars episodics as any other creative partner inside the L.A.-based LED Volume. But the German-born filmmaker, who has been working in Hollywood for more than 25 years, has also seen his share of industry peaks and valleys – during the 2007 WGA strike, for instance, Mainl turned to shooting portrait still photography as a backup career. Thankfully, when that strike ended, he returned to pulling focus, forging relationships with acclaimed episodic cinematographers like Scott Kevan, ASC; Matt Jensen, ASC; and, most notably, David Klein, ASC, with whom Mainl teamed for some 50-plus episodes of the acclaimed drama Homeland. Other shows the pair have done along the way include True Blood, Six, Runner, and Deadwood: The Movie. Perhaps the biggest challenge of their partnership began on Season 2 of The Mandalorian, carried over into The Book of Boba Fett, and continues on the upcoming Mandalorian feature. (Mainl also worked on Ahsoka [ICG Magazine January 2024] with Quyen Tran, ASC, and Eric Steelberg, ASC.) Ask Klein (or any other DP Mainl’s worked with), and you’ll hear endless praise about a union member whose passion for lenses creates the perfect blend of art and science on any set.
When did you and Dave Klein begin to forge a pathway for lensing on the Star Wars episodics? Dominik Mainl: We really started looking at other lenses at the end of Boba Fett because we just didn’t have enough glass at our disposal. We had been using Panavision’s Ultra Vistas, which were custom detuned by Dan Sasaki, and they were great. But we could only get two sets, and we had four to six cameras running on any given day. So we started testing a bunch of anamorphics and found the Caldwell Chameleons, which are spectacular! They feature dual astigmatizing elements that create a different moiré pattern than typical anamorphic lenses. The issue in the Volume is when you get too close to the screen with your cameras or focus too close to the LED wall, the moiré just explodes into this big blob, and your take is ruined. The common moiré pattern is called a Galilean moiré, which is an oval shape. But the Caldwells create a different moiré called Poincaré, named after the French mathematician Henri Poincaré. It’s more like squiggly lines, horizontal and vertical, and you can almost see them before your take is ruined but, more importantly, it also reduces the minimum distance to the wall by about half. It’s a huge advantage for shooting in the Volume as your usable set space becomes much larger. A third type of moiré, Euclidean, is typical for spherical lenses but using spherical glass in the Volume is opening a whole other can of worms.
Why anamorphics in the Volume? We prefer them because of their “flaws” – some call it
“character” – they fall apart on the edges, and there are lens distortions and a certain softness, which is great so you don’t see the moiré. Our Volume is an oval so you don’t have to worry as much about moiré left and right of the frame. As for depth of field, if you try to pull focus close to the wall, the moiré issue comes up again. That is why Dave Klein likes to shoot everything as close to wide open as humanly possible [laughs], which is not always fun for me [and the remaining focus team consisting of Josh Greer, Niranjan Martin and Paul Metcalf] but we love a good challenge. We also have something called virtual depth of field in our Volume. The Volume Control team connects the Preston Hand Unit to the wall, and as I pull focus, the Volume defocuses accordingly. But, truth be told, it doesn’t always work. For example, you have the Mandalorian standing in a bamboo forest, the reflection of the forest is sharp on his armor. But as I pull focus closer, or he moves toward camera, the wall defocuses and now the bamboo forest has become a blur in the reflection and it doesn’t look realistic anymore. So, the virtual depth of field is really a shot-by-shot choice. But, generally speaking, being able to defocus the [virtual] background while we pull focus is a huge advantage.
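The wall-defocus trick Mainl describes rests on ordinary depth-of-field geometry: given focal length, stop, the distance the hand unit is focused to, and the distance to the LED wall, you can estimate how soft the background should render. A rough sketch of that estimate (thin-lens approximation only; the function name and numbers are illustrative and are not how any actual Volume system computes it):

```python
def background_blur_mm(focal_mm, f_number, focus_mm, wall_mm):
    """Blur-circle diameter on the sensor (mm) for a background at
    wall_mm when the lens is focused at focus_mm.
    Thin-lens approximation: c = A * |d - s| / d * f / (s - f),
    where A = f / N is the aperture diameter, s the focus distance,
    d the background distance."""
    aperture = focal_mm / f_number
    return (aperture * abs(wall_mm - focus_mm) / wall_mm
            * focal_mm / (focus_mm - focal_mm))

# A 50mm lens at T2 focused at 3 meters, with the wall 6 meters away:
# the wall should be rendered roughly 0.21mm soft on the sensor.
c = background_blur_mm(50.0, 2.0, 3000.0, 6000.0)
print(round(c, 3))
```

The formula also shows why racking focus toward camera sharpens the wall if nothing compensates: as focus_mm drops, the computed blur grows, so the virtual background must be defocused in step with the pull to keep reflections and backgrounds plausible.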
Does Dave like to use the Volume as a lighting source? For sure. As an ambient source, and from the content itself for reflections. The Volume gives off only soft light [LED panels], but Dave is really smart how he utilizes the wall, even removing panels to accommodate mirror boards that reflect a hard light source that could mimic sunlight. And you can dim the wall all the way down to act as a negative fill. Shooting between a T2 and T2.8 is ideal in the Volume to make it look more real. The downside is that there is a lot more gack on the camera when you’re in the Volume to relay tilt, roll, frustum, et cetera. You definitely can’t stand next to the camera to pull focus, as we did in the film days. I miss being next to the operator, but technology drives everything, and that’s just the way it is.
Tell me about shooting for IMAX in the Volume. We tested many different formats and systems but I can’t get into camera, lens and aspect ratio specifics at this point due to an NDA – that should be a discussion for another time. However, in terms of focus pulling, it’s very important for me to be looking at a high-resolution 4K image for everything Volume-related, regardless of format. Wireless video transmission is getting to a point where we can get a reliable 4K signal without much latency. But the Light Ranger, for example, does not work in a 4K workflow, so I try to be hard-wired as much as possible, and get the best image possible, ignoring the LR system at times. What I can say, speaking from experience, is that resolution is the key because everything is revealed on a big IMAX screen. What looked sharp on a small screen on stage may not be at all crisp on that big screen, so I like to utilize SmallHD’s 4K monitors, both 13-inch and 17-inch versions. Another big challenge was to find spherical lenses that have an anamorphic look and work in the Volume. For weeks, we tested 23 sets of spherical lenses, including the optical-tuner-equipped OttoBlads and the Hawk MHX Hybrids – both relatively new sets of lenses, the latter featuring a dual iris-blade system to get that oval bokeh. When you shoot them wide open, both have some fall-off on the perimeter and, while the OttoBlads are tunable on the spot, the Hybrids mimic the anamorphic look pretty closely.
Where are we in this “moment,” where rehoused vintage glass, often coupled with large sensors, is so popular? I think the days of the clean lenses, like Master Primes and Master Anamorphics, are coming to an end as everyone wants glass with character. Which makes sense as almost nobody shoots on film anymore, so we’re putting clean lenses on sterile sensors. No wonder old glass is so popular! When I was researching lenses for this project, I watched recent shows like Shogun and Trying, both shot on anamorphic glass, so it’s a trend that even television is embracing. I have a 24-year-old who just finished film school and he wants to work in the industry, and I tell him: “Watch old movies! Or what you consider old movies, something my generation grew up with.” I do understand there’s a generation who may have never even seen a film shot with anamorphic glass, on film, in a theater, and there’s a disconnect. I just had a friend tell me he watched Oppenheimer on an airplane and liked the look! That's sort of missing the point, right? [Laughs.]
What about metadata? It’s certainly a growing trend that VFX shows love having all that lens information as soon as possible. None of our lenses are smart lenses, so we would have to use an encoder. We discussed using the OttoBlad lenses, which are Hasselblad glass rehoused by Otto Nemenz. And they are beautiful! You have focus, iris and a third gear that’s a built-in optical tuner that can detune a lens on-the-fly. We looked into using an encoder with those lenses, a small ARRI motor that could plug right into the camera and record the optical tuner data to SD cards for our VFX team, but we ended up not utilizing the encoder. The team from Lucasfilm is amazing, always on set, and they take notes for everything. They like to stand by the focus puller’s monitor, so we’re in communication all the time.
Will AI impact your craft? During the strike, I was in Atlanta and had a chance to see the Moon Smart Focus system, which uses AI to track the actor’s face in real-time. You tell it
which face to track with a touch screen and, even in a crowd, it will only track that face. Auto-Focus feels like a similar discussion that’s been going on for a while. My feeling is that you still need a focus puller. The human touch, the actual focus pulls – is not something a computer can replicate live on set. Not yet, anyways. Focus pulling is a creative position. The audience looks where we decide to place
the focus and if we are doing a good job they won’t even notice it. Sometimes we just ignore dialogue and rack focus to another actor’s reaction. And whatever the approach, it needs to feel organic. So, there are emotional and creative choices made in the moment that AI can’t do. Will it take over one day? Probably. I’m sure we’ll be using touch screens to dial in focus and AI will take care of that, maybe you
tell it when to start and when to stop. But the value of talking with the director and DP on the set is so important to our craft. David Klein and Jon Favreau are fantastic about letting me come up and ask: “Hey, how about we do something unconventional here?” They’re so open to creative suggestions, which is the beauty of being part of a team. And that’s what filmmaking is all about, right?
COLOR & GLASS
karaismailoglu kaz
DIT Kazim “Kaz” Karaismailoglu says he’s always been the guy others turn to for creative technology solutions dating back to his childhood in Turkey. “My friends would bring their video recorders to my house to ask if I could fix them,” he laughs. “Turkey is a limited country when it comes to new technology, so I’d patch stuff together with parts from security cameras, plastics, whatever was available.” Although Karaismailoglu passed on film school to study philosophy, the tug of a moviemaking career landed him in London, where he learned to be a film loader at a friend’s company. That firm also had a branch in Dubai, so for more than half of his 20s, Karaismailoglu hopscotched between Europe, Asia and the Middle East, making movies and commercials. Eventually he landed in New York City, where his career continued to soar, working on massive recent hits like Fallout [ICG Magazine May 2024] and the upcoming Marvel entry, Deadpool & Wolverine.
Tell us about your first experience as a DIT – how did it come about? Kazim “Kaz” Karaismailoglu: They didn’t call it a “DIT” – I was called a “RED tech,” shooting with the original RED ONE. I remember we had a three-camera car crash scene, and one of the RED cameras had burned out. With that first RED, the cards were like one-terabyte bricks. Just massive. The DP was Joel Ransom, a legend from The X-Files, and he was obviously upset about losing a camera. I’m like, “Hey, you guys go to lunch. Let me see if I can fix it.” Just like when I was a kid. I go to the truck, fix it, find the SD’s and take the clips out. When they came back, I said, “I want to show you something. Is that the take?” and they’re all thrilled I saved the shot. After that, I became the guy [laughs].
Has that continued throughout your career? I’ve worked probably 2,000 commercials and 40 films and it’s always been that way. Never a simple job when I’m involved. It happened just last week on a Ray-Ban commercial. My first real job as a “DIT” was on a commercial shot in Dubai for the government of Qatar. It was 2008 and the ALEXA hadn’t even officially come to market but they wanted to use it in the desert for this spot advertising Qatar as the host country for the 2022 World Cup. They said: “Well, we’re not sure if this camera will work in the desert so let’s get Kaz. He can fix anything.” [Laughs.]
Fallout is a huge hit, and Director of Photography Stuart Dryburgh [ASC] couldn’t stop talking about your contributions. How did that come about? Whew, that’s a long story, and probably goes back to me seeing The Piano [shot by Dryburgh] in Turkey as a kid,
which made me want to be a filmmaker. With Fallout, you’re talking about [Director] Jonah Nolan and Stuart Dryburgh, two men with real vision who are not afraid of challenges. And they surround themselves with the best crew possible. DP Bruce McCleery [who shot in the Volume], Gaffer Bill Almeida, Key Grip Charlie Marroquin, Production Designer Howard Cummings – these guys are a privilege to work with, and you only want to bring your A-game every day.
So you shot on film – in the Volume?! Jonah only shoots film, so that meant we had to figure out how to do it in the Volume. For me, that started during COVID. The Mandalorian had just come out, and everybody was talking about shooting in the Volume. AC’s, DIT’s – a lot of ICG crew people were on the phone or Zoom talking about working in the Volume. I went to a friend who owned a studio and said, “I want to build the first LED volume on the East Coast.” He was like, “Go ahead. The stage is sitting empty [during COVID].” I worked with Gaffer John Velez, who had tons of LED screens not being used. I also had a friend at a VFX company. So with these elements – large studio space, many LED panels, and a VFX company for the assets – I was able to build the “Brain Bar” in my shop, running four nodes. Now, I needed a DP to shoot in this new “Volume,” and that was Stuart Dryburgh, whom I had been working with for 10 years. He shot our first commercial on that virtual production stage, and we sort of continued on, with me as virtual production producer and Stuart shooting more commercials.
But that Volume is not where you shot Fallout? No, that one’s 40 feet wide and 18 feet high. Gold Coast Studios, where we shot, is 140 feet wide and 28 feet high. Definitely the biggest on the East Coast or Canada. Shooting film in the Volume, my first thought was, “How can we be efficient and not have to do a lot of tests?” We had to see what we got on set and not have to wait for film-out dailies, which meant we also needed to have digital cameras in the Volume. What you see through the film’s HD tap is not what you’re going to get, colorwise. The idea I offered, that Stuart liked, was to genlock a Sony camera to the LED wall, and then create a base LUT for that camera that would mirror the 5219. I color-scienced 5219 and did a transform for the Sony BT.1886 gamma. In the Volume, the RGB levels are so different. It’s like you’re reverse LUT-making to make sure the digital camera will look like film. But once you get that dialed in, you can work in the Volume – on film – with the same speed as shooting digitally.
How hard was it to create a LUT that would emulate 5219? Really hard. Your matching point on the red-green-blue graph has to be so good. But green is not a line, it’s [makes fluttering motion with his hand], and so is red. You have to analyze the emulsion of the 5219 without any grading, and before the scan. You see numbers in the XML file and you’re like, “What is that?” They’re the breaking points of where the graph – or the LUT – would break. The challenge was to figure out how the 5219 would react to the LED wall. Your Volume RGB’s are already so limited because they’re just tiny points of light. They don’t put out the proper color gamut, so if you’re not spot-on with your exposure of the 5219, the blacks just disappear; they wash out and you lose them.
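The BT.1886 gamma Karaismailoglu transforms to is defined in ITU-R Rec. BT.1886: display luminance L = a·max(V + b, 0)^2.4, with a and b derived from the display’s white and black levels. A minimal sketch of that standard formula and its inverse – the kind of building block that goes into “reverse LUT-making” – assuming nothing about his actual 5219 emulation (function names are mine):

```python
GAMMA = 2.4  # exponent fixed by Rec. BT.1886

def bt1886_coeffs(l_white, l_black):
    """Derive the a and b coefficients from display white and black
    luminance (cd/m^2), per ITU-R Rec. BT.1886."""
    wp = l_white ** (1.0 / GAMMA)
    bp = l_black ** (1.0 / GAMMA)
    a = (wp - bp) ** GAMMA
    b = bp / (wp - bp)
    return a, b

def bt1886_eotf(v, l_white=100.0, l_black=0.1):
    """Code value V in [0, 1] -> display luminance in cd/m^2."""
    a, b = bt1886_coeffs(l_white, l_black)
    return a * max(v + b, 0.0) ** GAMMA

def bt1886_inverse(l, l_white=100.0, l_black=0.1):
    """Display luminance -> code value (the direction you need when
    building an inverse transform toward a film emulation)."""
    a, b = bt1886_coeffs(l_white, l_black)
    return (l / a) ** (1.0 / GAMMA) - b

# Code value 1.0 lands on the display's peak white and code 0.0 on
# its black level, which never reaches zero - one reason crushed
# blacks on an LED wall wash out rather than clipping clean.
print(bt1886_eotf(1.0), bt1886_eotf(0.0))
```

Note the nonzero black level: the formula bottoms out at l_black, not zero, which is consistent with the washed-out blacks described above when the 5219 exposure isn’t spot-on against the wall.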
What about the color management of the assets on the wall? A lot of challenges there as well. They used a drone inside the Volume, which was Paul Cameron’s idea, that we had to match to the film capture. The VFX company was in L.A. and we were shooting in New York, and color-grading of an asset remotely takes time. So my idea was to color-grade the Volume, and when I found out Livegrade Studio can read Brompton [which makes the processors for LED walls], I knew I could control every node on the wall through my cart – live-grade the cameras, and live-grade the wall. First day of shooting we have a helicopter shot with two cameras, one inside the whirlybird with the operator remote on a joystick. But with a large, curved LED wall, the panels at the bottom look different when the camera tilts down from when it’s straight on. I saw this in Stuart’s pre-light and thought it could be fixed in VFX, but I still wanted to do my best on the day. So I started live-grading those lower panels during the shot. I knew when Stuart or Paul [Hughen]’s camera was going to tilt down, and I was changing the exposure of the Volume during the move. It was risky and experimental, but that’s what I love doing most.
Do you see any new technologies coming that will change how you work? For me, nothing will replace real-world lighting by a cinematographer, using a camera and lenses they’ve selected to convey a certain feeling. That’s why we do this. To evoke feelings in the audience. If this next generation likes to watch a movie that was made inside a computer, then we’ll need to learn how to do that. I see myself like Emmett Brown from Back To The Future, who jumps off the clocktower to connect the cable so they can go back home. I feel like that when I’m working. Holding this cable up high and letting the electricity do whatever it takes for the project. [Laughs.] Like I said: I never get the easy jobs.
THE HUMAN TOUCH
Mariana Acuña Acosta
Mariana Acuña Acosta, who currently serves as senior vice president of Global Virtual Production and On-Set Services for Technicolor and also runs her fourth tech start-up, Glassbox Technologies, was raised in Mexico City in an all-female household.
“None of them were technically savvy,” Acosta smiles over our recent Zoom chat, “so at the age of nine, I decided to become the technician of the house. I love learning how things work.” The larger Hollywood industry has also loved Acosta’s youthful
career choice, having benefited for some 15 years from her expertise at companies like Sony Pictures Imageworks, Foundry, Escape Studios, Epic Games and others. Coming up through visual effects – supervising, compositing and CG artistry – Acosta has been uniquely poised to land on her feet with each new tech trend, including augmented and virtual reality, virtual production, and now generative AI, all with a goal of sharing how these complex systems impact filmmakers.
Growing up in Mexico City, what was your first introduction to visual effects?
Mariana Acuña Acosta: Actually, it was seeing movies like The Terminator and Alien that got me excited about visual effects. I founded two of my four companies in Mexico City, and at some point I realized I needed to be in Los Angeles because that’s where the industry was for the kind of work I wanted to do. I studied at Gnomon School of Visual Effects [in Hollywood], and after three years, my teacher helped me get a VFX job in L.A. I went on to
work at many VFX studios, doing supervision and compositing. When my son was born in 2011, I saw that being a woman in VFX was not that family-oriented. So, I made the switch to tech full-time, working for a software company that makes DCC’s for the entertainment industry. I got to visit a lot of the studios and I saw how virtual reality – with game-engine technology like Unreal – was going to change production. I visited with filmmakers to see what they were doing with 360 video, VR, and interactive, high-end experiences. That led me to start JoltVR in 2015.
Your role with Technicolor is focused on virtual production – where are the applications? Technicolor, parent company for MPC, has done a lot of projects with Disney. We also have a virtual art department; we do virtual scouting, and while we’re not in the LED Volume business, for shooting in the Volume we partner with Nant Studios [El Segundo, CA]. We have a visualization department, and MPC has done many hybrid projects like The Lion King, The Jungle Book, and others that are to come. [Smiles]
What did your virtual production portfolio look like before Technicolor? That’s why we founded Glassbox Technologies, which has won a Technology Lumiere Award, an Innovation Award and an HPA Engineering Award. I also thought I could do a good job helping filmmakers navigate all this “future tech” as you call it. We realized there wasn’t a set workflow or plug-and-play solutions using virtual cameras in either Maya or the game engines; the industry needed more education about virtual production. So, we started to do cross-platform virtual production tools. For example, our virtual camera, Dragonfly, would work the same in Maya, Unity and Unreal, and then we came up with a product called “Beehive,” which in very simple terms is like Google Docs for the game engine, where different artists could be working on the same scene, at the same time, iterating in the same level. We also had a real-time facial-capture solution; this is before MetaHuman [in Unreal], where you could start driving your virtual characters with video or a higher-end, head-mounted camera. At the tail end of the pandemic, Technicolor reached out, and it felt like bringing together all the areas I’ve been working in for the last 13 years.
Game-engine technology is getting more popular in all areas of production: how do you see that evolving? Because of the Apple Vision Pro and spatial computing, there will be even more virtual scouting, with DP’s working even more closely with other crafts. This merging of VR and AR is where the industry is going. With the visual quality of Unreal 5, you can preview your effects, your lighting and animation in real time, so you don’t have to wait
for renders. During the pandemic, Epic Games asked if I could be the virtual production mentor for their fellowships. They had six fellowships total, and we had 100 people go through each fellowship – 600 people! People from all trades – DP’s, production designers, gamers, digital artists, even producers, all wanting to learn more about the technology. The work they produced was incredible. Everyone was using different lenses, rack focus and zoom – traditional cinematography tools to establish film language, virtually, rather than camera movements that wouldn’t exist in the real world. I feel like game engines are the binding agent that glues together many different industries that, in the past, have not even talked to each other.
I visited the 2016 set of The Jungle Book, watching Rob Legato [ASC], Bill Pope [ASC], and director Jon Favreau working with a virtual camera. At the time I thought this may be the future of filmmaking, but it hasn’t exactly turned out that way. Why? Then, and now, it still feels like a big jump from traditional filmmaking. It’s been used widely, for some time, in the gaming world, as well as in advertising. I think there’s been some misinformation and misconceptions about virtual production, in general. I have seen filmmakers embrace this technology and do wonderful work. But that doesn’t mean we should suddenly ditch traditional filmmaking knowledge. On the contrary, it’s needed more than ever with a virtual camera. One other thing I was trying to evangelize through the pandemic was the intersection of production, creative and technical. I’ve spent a lot of time on sets, and those technicians, for example at the “Brain Bar,” didn’t have the same feel for what DP’s and other crafts do. Without someone who supports that intersection, there’s a disconnect – and filmmakers get turned off.
You gave a talk a few months ago on the evolution of AI at the HPA Women In Post luncheon – where do you see AI’s impact in the near term? Let me just say that machine learning and artificial intelligence have been embedded in the VFX world for more than a decade. It’s been widely used for upscaling, batch processing of shots, de-noising, camera tracking and all the enhancements AI has created with facial animation, like with the Planet of the Apes franchise. There’s a feeling that AI is a brand-new technology, but nothing that’s being done today would be possible without what came before. Having said that, my feeling is that, just like the invention of the camera, or the ATM, or the Internet of Things, where your fridge reminds you to go buy strawberries – none of these technologies has replaced human creativity and neither will AI. For me, ChatGPT is just a much faster and more efficient Google. Generative AI is not
reinventing the wheel – it’s taken from what humanity has already created and is giving you that version faster, which you can then iterate from. They’re making a lot of advancements when it comes to latent diffusion models, but as of right now, generative AI is not specifically designed to focus on optical flow or temporal pixel changes.
Have you seen AI being advocated in a way that could be detrimental? I have. I was at an AI event recently where I started to hear how creatives were talking about storytelling in the service of AI, with no real concept or vision of what story they wanted to tell. This industry has had a history of technology being used to facilitate storytelling – think about Spielberg with the mechanical shark in Jaws, or motion-control cameras for the original Star Wars. When the story is only there to service the tools, as some people believe with generative AI, that’s a problem. Gen AI is merely a reflection of human experience, and, in this business, it’s there to help storytellers share their visions faster and more efficiently. Nothing will replace human creativity.
How has AI touched your work at Glassbox? Our latest product, coming at the end of 2024, is called Project Cocoon, an AI-assisted visualization and look-development tool for concept exploration. It will also be able to generate stylized animation and specific VFX shots. We are using known VFX 3D techniques to drive the output inside of Sequencer in Unreal Engine by generating control maps in near real-time. You can think of this as “control-map prompts.” The thought behind this tool was to have the immediacy of virtual production join forces with the efficiencies of AI in other parts of the production process.
Your native country, Mexico, has produced so many talented filmmakers – Chivo, Rodrigo Prieto, Guillermo Navarro, Iñárritu, Alfonso Cuarón – the list goes on and on. Did they influence your path in any way? Absolutely. When the Academy Museum of Motion Pictures opened up, there was a whole floor dedicated to those names you mentioned, and I felt so proud of my roots, so proud to be Mexican. Not just traditional filmmaking – seeing Alejandro González Iñárritu’s virtual-reality film Carne y Arena at LACMA made me cry as you could feel what immigrants endure. For me, technology is about bringing us closer together, and that will never change. For example, the Mexican in me would never sit at home watching a movie with my mom and 13-year-old son while we’re all wearing our own Apple Vision Pro headsets. That would be isolating, and Mexicans are all about family and togetherness. I think it’s something in the Mexican character to want to do more and learn more – it’s a place with a lot of humanity. [Smiles]
Renard Jenkins
To say Renard Jenkins, current president of SMPTE and I2A2 Technologies, Labs and Studios, and former senior vice president, Production Integration & Creative Technology Services at Warner Bros., has worn many hats is a true understatement. Jenkins sits on the board of the Hollywood Professionals Association (HPA), as well as the Advisory Board of Exceptional Minds. He’s a member of SIGGRAPH and a former board member of MovieLabs. And while Jenkins’ honors include two Emmys, a Peabody, a Broadcasting and Cable Technology Leadership award, and an Innovator of the Year award (for his leading-edge work via the PBS Advanced Format Center), at heart, he’s just “a film and TV nerd” who loves using new technology to help creatives share their stories.
You recently gave the keynote speech at “AI On The Lot,” you were a panelist at the Produced By Conference, and you did another AI forum in your home base of Washington D.C. Where will AI’s impact be in the next five years? Renard T. Jenkins: My feeling is that it will come in the areas of animation, VFX and postproduction. Those places are fertile ground for AI to grow and improve workflows. But when it comes to production – and specifically, cinematography – there is nothing (at this current time) that can replace live-action camera acquisition of a human performance. It cannot be replicated. I know there are people out there trying. [Laughs.] It is easy – with the various generative AI tools available now – to create still images that are fantasy based. But as far as producing narrative content, with real people, in the real world, what the cinematographer and his or her camera crew do, that’s not going away.
Virtual production (VP) was touted as the disruptive technology of our time, but it hasn’t quite turned out that way. One thing that slowed the adoption of virtual production was a talent gap. When it first arrived, VP stages popped up quickly, but there weren’t enough people trained in that vertical to exploit its true capabilities. It also matured in the middle of a worldwide pandemic, so people could not be trained as quickly as they would have been on other technologies. But I do think virtual production has incredible legs to help this industry imagine the future of live-action filmmaking. The use of game engines and AI to create a more efficient pipeline for those images that populate a Volume will play into a quicker adoption. The end of 2023 saw an uptick in virtual production – if we look at a show like Fallout [ICG Magazine May 2024], and the process used by the virtual production supervisor, Kathryn Brillhart, and her team,
we see how VP and AI can be folded into the process, while still keeping the human factor uppermost in its adoption.
Aren’t market forces a factor in adoption? A recent example is 3D, where consumers didn’t want to wear the glasses despite a massive industry push. [Laughs.] One thing in this industry you can count on is that 3D will come back every 20 years. Each time it gets a little bit better, but the public does not seem to be as fascinated with 3D as those who are pushing it. But your point is well taken. The way individuals consume content will always be the driver for any new technology. And there are so many ways to engage with content these days.
Do you have an example? We did a small sample set within my company, I2A2 Technologies, and found that the average amount of time that someone would engage with AI-generated content – that was meant to be photorealistic and tell a story – was between six and 11 minutes. Some people went further, others didn’t last more than 30 seconds. Either way, the technology for what’s being created right now in AI does not seem to be good enough to hold a viewer’s interest. Now, as these tools are better able to mimic human speech and facial expression, as well as traditional filmmaking processes like camera movement and lighting, those numbers will change. It’s important to remember that the movie experience hinges on suspending disbelief; and to do that, we need to see something that looks – even a little bit – like the reality we’re used to seeing. The classic example would be Star Wars, which is all set in an imagined universe. And yet if you look at Tatooine, the jumping-off point for the first film in the franchise, it looks just like a desert – except it has two suns!
Is your view of AI similar? Definitely. AI and all the new technologies we are building have the ability to connect with us because they’re offering something we’ve seen before. Animation is a great example. Generative AI can produce images that are similar to what we would see in a Pixar film or a Japanese anime. You can look at a gen-AI image and say, “I’m comfortable with what it’s trying to emulate because I’ve seen it before.”
Years ago we covered the work of Roger Deakins [ASC, BSC] on the How To Train Your Dragon franchise. The CG artists were blown away by Deakins’ real-world lighting. They’d never seen fire look so real – in a computer! That’s something I’ve talked about in my keynote speeches. The visual artists will rise, especially cinematographers, in the midst
of all this change. We can ask these digital tools to “generate” something for us, but they will have a similar look across many different platforms, as I just described about animation. Now, when we put these tools in the hands of cinematographers who are used to walking onto a set, assessing what is around them, making aesthetic choices based on the lenses they’ve chosen, the camera they’re using, and then working with the different craft departments to build out the vision that’s in their minds – when an artist like that can type in a prompt that includes all of that human experience, you’ll start to see something that will keep audiences engaged for more than the six to 11 minutes we found in our research.
We talked to Local 600 Director of Photography Andrew Shulkind for our April issue [ICG Magazine April 2024], and he said they had to invent a new camera because a display technology like Sphere had never existed. Will it take many more Spheres to ensure the survival of theatrical exhibition? Well, Sphere is truly amazing – inside and outside. You ride the tram through Vegas and the [Exo-Sphere] interacts with you! Inside is the closest technology we have yet to a fully immersive experience. Are there other technologies out there today that can bring people back to the theater? Well, I think you have to define that core theatrical experience, which is many strangers sitting in a common space and having their own experiences. For me, there’s nothing better than that feeling. I can still remember seeing Close Encounters of the Third Kind and me and my friends walking out of the theater singing the magic keys that communicate with the aliens. You don’t get that watching something alone at home.
True enough. And yet, Sphere aside, technology is pushing us toward being ever more siloed – I tried the Apple Vision Pro at HPA and it’s amazing, but amazingly personal to the person wearing it. Technology is pushing us toward more customizable experiences, which means we have to look at predictive AI, and not just generative AI, that leads users via algorithms that keep us within a certain range of things that it thinks we like. Predictive AI happens just because we stopped on a certain TikTok or Instagram feed – for a moment! From that comes adaptive content that will allow people to say, “Hey, generate me a story that looks like X, Y or Z,” thus forcing us even more into an individual space. But at some point, I believe we all have to look up from our devices and seek out another human being to share the experience that we just customized. When that sharing happens, hopefully, over time, we’re headed right back into a communal place where all technology ultimately leads.
Jeff Ford
DROP ONE, STITCH TWO
The film (as in celluloid) roots of Editor Jeff Ford, ACE, whose credits include at least 10 Marvel projects, run deep – back to USC’s legendary cinematic-arts program in the early 1990s, where he was part of a cadre of students – directors James Gray and Matt Reeves among them – who went on to change the face of entertainment. Ford’s editing career started with him apprenticing on Gray’s directing debut, Little Odessa, and took off quickly from there. “I was at the tail end of the film generation,” Ford explains from the editing room of his current project, The Electric State, directed by Joe and Anthony Russo (whom Ford worked with on the final two Avengers features – Infinity War and Endgame). Although Ford’s USC class didn’t yet have AVIDs, he says, “I was always good with computers and knew Macs pretty well. I was also fortunate to be around cutting rooms when AVIDs were coming in, and the fact that I knew both film and the first non-linear systems allowed me to transition between worlds.”
You had some pretty great friends in film school, many of whom went on to direct. What made you choose the editing room? The goal was always just to be a filmmaker. I worked as a camera assistant, initially, out of film school, and I loved being around cinematographers. But editorial was the most interesting because that’s where the film came to life. I understood that early on, even when I was making Super 8 Spider-Man movies as a kid…
Well, that’s ironic! [Ford edited Spider-Man: No Way Home]… [laughs] Yeah, I ended up getting to do that…a lot! It’s what I was doing in the fourth grade and what I was doing in my 40s. There’s nothing like the temporal manipulations of editing. Combining music, sound, and images, to me, is the purest form of cinema. It was fun; I liked it. And, I seemed to have a natural ability for it, so I ended up editing a lot of other people’s films in school.
I can’t think of a better storyteller to start a career with than James Gray. And things only went up from there. After Little Odessa, I was very lucky to become an apprentice on a crew with one of my childhood heroes, Richard Marks, who cut Apocalypse Now and The Godfather Part II and had Oscar nominations all day long. The film was called Things To Do In Denver When You’re Dead, and it was Richie’s first AVID show. He took to the AVID very quickly, and it provided a window on where things were headed. But the thing I remember most about watching Richie work on that first AVID show was that the technology – while it did facilitate a faster, more efficient workflow – was only as good as the very talented person making those choices of rhythm, pacing and storytelling. Like Richie.
The AVID still had a film component in the early days. Right, and on those early AVID shows, that became my focus. The editor would cut on AVID, and the assistant would supervise the conforming of the physical work print. We’d get a change list from the AVID that would tell you where to cut on celluloid. Then you assembled the work print and printed the soundtrack onto mag, and that’s what we’d project in theaters for previews. It was the only way we could show a high-resolution version of the film to audiences, as the resolution of the AVID in those early days was not good. The first time we’d screen a preview was a revelation because you were seeing all that resolution of film, which is still my favorite medium. My kids didn’t get to see film projected in theaters like I did. For me, nothing else compares.
You mentioned the temporal aspects of editing – we live in a world now where everything can be manipulated, with technology that may even do it for us. How do you feel about AI? I’ve tried to feel terrified, outraged, scared – all of those things – and it just doesn’t work. Maybe I should feel that way. But it feels similar to two advances I already experienced in my lifetime. The first was seeing my dad typing up a job résumé on a Smith-Corona typewriter. He was yelling at it, cursing at it, having to re-do it over and over. And then one day he brought an Apple II Plus home, and everything changed. Word processing is the analog to non-linear editing. One day you’re writing a screenplay on a typewriter, the next you’re using Final Draft on a personal computer. Both of those things have changed filmmaking, just like digital capture or LED lighting has changed cinematography. All these technological advances have changed how people do the work, but they don’t change the work itself. We could run an AI computer simulation on the Dodger season and not have to watch them play. It might be full of randomized, unique moments, but I’d much rather watch the Dodgers play. Just like I’d rather see a movie made by another human being. That’s never going to change.
So this future tech thing will be felt where? AI may give us new tools to help tell human-created stories, and that may be liberating. Having worked on a dozen Marvel movies, I can tell you the VFX side is very labor-intensive and very expensive. What if AI tools, especially in VFX, opened up a type of storytelling to a low-budget independent filmmaker that was only previously available to a “corporation” making “content,” like these $100-million movies are? That might give people the chance to access a
kind of storytelling they might never have been able to access before, which could have amazing results.
You became the “Marvel guy,” but had you done many VFX projects before Captain America? Hardly anything. I helped out on the Director’s Cut for I, Robot for a month, but never got the full blast of VFX for that one. Fortunately, Captain America was with Joe Johnston, and he was another hero of mine. One of the founders of modern visual effects, and the guy who basically drew Star Wars, along with Ralph McQuarrie and Brian Muir. Working on that, for me, was eye-opening, as it felt like a new way to make a movie.
We’ve done stories about how high-resolution capture has created huge “data trains” that end up dumping on postproduction. Is that a problem? I have an incredible first assistant, Robin Buday, who’s been with me since the first Avengers movie. His biggest task is to make sure all these systems run well because we do push them hard. But this is not necessarily a new thing. I went to film school with Matt Reeves, one of my favorite filmmakers and a good friend, and when he was working in TV, the joke was that Matt would turn on the camera in the morning and turn it off at night – and that was with film! [Laughs.] I did a movie with Jim Brooks – As Good As It Gets – and he shot 40,000 feet of film a day. My arms were jelly synching that up. But today’s digital systems are so fast, and HD moves quickly through the pipe, so your question goes more to a management thing. Creatively, what’s valuable about having all that material is being able to do multiple versions of a scene that are non-destructive. I can show the director a scene in singles or with the big crane move. I can show the director a cut of the film that’s already been colored by the DIT on set, supporting the DP’s vision. And, oh, by the way, I can always tell when a shot is in or out of focus. Even with all the new technology, there is no system that compares to a skilled camera assistant who can pull focus.
Do you interface with the camera team much during production? All the time. I edit on set, at least I did with the Russos, and I’d go hang out with the DP and the DIT at his cart. I’m fascinated by the camera craft and speaking with them helps me understand how to approach the scene. There are components of the lighting and visual strategy of the scene that help tell the story, and if I know that in advance, it can help in the edit.
I’ve heard about colorists interfacing very early with the DP and DIT for LUT’s and sending high-res stills, but the editor being on
set during production sounds like something new. I’ve done it forever. Maybe because I started out as a loader, and then as a focus puller in commercials. I love cinematography and I love talking to cinematographers! The first feature I cut was James Gray’s The Yards, shot by Harris Savides [ASC]…[pause] I learned so much about movies from Harris because he had that passion in his soul. Harris pre-flashed the stock, used C Series lenses, and created this vintage look that…The Yards is a stunning movie. Like everything he shot. That relationship with Harris, like that with
the director, is so important to me because it needs to be honored all the way through the process. I want to be the guy who says, “Do not open this shot up. That’s not how it was lit on set.”
You’ve worked with some terrific DP’s. They’re like heroes to me. Tak Fujimoto, Harris, Mandy Walker [ASC, ACS], Jeff Cronenweth [ASC], Jonathan Brown…the list goes on and on. And what’s interesting now, since we’re talking about new technologies, is that when I’m on set, every take goes to the DIT cart and
then to my computer, so I can start sketching out a scene as it’s being shot. I may show it to the director, I may just have it for myself, knowing I have an approach to the scene in mind. Being on set for reshoots, with the Marvel stuff, is much more targeted as it may have to match up to a performance done much earlier – years even, like with Endgame. We talked about AI…the reason we make movies is for these collaborative relationships. They’re really fun! And I haven’t met a DP – or anyone else on that set – who isn’t as obsessed with that as I am.
Amanda Pollack
Amanda Pollack, ACE, grew up in a musical household, and that influence has informed all her work as an editor. From her earliest experiences working at Sound One in New York City’s famous Brill Building (once the home of Broadway songwriters) to her most recent job on Apple TV+’s Palm Royale [ICG Magazine April 2024], Pollack has transposed her passion for music into an approach to cinematic storytelling that finds rhythm wherever she looks. “Dialogue is music, a conversation is music – even silence is music,” she explains from her Manhattan home. “My old friend, Mark Livolsi, whom I assisted on The Devil Wears Prada, used to say that the sound and pacing were so important that if that felt right, you could almost put any picture on top and it would be okay.” Pollack says her instinct for editorial pacing comes from her musical background, and it has underscored a career path that began in commercials – cutting on film – all the way through to acclaimed TV episodics, including The Patient, The Good Fight and the multiple Emmy-winning drama The Americans.
How did you end up in editorial? Amanda Pollack: My father was a film editor in commercials. He cut on film, on a Moviola, in our basement in Westchester. But I studied writing and storytelling at Hampshire College and had no thought of working in film. At college, I was into Women’s Studies and Black Studies and even got to take a class with James Baldwin. It was my mom who suggested that [after college] I go and talk to a friend of my father’s, who ran a commercial editing house in Manhattan. My dad passed away quite young – 49 – so he never got the chance to work in long-form. But, as my mother tells it, he certainly would have made that move if he’d had the chance.
Were they still cutting on film at that commercial house? Yes, they were. I went to meet with my dad’s friend and he suggested I come on as an apprentice. He’d say: “Go sit at the table and practice splicing.” And I’d be there all day long just whacking away with filler. I was like, “Can’t I do something more [makes air quotes] important?” [Laughs.] But it was great because I learned to splice. And I loved film. You can see that 16 frames is a foot, and 24 frames is a second. The cut-and-paste of it all, being hands-on, was so much fun. It suited the crafty person in me.
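The footage math she mentions – 16 frames to a foot of 35mm, 24 frames to a second – makes converting film length to screen time a one-liner. A quick sketch of that arithmetic (the function name is just illustrative):

```python
# The 35mm arithmetic described above: 16 frames per foot, 24 frames per second.
FRAMES_PER_FOOT = 16
FRAMES_PER_SECOND = 24

def feet_to_seconds(feet: float) -> float:
    """Convert a length of 35mm film to its running time in seconds."""
    return feet * FRAMES_PER_FOOT / FRAMES_PER_SECOND

# One foot of 35mm runs two-thirds of a second at sound speed, so 90 feet
# of film is exactly one minute of screen time.
seconds_per_foot = feet_to_seconds(1)
one_minute = feet_to_seconds(90)
```

This is why 35mm footage at 24 fps is often counted in 90-foot minutes; it also puts numbers on the 40,000-feet-a-day anecdote elsewhere in this issue – well over seven hours of material.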
When did you move into narrative? After a few years of working in commercials. I wanted to work directly with creatives, and the one thing I didn’t love about commercials is you have the agency people in between the creative and client. They usually pushed the product placement and not so much the storytelling. I went over to Sound One, in the Brill Building,
when so many features were being cut there. Reds was cutting there; Martin Scorsese was cutting there. I went up to the transfer desk, where everyone would bring their production audio to be put onto 35mm mag stock, and just asked for a job. I’m forever indebted to Jay Rubin for giving me that job. Jay knew as soon as I found an internship I’d be gone. I wrote letters of introduction to many of the feature editors who worked there. I even got a few letters back. I still have one from Thelma Schoonmaker, who was there working on Cape Fear.
What else did you like about learning on film? The first feature I assisted was 35mm, and the first feature I cut myself was Super 16 on a flatbed – stock that only has sprockets on one side and tears easily. Super 16 was a challenge! But it was such a cool experience. I loved assisting film editors because you would stand right next to them. It was a true partnership as you’d try to anticipate what was the next cut they wanted. I used to pride myself on finding them a two-frame trim. The crews were also larger on a film show. On Meet Joe Black, we had a room of four assistants and we’d all be spinning reels down, synching them up. It was physical and interactive. Nothing like today.
Everything changed in the move to non-linear editing. It really did. For one thing, we’re all in separate rooms; and, thanks to COVID, a lot of us are remote. It’s hard for an assistant today to have the kind of apprenticeship I enjoyed cutting on film. Back then, the assistant was in the room with multiple editors, who were all talking to the director and producer. And we’d hear those conversations and learn a lot from them. I still invite my assistants to come in, if we’re in person. The trend is hybrid – remote for much of the show, and then in the office when it’s necessary to meet with everyone.
Did you know the AVID workflow? I did not! I interviewed for a job with David Ray, who cut Scarface, and I told him I did. When he hired me, I rushed out to take a one-week course in AVID, and then, thankfully, the facility where we were working had a lot of techs around. I’d ask them for help all the time. I used to hear them say, “Is that an RTFM?” to each other. Finally, I asked, “What is that [acronym]? I’ve never heard that one before.” They were like: “Do you really want to know?” I said, “Yes, I do. I need to learn everything about these new technologies.” And they said it stands for “Read the f••$%n manual!” [Laughs]
Have things like 4K deliverables and 8K capture impacted your craft? Not me that much, as all of our drives live somewhere else. We’re remote and coming in, through VPN and Jump [Desktop]. I just need reliable internet in my house. I can even restart what’s in the room
at the facility from my system. The large amount of data coming in really impacts the assistant editors. They have to do exports, outputs, comping all the green screens and tracking all the visual effects material. VFX supervisors and VFX editors are now commonplace on all of our jobs, and they have taken off some of that load.
What else has changed the craft? In the old days, you’d have to send a dissolve out to the lab and wait three days to get it back. And then it would have a pop in it and you’d have to send it out again! It’s crazy how much time we used to spend just getting the editor’s cut to a place where it was showable. Thanks to that polishing, and how quickly it can be done in AVID, the expectations are much higher. I turn in a cut that has sound effects, songs, and music the composer scored, if it’s a show with multiple seasons and not something brand new like Palm Royale. I do like the scoring part and adding in music.
Do you have an example? There was an episode of The Americans that had a murder, and a girl on the bus with headphones, moving to the beat of “Sweet Dreams (Are Made of This)” by Eurythmics. That was the song they used to shoot the scene, but then Annie Lennox decided we couldn’t use it. So, I quickly had to find another song from the same period that would line up with how the girl’s moving her head. I ended up using “Tainted Love” by Soft Cell. When it aired, the reviewers were all raving about how the words in the song perfectly fit the action in the scene. No one had a clue how it had evolved. And that’s how it should be. The process should be invisible to the viewer.
The pilot for Palm Royale established, from the opening shot, the tone, feel and flavor of the show. I imagine cutting pilots has changed as well. I didn’t do the Palm Royale pilot; I came in later on. But I did the pilot for The Patient, and that was a very tricky one to figure out. We don’t have a “pilot season” anymore, where the network had two months after the pilot debuted to figure out what worked for the rest of the series. Now, we get 10 episodes greenlit and off you go. The Americans pilot was made a year before it went to series.
How do you feel about AI, since so many people are predicting its impact will be felt first in postproduction? Obviously in the audio world, like with ADR, they can alter things so much and even change a performance. So, I was very happy the actors were able to negotiate protections for their work. It could change things for the writers as well; especially in research, it may play a bigger part. As for what we do, I just keep hoping that human creativity, which is what makes this industry so special, will keep us all relevant.
PRODUCTION CREDITS
COMPILED BY TERESA MUÑOZ
The input of Local 600 members is of the utmost importance, and we rely on our membership as the prime (and often the only) source of information in compiling this section. In order for us to continue to provide this service, we ask that Guild members submitting information take note of the following requests:
Please provide up-to-date and complete crew information (including Still Photographers, Publicists, Additional Units, etc.). Please note that the deadline for the Production Credits is on the first of the preceding cover month (excluding weekends & holidays).
Submit your jobs online by visiting: www.icg600.com/MY600/Report-Your-Job
Any questions regarding the Production Credits should be addressed to Teresa Muñoz at teresa@icgmagazine.com
THE STUDIO
20TH TELEVISION
“DOCTOR ODYSSEY”
DIRECTORS OF PHOTOGRAPHY: SIMON DENNIS, ASC, BSC, JOHN T. CONNOR
OPERATORS: ERIC SCHILLING, KEITH DUNKERLEY
ASSISTANTS: DAVID LEB, ROB MONROY, NATHAN CRUM, JARED WILSON
STEADICAM OPERATOR: ERIC SCHILLING
STEADICAM ASSISTANT: DAVID LEB
DIGITAL IMAGING TECH: SCOTT RESNICK
LOADER: SONIA BARRIENTOS
“MID-CENTURY MODERN” PILOT
DIRECTOR OF PHOTOGRAPHY: GARY BAUM, ASC
OPERATORS: ALEC ELIZONDO, DEBORAH O’BRIEN, LANCE BILLITZER, EDDIE FINE
ASSISTANTS: NIGEL STEWART, BRADLEY TRAVER, CHRIS WORKMAN, JEFF ROTH, YUKA KADONO
CAMERA UTILITIES: DAN LORENZE, RICHIE FINE
DIGITAL IMAGING TECH: DEREK LANTZ
VIDEO CONTROLLER: JOHN O’BRIEN
ABC STUDIOS
“GODFATHER OF HARLEM” SEASON 4
DIRECTORS OF PHOTOGRAPHY: JACK DONNELLY, JAY FEATHER
OPERATORS: ERIC ROBINSON, ALAN MEHLBRECH
ASSISTANTS: JEROME WILLIAMS, MARC CHARBONNEAU, MIKE SWEARINGEN, VINCENT LARAWAY
STEADICAM OPERATOR: ALAN MEHLBRECH
LOADERS: THOMAS FOY, RALEIGH CAPOZZALO
STILL PHOTOGRAPHERS: SCOTT MCDERMOTT, LINDA KALLERUS, DAVID GIESBRECHT
APPLE STUDIOS
“YOUR FRIENDS AND NEIGHBORS AKA SWIPE”
DIRECTOR OF PHOTOGRAPHY: ZACHARY GALLER
OPERATORS: PHILIP J. MARTINEZ, SOC, JUSTIN FOSTER
ASSISTANTS: GUS LIMBERIS, RANDY LEE SCHWARTZ, NATHALIE RODRIGUEZ
STEADICAM OPERATOR: PHILIP J. MARTINEZ, SOC
DIGITAL IMAGING TECH: LOIC DE LAME
LOADER: SEAN GALCZYK
CAMERA UTILITIES: RICHARD PENA, MAGGIE HUGHES
STILL PHOTOGRAPHERS: JESSICA KOURKOUNIS, CARA HOWE, SARAH SHATZ, PAUL SCHIRALDI, DAVID GIESBRECHT
ATS PRODUCTION, LLC
“AT THE SEA”
DIRECTOR OF PHOTOGRAPHY: YORICK LESAUX
OPERATOR: M. DEAN EGAN
ASSISTANTS: JAMIE FITZPATRICK, TOM BELLOTTI
DIGITAL UTILITY: KEENAN KIMETTO
STILL PHOTOGRAPHERS: ROBERT CLARK, SEACIA PAVAO
BEACHWOOD SERVICES, INC.
“DAYS OF OUR LIVES” SEASON 59
DIRECTOR OF PHOTOGRAPHY: DAVID MEAGHER
OPERATORS: MARK WARSHAW, MICHAEL J. DENTON, JOHNNY BROMBEREK, JOHN BOYD, STEVE CLARK
CAMERA UTILITY: GARY CYPHER
VIDEO CONTROLLER: ALEXIS DELLAR HANSON
BIG INDIE OMEGA, INC.
“UNTITLED BALLET SHOW”
DIRECTORS OF PHOTOGRAPHY: DAVID MULLEN, ASC, ALEX NEPOMNIASCHY, ASC
OPERATORS: JIM MCCONKEY, NIKNAZ TAVAKOLIAN
ASSISTANTS: ANTHONY CAPPELLO, LIZ SINGER, BRANDON BABBIT, BRIAN GIALLORENZO
DIGITAL IMAGING TECH: MALIKA FRANKLIN
LOADERS: TREVOR BRENDEN, NICHOLAS MISISCO
STILL PHOTOGRAPHERS: PHILIPPE ANTONELLO, LESLEY ROBSON-FOSTER, DANIEL NOVARRO
CBS STUDIOS
“ELSBETH” SEASON 2
DIRECTOR OF PHOTOGRAPHY: JOHN ARONSON
OPERATORS: BARNABY SHAPIRO, KATE LAROSE
ASSISTANTS: SOREN NASH, RENE CROUT, NIALANEY RODRIGUEZ, ALISA COLLEY
LOADERS: PARKER RICE, JANAE HARRISON
STILL PHOTOGRAPHERS: MICHAEL PARMELEE, ERIC LIEBOWITZ
“NCIS” SEASON 22
DIRECTOR OF PHOTOGRAPHY: WILLIAM WEBB
OPERATORS: GREG COLLIER, CHAD ERICKSON
ASSISTANTS: JAMES TROOST, NATE LOPEZ, HELEN TADESSE, YUSEF EDMONDS, ANNA FERRARIE, DREW HAN CHO
LOADER: MIKE GENTILE
“NCIS: ORIGINS” PILOT
DIRECTOR OF PHOTOGRAPHY: KEVIN MCKNIGHT
OPERATORS: MICHAEL ALBA, MATT VALENTINE
ASSISTANTS: TAYLOR FENNO, KEVIN MILES, RICH FLOYD, HUNTER JENSEN
LOADER: VICTORIA BETANCOURT
DIGITAL IMAGING TECH: BRANNON BROWN
DIGITAL UTILITY: BECKY CINTORA
“SEAL TEAM” SEASON 7
DIRECTOR OF PHOTOGRAPHY: ERIC LEACH
OPERATORS: DOMINIC BARTOLONE, THOMAS TIECHE
ASSISTANTS: TODD AVERY, ARTURO ROJAS, RYAN JACKSON, JULIO ZEPEDA, GARY WEBSTER
STEADICAM OPERATOR: DOMINIC BARTOLONE
DIGITAL IMAGING TECHS: RAUL RIVEROS, TIM BALCOMB
LOADER: WILLIAM RANDALL
CHOICE FILMS, INC.
“THE ACCUSED”
OPERATORS: ANDREW PRIESTLEY, SAWYER OUBRE
ASSISTANTS: MICHAEL BELARDI, MATT LYNCH, ANDREW BOYD, KATE IUELE
STILL PHOTOGRAPHER: VINCENT VERSACE
“THE OUTLAWS”
DIRECTOR OF PHOTOGRAPHY: JOHN INWOOD
OPERATOR: DAVID TAICHER
ASSISTANTS: DOUGLAS FOOTE, DONALD GAMBLE
STILL PHOTOGRAPHER: DAVID SCOTT HOLLOWAY
CMS PRODUCTIONS
“QUEEN OF THE DEAD”
DIRECTOR OF PHOTOGRAPHY: SHANNON MADDEN
ASSISTANTS: NICALENA IOVINO, JORDAN COOKE-LEONARD
STEADICAM OPERATOR: TOM WILLS
“ROYALTY”
DIRECTOR OF PHOTOGRAPHY: MATTHEW LIBATIQUE, ASC
OPERATORS: JULIAN DELACRUZ, JASON ROBBINS
ASSISTANTS: AURELIA WINBORN, MICHAEL GUTHRIE, ELIZABETH HEDGES, EMMALINE HING
DIGITAL IMAGING TECH: JEFFREY LAWTON FLOHR
LOADERS: HOLDEN HLINOMAZ, ELIZABETH COMPTON
STILL PHOTOGRAPHER: DAVID LEE
UNIT PUBLICIST: CID SWANK
2ND UNIT
DIRECTOR OF PHOTOGRAPHY: RICARDO SARMIENTO
COOLER WATER PRODUCTIONS, LLC
“THE CHAIR COMPANY” PILOT
DIRECTOR OF PHOTOGRAPHY: ASHLEY CONNOR
OPERATOR: JENNIE JEDDRY
ASSISTANTS: KALI RILEY, DEAN MARTINEZ, ADAM RUSSELL, HOLLY MCCARTHY
DIGITAL IMAGING TECH: BRIAN CARDENAS
STILL PHOTOGRAPHER: VIRGINIA SHERWOOD
CRANETOWN MEDIA, LLC
“THE BETTER SISTER” SEASON 1
DIRECTORS OF PHOTOGRAPHY: ISIAH DONTE LEE, DUANE MANWILLER
OPERATORS: MARK SCHMIDT, AILEEN TAYLOR
ASSISTANTS: SAMANTHA SILVER, CAMERON SIZEMORE, ANABEL CAICEDO, FRANK MILEA, BENEDICT BALDAUFF
DIGITAL IMAGING TECHS: HUNTER FAIRSTONE, MICHAEL ASHLEY
LOADERS: JERON BLACK, HUSSEIN FARRAJ
DENISE SZALMA
STILL PHOTOGRAPHERS: JOJO WHILDEN, EMILY ARAGONES, CRAIG BLANKENHORN, CARA HOWE
DISH SERVED COLD PRODUCTIONS, LLC
“MOTOR CITY”
DIRECTOR OF PHOTOGRAPHY: JOHN MATYSIAK
OPERATORS: SAM WILLEY, DAVID MCFARLAND
ASSISTANTS: GEOFF STORTS, JAMES SCHLITTENHART, ALEC NICKEL, JOSH REYES
DIGITAL IMAGING TECH: ANTHONY HECHANOVA
LOADERS: ANDREW TRICE, TORI DUNN
STILL PHOTOGRAPHER: MATTHEW INFANTE
DXO PRODUCTIONS, LLC
“DEXTER: ORIGINAL SIN”
DIRECTOR OF PHOTOGRAPHY: ED PEI
OPERATORS: BRIAN BERNSTEIN, JORDAN KESLOW
ASSISTANTS: JAMES SPRATTLEY, JAMES DUNHAM, MARYAN ZUREK, JEREMY HILL
LOADER: ANDREW FLORIO
STEADICAM OPERATOR: JORDAN KESLOW
CAMERA UTILITY: KAREN CLANCY
HBO
“THE GILDED AGE” SEASON 3
DIRECTORS OF PHOTOGRAPHY: MANUEL BILLETER, CHRIS LA VASSEUR
OPERATORS: OLIVER CARY, PYARE FORTUNATO, SCOTT TINSLEY
ASSISTANTS: JOHN OLIVERI, TROY SOLA, BRENDAN RUSSELL, BRIAN LYNCH, LOTTE SKUTCH
STEADICAM OPERATOR: PYARE FORTUNATO
DIGITAL IMAGING TECHS: MATT SELKIRK, JAKOB FRIEDMAN
LOADER: BRANDON OSBORN
STILL PHOTOGRAPHERS: KAROLINA WOJTASIK, JON PACK
HIGH ROLLER PRODUCTIONS, LLC
“POKER FACE” SEASON 2
DIRECTORS OF PHOTOGRAPHY: JARON PRESANT, CHRISTINE NG, TARI SEGAL
OPERATORS: REBECCA ARNDT, NADINE MARTINEZ
ASSISTANTS: HAMILTON LONGYEAR, COURTNEY BRIDGERS, KELLON INNOCENT, AMBER MATHES
DIGITAL IMAGING TECH: GEORGE ROBERT MORSE
LOADER: AARON CHAMPAGNE
STILL PHOTOGRAPHER: SARAH SHATZ
HORIZON SCRIPTED TELEVISION, INC.
“YOU” SEASON 5
DIRECTOR OF PHOTOGRAPHY: MOTT HUPFEL
OPERATORS: THOMAS SCHNAIDT, DANIEL HERSEY
ASSISTANTS: MARCOS RODRIGUEZ-QUIJANO, BEHNOON DADFAR, TONI SHEPPARD, KYRA KILFEATHER
DIGITAL IMAGING TECH: GUILLERMO TUNON
LOADER: ANGEL VASQUEZ
STILL PHOTOGRAPHERS: PHIL CARUSO, CLIFTON PRESCOD
KING STREET PRODUCTIONS
“TULSA KING” SEASON 2
DIRECTORS OF PHOTOGRAPHY: JOHN LINDLEY, ASC, JEFFREY GREELEY
OPERATORS: HENRY CLINE, BRANDON THOMPSON, JULES LABARTHE
ASSISTANTS: ERIC LEFTRIDGE, SCOTT FORTE, DANNY VANZURA, PETER JOHNSTON
STEADICAM OPERATOR: BRANDON THOMPSON
STEADICAM ASSISTANT: SCOTT FORTE
LOADER: BRENDAN NAJEE RAWLINS
DIGITAL UTILITY: AMANDA ASHLEY
STILL PHOTOGRAPHER: BRIAN DOUGLAS
MIXED BAG PRODUCTIONS
“RIGHTEOUS GEMSTONES” SEASON 4
DIRECTORS OF PHOTOGRAPHY: PAUL DALEY, MICHAEL SIMMONDS
OPERATORS: PETER VIETRO HANNUM, BARRET BURLAGE, TIM SUTHERLAND, DON DAVIS, DAN JONES, JAMIE SILVERSTEIN
ASSISTANTS: DAMON LEMAY, JAMES THOMAS, MATTHEW MEBANE, NICK BROWN, JUSTIN SIMPSON, EMILY RUDY, JUSTIN URBAN, DOUG TORTORICI, LAURA ROBINSON, BRODY DOCAR, AMANDA ROTZLER, OREN MALIK
DIGITAL IMAGING TECH: CHANDLER TUCKER
CAMERA LOADER: ERICH COMBS
STILL PHOTOGRAPHER: JAKE GILES NETTER
MONTEREY PICTURES, INC.
“UNTITLED MLB PROJECT”
DIRECTOR OF PHOTOGRAPHY: TERRY ZUMALT
OPERATORS: DEVON HOFF-WEEKES, DAVID NEWTON
ASSISTANTS: DEVIN KEEBLER, ETHAN SERLING, CASSIE COKER
DIGITAL IMAGING TECH: JOSHUA GREYTAK
NBC UNIVERSAL TELEVISION, LLC
“CHICAGO MED” SEASON 10
DIRECTOR OF PHOTOGRAPHY: SHAWN MAURER
OPERATORS: JOE TOLITANO, BILLY NIELSEN
ASSISTANTS: GEORGE OLSON, PATRICK DOOLEY, BRIAN KILBORN, RICHARD COLMAN, MATTHEW WILBAT, JJ LITTLEFIELD
STEADICAM OPERATOR: CHRISTOPHER GLASGOW
LOADER: TREVOR SNYDER
DIGITAL UTILITY: TRENTON LUETTICH
STILL PHOTOGRAPHER: GEORGE BURNS
“FBI” SEASON 7
DIRECTOR OF PHOTOGRAPHY: BART TAU
OPERATORS: AFTON GRANT, ANDY FISHER
ASSISTANTS: ADAM GONZALEZ, YURI INOUE, MIKE LOBB, MARVIN LEE
LOADERS: MATTHEW JENSEN, DAVID DIAZ
STILL PHOTOGRAPHER: BENNETT RAGLIN
“FBI MOST WANTED” SEASON 6
DIRECTOR OF PHOTOGRAPHY: LUDOVIC LITTEE
OPERATORS: CHRIS MOONE, SCOTT TINSLEY
ASSISTANTS: JOHN FITZPATRICK, DAN PFEIFER, JOHN CONQUY, TYLER MANCUSO
LOADERS: ANTHONY VITALE, HUSSEIN FARRAJ
STILL PHOTOGRAPHER: MARK SCHAFER
“LAW & ORDER” SEASON 24
DIRECTOR OF PHOTOGRAPHY: JON DELGADO
OPERATORS: DEKE KEENER, BEAU GRANTLAND
ASSISTANTS: JASON RIHALY, JACOB STAHLMAN, EMILY DUMBRILL, KELSEY MIDDLETON
STEADICAM OPERATOR: RICHARD KEENER
LOADER: LISA CHIN
STILL PHOTOGRAPHERS: VIRGINIA SHERWOOD, IAN BRACONE, MICHAEL PARMELEE
“LAW & ORDER: SPECIAL VICTIMS UNIT” SEASON 25
DIRECTOR OF PHOTOGRAPHY: FELIKS PARNELL
OPERATORS: JON HERRON, CHRISTOPHER DEL SORDO
ASSISTANTS: JOSEPH METZGER, CHRISTIAN CARMODY, RYAN HADDON, LIAM GANNON, MARY NEARY
STEADICAM OPERATOR: JONATHAN HERRON
LOADER: JAMES WILLIAMS
STILL PHOTOGRAPHERS: PETER KRAMER, VIRGINIA SHERWOOD, EMILY ARAGONES
NETFLIX PRODUCTIONS, LLC
“BLACK RABBIT” SEASON 1
DIRECTOR OF PHOTOGRAPHY: IGOR MARTINOVIC
OPERATOR: ARI ISSLER
ASSISTANTS: ALEXANDER WORSTER, STEPHEN MCBRIDE, ANJELA COVIAUX, YALE GROPMAN
STEADICAM OPERATOR: MATTHEW PEBLER
DIGITAL IMAGING TECH: LUKE GARAI TAYLOR
LOADERS: MCKENZIE JAMES RAYCROFT, DAVID STOREY
“SIRENS”
DIRECTOR OF PHOTOGRAPHY: GREG MIDDLETON
OPERATORS: JEFF DUTEMPLE, JOHN GARRETT
ASSISTANTS: ERIC SWANEK, EMMA REESE-SCANLON, TYLER SWANEK, TAYLOR PRINZIVALLI
DIGITAL IMAGING TECH: CHANDLER TUCKER
LOADER: JEFF DICKERSON
STILL PHOTOGRAPHERS: MACALL POLAY, CHRIS SAUNDERS, EMILY ARAGONES
“SURVIVAL OF THE THICKEST” SEASON 2
DIRECTOR OF PHOTOGRAPHY: DAGMAR WEAVER-MADSEN
OPERATORS: CHRIS WAIREGI, QIANZHI SHEN
ASSISTANTS: MARCOS HERRERA, TIM TROTMAN, BABETTE GIBSON, ALLY HOOVER
LOADER: SKYE WILLIAMS
STILL PHOTOGRAPHER: VANESSA CLIFTON
NO TICKETS PRODUCTIONS, LLC
“DIARIO, MUJER Y CAFE”
DIRECTOR OF PHOTOGRAPHY: BRENDALIZ NEGRON
OPERATORS: CARLOS ZAYAS, HECTOR SANTOS
ASSISTANTS: MARAYDA CABRERA, ADAM SANTOS, ZORAIDA LUNA, ANDRES VILA
LOADER: NESTER CESTERO
DIGITAL UTILITY: VICTOR RODRIGUEZ CASTRO
STILL PHOTOGRAPHERS: ROBERT CLARK, SEACIA PAVAO
PARALLAX TV PRODUCTIONS, LLC
“THE PIT” SEASON 1
DIRECTOR OF PHOTOGRAPHY: JOHANNA COELHO
OPERATORS: RYAN WOOD, AYMAE SULICK
ASSISTANTS: JACOB DEPP, KIRSTEN CELO, PETER DEPHILIPPIS, KELLSIE DOMNITZ
DIGITAL IMAGING TECH: JEFFERSON FUGITT
DIGITAL UTILITY: TOSHA PALANI
STILL PHOTOGRAPHER: GREG LEWIS
RANDOM PRODUCTIONS, LLC
“TASK FORCE”
DIRECTORS OF PHOTOGRAPHY: ALEXANDER DISENHOF, ELIE SMOLKIN
OPERATORS: SHAWN SUNDBY, RYAN BALDWIN
ASSISTANTS: TROY DOBBERTIN, KIMBERLY HERMAN, MIKE TOLAND, ALEC FREUND, JAMES MCCANN
DIGITAL IMAGING TECH: MATTHEW SELKIRK
STEADICAM OPERATOR: STEWART CANTRELL
LOADERS: MAD BISHOP, CORRINE MCANDREWS
STILL PHOTOGRAPHERS: PETER KRAMER, RYAN COLLERD, KAROLINA WOJTASIK
ROBERTS MEDIA, LLC
“GET HIM BACK FOR CHRISTMAS”
DIRECTOR OF PHOTOGRAPHY: RYAN GALVAN
ASSISTANTS: DAGAN REINHARDT, NICKY FUCHS
DIGITAL IMAGING TECH: HENRIKAS GENUTIS
“MY GROWN UP CHRISTMAS WISH”
DIRECTOR OF PHOTOGRAPHY: RYAN GALVAN
ASSISTANTS: JOHN WATERMAN, PATTI NOONAN
DIGITAL IMAGING TECH: HENRIKAS GENUTIS
SANTA MONICA SUMMER, INC.
“SUMMER OF 69”
CREW PHOTO: “QUEEN OF THE DEAD”
FROM LEFT TO RIGHT: JORDAN COOK-LEONARD (2ND AC), TOM WILLS (CAMERA/STEADICAM OPERATOR), SHANNON MADDEN (DIRECTOR OF PHOTOGRAPHY), NICK IOVINO (1ST AC)
PHOTO BY SPENCER PAZER
DIRECTOR OF PHOTOGRAPHY: MARIA RUSCHE
OPERATOR: PATRICK MORGAN
ASSISTANTS: MAX BATCHELDER, SYMON MINK, NOTES KAEWBAIDHOON, JOSIAH WEINHOLD
DIGITAL IMAGING TECH: MICHAEL POMORSKI
STILL PHOTOGRAPHER: BRETT ROEDEL
SHAKE IT UP/DISNEY
“FREAKY FRIDAY 2 AFTERSHOCK”
DIRECTOR OF PHOTOGRAPHY: MATTHEW CLARK, ASC
OPERATORS: DAVID EMMERICHS, JANICE MIN
ASSISTANTS: SARAH GALLEY, DANNY BROWN, LISA GUERRIERO, CASEY MULDOON
STEADICAM OPERATOR: DAVID EMMERICHS
STEADICAM ASSISTANT: SARAH GALLEY
DIGITAL IMAGING TECH: NINA CHADHA
LOADER: KELLY FILLINGER
DIGITAL UTILITY: ROBIN KIM
STILL PHOTOGRAPHER: GLEN WILSON
SONY PICTURES TELEVISION
“JEOPARDY!” SEASON 39
DIRECTOR OF PHOTOGRAPHY: JEFF ENGEL
OPERATORS: DIANE L. FARRELL, SOC, MIKE TRIBBLE, JEFF SCHUSTER, L. DAVID IRETE
JIB ARM OPERATOR: MARC HUNTER
HEAD UTILITY: TINO MARQUEZ
CAMERA UTILITY: RAY THOMPSON
VIDEO CONTROLLER: JEFF MESSENGER
VIDEO UTILITIES: MICHAEL CORWIN, JEFF KLIMUCK
STILL PHOTOGRAPHER: TYLER GOLDEN
“WHEEL OF FORTUNE” SEASON 40
DIRECTOR OF PHOTOGRAPHY: JEFF ENGEL
OPERATORS: DIANE L. FARRELL, SOC, L. DAVID IRETE, RAY GONZALES, MIKE TRIBBLE
HEAD UTILITY: TINO MARQUEZ
CAMERA UTILITY: RAY THOMPSON
VIDEO CONTROLLER: JEFF MESSENGER
VIDEO UTILITIES: MICHAEL CORWIN, JEFF KLIMUCK
JIB ARM OPERATOR: STEVE SIMMONS
STILL PHOTOGRAPHER: CAROL KAELSON
STALWART PRODUCTIONS, LLC
“THE TERROR: DEVIL IN SILVER” SEASON 3
DIRECTOR OF PHOTOGRAPHY: JULIE KIRKWOOD
OPERATORS: RYAN TOUSSIENG, JULIEN ZEITOUNI
ASSISTANTS: ANDREW PECK, RANDY MALDONADO GALARZA, EMMALINE HING, ROSE FORMAN
DIGITAL IMAGING TECH: CURTIS ABBOTT
LOADERS: AMELIA SUMMAR, MAX SCHWARZ
STILL PHOTOGRAPHER: EMILY ARAGONES
STAMFORD MEDIA CENTER AND PRODUCTIONS, LLC
“STAMFORD MEDIA CENTER-WILKOS” SEASON 17
OPERATORS: RON THOMPSON, VICTOR MATHEWS, ANTHONY LENZO, MARC NATHAN, DOMINICK CIARDIELLO, JON ROSE, CHARLES BEDI
ASSISTANT: ROBERT BENEDETTI
CAMERA UTILITIES: JOE MANCUSI, ANTHONY DEFONZO, ROBERT FRITCHE, FRANK CAIOLA
CHYRON OPERATOR: DAVID KATZ
SUMMER 1, LLC
“THE SUMMER I TURNED PRETTY” SEASON 3
DIRECTOR OF PHOTOGRAPHY: SANDRA VALDE-HANSEN
OPERATORS: MATTHEW DOLL, MICHAEL REPETA
ASSISTANTS: ALAN ALDRIDGE, SEAN YAPLE, SETH LEWIS, NICK COCUZZA
CAMERA UTILITY: HAILEY NELMS
LOADER: BRANDON ROBEY
STILL PHOTOGRAPHER: ERIKA DOSS
THJ, LLC
“THE HOLIDAY JUNKIE”
DIRECTOR OF PHOTOGRAPHY: DUANE MIELIWOCKI
OPERATOR: MIKE SHARP
ASSISTANTS: CLAUDIO BANKS, TODD DURBORAW
STEADICAM OPERATOR: MIKE SHARP
LOADER: BEN IKER
TURNER NORTH CENTER PRODUCTIONS, INC.
“AND JUST LIKE THAT” SEASON 3
DIRECTORS OF PHOTOGRAPHY: TIMOTHY NORMAN, ANDREI BOWDEN SCHWARZ
OPERATORS: PETER AGLIATA, TODD ARMITAGE
ASSISTANTS: CHRISTOPHER ENG, JOHN REEVES, SARAH SCRIVENER, MABEL SANTOS HAUGEN
DIGITAL IMAGING TECH: ANDREW NELSON
LOADERS: NYLE HIGGS, PHILIP BABICH
STILL PHOTOGRAPHER: CRAIG BLANKENHORN
UNIVERSAL TELEVISION
“THE EQUALIZER” SEASON 5
DIRECTORS OF PHOTOGRAPHY: TERRENCE L. BURKE, CLIFF CHARLES
OPERATORS: JOE BLODGETT, MALCOLM PURNELL, RICARDO SARMIENTO
ASSISTANTS: STACY MIZE, LOLA BANKS, CHRIS GLEATON, ROB WRASE, ZAKIYA LUCAS-MURRAY, COLIN MORRIS
DIGITAL IMAGING TECH: TIFFANY ARMOUR-TEJADA
LOADERS: CHRIS BAZATA, ALEX LILJA
STILL PHOTOGRAPHERS: EMILY ARAGONES, MICHAEL GREENBERG
VIBRANT PRODUCTIONS, LLC
“UNTITLED BIGELOW-OPPENHEIM”
DIRECTOR OF PHOTOGRAPHY: BARRY ACKROYD, BSC
OPERATORS: GREGOR TAVENNER, KATHERINE CASTRO, ALAN PIERCE
ASSISTANTS: NOLAN BALL, CORY STAMBLER, JASON BRIGNOLA, TIM METIVIER, CHRISTINA CARMODY, JAMES DEAN DRUMMOND, EVE STRICKMAN, ANDY HENSLER
DIGITAL IMAGING TECH: KYO MOON
LOADERS: CLAIRE SNODE, PAUL SPANG, SAM FORNASIERO
STILL PHOTOGRAPHER: EROS HOAGLAND
COMMERCIALS
ACCOMPLICE MEDIA
“WELLPOINT”
DIRECTOR OF PHOTOGRAPHY: NICK TAYLOR
ASSISTANTS: KC CAPEK, MATT ARREDONDO
DIGITAL IMAGING TECH: THOMAS ZIMMERMAN
AE COMMERCIALS, LLC
“DUNKIN”
OPERATORS: JANICE MIN, DAVID WELDON, SAMUEL BUTT
ASSISTANTS: MICHAEL PANCZENKO, JR., STEVE WONG, ERIC GUERIN, JINUK LEE, KARLA MENDOZA
STEADICAM OPERATOR: JANICE MIN
STEADICAM ASSISTANT: MICHAEL PANCZENKO, JR.
DIGITAL IMAGING TECH: MARGARET PARUS
ANONYMOUS CONTENT
“VW ATLAS”
DIRECTOR OF PHOTOGRAPHY: CHRIS BLAUVELT
ASSISTANTS: MIKE BLAUVELT, PETER PARSON, JOHN PARSON, COURTNEY MILLER, CARRIE LAZAR, NOAH GLAZER
DIGITAL IMAGING TECH: SEAN GOLLER
BLOND
“STANLEY”
DIRECTOR OF PHOTOGRAPHY: KELLY JEFFERY
OPERATOR: JUN LI
ASSISTANTS: PAYAM YAZDANDOOST, CHRIS MARIUS JONES
STEADICAM OPERATOR: JUN LI
DIGITAL IMAGING TECH: ROB LYNN
TECHNOCRANE TECH: DERRICK ROSE
REMOTE HEAD TECH/OPERATOR: CHRISTIAN HURLEY
CMS PRODUCTIONS
“TOYOTA 4 RUNNER”
DIRECTOR OF PHOTOGRAPHY: SAM CHASE
OPERATORS: JULIA LIU, PATRICK QUINN, NATHAN SWINGLE, DAN MASON
ASSISTANTS: ASA REED, MARY ANNE JANKE, FELIX GIUFFRIDA, CHRIS MALENFANT, AUDREY STEVENS, MICHAEL RODRIGUEZ TORRENT
DIGITAL IMAGING TECH: WILL FORTUNE
LOADER: MATT MEIGS
DIGITAL UTILITY: KEENAN KIMETTO
DUO FILMS
“TK”
DIRECTOR OF PHOTOGRAPHY: PAUL THEODOROFF
ASSISTANTS: ERICK AGUILAR, BRYAM AGUILAR
DIGITAL IMAGING TECH: COLIN WEINBERG
EPOCH
“COMCAST”
DIRECTOR OF PHOTOGRAPHY: ALEXIS ZABE
ASSISTANTS: WAYNE GORING, KARLA MENDOZA
STEADICAM OPERATOR: RENARD CHEREN
DIGITAL IMAGING TECH: NINA CHADHA
FEATURE
“LOLLIPOP”
DIRECTOR OF PHOTOGRAPHY: LEONIDAS JARAMILLO
ASSISTANTS: JOSE DE LOS ANGELES, BEN MOHLER
DIGITAL IMAGING TECH: NATE KALUSHNER
GIFTED YOUTH
“NATIONWIDE”
DIRECTOR OF PHOTOGRAPHY: CRISTINA DUNLAP
OPERATORS: SHANELE ALVAREZ, SOC, RACHEL DUSA
ASSISTANTS: KYLE PETITJEAN, JENNA HOFFMAN, EVAN WILHELM, JASON GARCIA, LIAM MILLER, JONATHAN DEC
DIGITAL IMAGING TECH: NINA CHADHA
GLP PRODUCTIONS, LLC
“GLP TITLEIST”
DIRECTOR OF PHOTOGRAPHY: DAVID WILSON
ASSISTANTS: JILL TUFTS, MICHAEL RODRIGUEZ TORRENT
DIGITAL IMAGING TECH: MATTIE HAMER
HUNGRY MAN
“MILLER LITE”
DIRECTOR OF PHOTOGRAPHY: BENN MARTENSON
ASSISTANTS: IAN CONGDON, TAMARA ARROBA
DIGITAL IMAGING TECH: BRANNON BROWN
“WHATSAPP”
DIRECTOR OF PHOTOGRAPHY: JONATHAN SELA
OPERATOR: LAURENT SORIANO
ASSISTANTS: LILA BYALL, MICHAEL ASHE, KELLY SIMPSON, NOAH GLAZER, GAVIN GROSSI
DIGITAL IMAGING TECH: JORDAN HARRIMAN
ICONOCLAST
“NIKE”
DIRECTOR OF PHOTOGRAPHY: JAMES LAXTON, ASC
ASSISTANTS: ALEX SCOTT, JONATHAN CLARK
DIGITAL IMAGING TECH: KEVIN SHIRAMIZU
“PROJECT DH NY”
DIRECTOR OF PHOTOGRAPHY: MICHAEL MERRIMAN
ASSISTANTS: PETER MORELLO, NATE MCGARIGAL, ADAM MILLER
DIGITAL IMAGING TECH: TYLER ISAACSON
LITTLE PRINCE
“SOFI”
OPERATOR: CAMERON DUNCAN
ASSISTANTS: CARRIE LAZAR, NOAH GLAZER, LILA BYALL, GAVIN GROSSI
DIGITAL IMAGING TECH: CASEY SHERRIER
LORD DANGER
“FINRA”
DIRECTOR OF PHOTOGRAPHY: KAI SAUL
OPERATOR: JUN LI
ASSISTANTS: PAYAM YAZDANDOOST, CHRIS JONES
DIGITAL IMAGING TECH: KYLE HOLLAR
MINTED
“PINK SAND”
DIRECTOR OF PHOTOGRAPHY: JONATHAN SELA
OPERATOR: CHRIS CUNNINGHAM
ASSISTANTS: LILA BYALL, NOAH GLAZER
DIGITAL IMAGING TECH: STEVE HARNELL
OMAHA PRODUCTIONS, LLC
“KINGS HAWAIIAN”
OPERATORS: BRYCE PLATZ, ADAM HULL, JR KRAUS, DANNY BROWN
ASSISTANTS: DARRELL NASH, AARON SELLER, KALEN WOLF
DIGITAL IMAGING TECH: TRAVIS THOMPSON
OBJECT & ANIMAL
“EBAY”
DIRECTOR OF PHOTOGRAPHY: EVAN PROSOFSKY
ASSISTANTS: JASMINE CHANG, JOE ASHI
DIGITAL IMAGING TECH: NINA CHADHA
RADICAL MEDIA
“AMAZON FIRE”
DIRECTOR OF PHOTOGRAPHY: CRISTINA DUNLAP
ASSISTANTS: KYLE PETITJEAN, COLLEEN MLEZIVA
DIGITAL IMAGING TECH: NINA CHADHA
“STAND STRONG CAMPAIGN 24”
DIRECTOR OF PHOTOGRAPHY: MATTHEW BALLARD
ASSISTANTS: TOM ATWELL, GREG PACE
STEADICAM OPERATOR: DEVON CATUCCI
LOADER: GREGORY HOWARD
SLIM PICTURES
“PROJECT AUSTIIN”
DIRECTOR OF PHOTOGRAPHY: PAUL THEODOROFF
ASSISTANTS: ERICK AGUILAR, BRYAM AGUILAR, SEAN DELAHUNT
DIGITAL IMAGING TECH: COLIN WEINBERG
SMUGGLER, INC.
“WEIGHT WATCHERS”
DIRECTOR OF PHOTOGRAPHY: MIKA ALTSKAN
OPERATOR: CALVIN FALK
ASSISTANTS: FILIPP PENSON, GOVINDA ANGULO, JORDAN HRISTOV, MICHAEL CAMBRIA, KYLE REPKA, SEAN FOLKI
DIGITAL IMAGING TECH: WILL FORTUNE
LOADER: GREGORY HOWARD
STINK FILMS USA
“NBA PLAYOFFS”
DIRECTOR OF PHOTOGRAPHY: RINA YANG
OPERATOR: DAVE KESSLER
ASSISTANTS: KC CAPEK, JOHN WATERMAN, DILLON BORHAM, MAC KOZI
DIGITAL IMAGING TECH: HENRIKAS GENUTIS
“PNC, GET RICH QUICK”
DIRECTOR OF PHOTOGRAPHY: DOUG P. GORDON
ASSISTANT: MARY ANNE JANKE
SUPPLY & DEMAND
“MAZDA”
DIRECTOR OF PHOTOGRAPHY: SAMUEL BAYER
OPERATOR: JESS CANNON
ASSISTANTS: NICOLAS MARTIN, JAMES BARELA, ROBYN BUCHANAN, LUIS GOMEZ
DIGITAL IMAGING TECH: FABRICIO DISANTO
ARM CAR OPERATOR: ROB RUBIN
OCULUS TECH: YURIY FUKS
SWEET RICKEY
“DRAFT KINGS”
DIRECTOR OF PHOTOGRAPHY: KIP BOGDAHN
OPERATOR: KRISTY TULLY
ASSISTANTS: CARRIE LAZAR, NOAH GLAZER, LILA BYALL
DIGITAL IMAGING TECH: CASEY SHERRIER
UTILITY: ANDI CORBAXHI
TOOL OF NORTH AMERICA
“AFLAC”
DIRECTOR OF PHOTOGRAPHY: ERIC SCHMIDT
ASSISTANTS: LILA BYALL, DANIEL HANYCH, LAURA GOLDBERG, GAVIN GROSSI, ERIC MATOS
DIGITAL IMAGING TECH: JOHN SPELLMAN
“AMAZON”
DIRECTOR OF PHOTOGRAPHY: ERIC SCHMIDT
ASSISTANTS: LAURA GOLDBERG, JOEL MARTIN
DIGITAL IMAGING TECH: JOHN SPELLMAN
MELINDA SUE GORDON, SMPSP
TWISTERS
“I knew what I was getting into when I signed on for Twisters, but the opportunity to work with Director Lee Isaac Chung and Director of Photography Dan Mindel, ASC, BSC, outweighed practical considerations. Wind, water and smoke were all givens. This image and vantage point started out reasonably tame, until a Ritter fan turned toward me – and then, wipe out! Happily, throughout, there were enough moments before visual obliteration to celebrate the people and practical special effects it took to tell this story. I hope the audience has as wonderful an experience as we had.”