VFX Voice - Fall 2019 Issue


VFXVOICE.COM FALL 2019

THE ADDAMS FAMILY REANIMATED

HOW TO START A VFX STUDIO • FUTURE TECH: LIGHT FIELDS • AD ASTRA • STRANGER THINGS 3 • FOCUS: VFX FILM SCHOOLS • PROFILES: JEN UNDERDAHL & ANNE KOLBE




[ EXECUTIVE NOTE ]

Welcome to the Fall issue of VFX Voice! Awards season is in full bloom, rich with opportunities to recognize the artists and innovators behind the outstanding visual imagery that drives filmed entertainment. Congratulations to all of the Emmy Award nominees and winners for your outstanding achievements in visual effects. And with some great VFX-driven films and new TV shows capping off the year, we’re excited for what comes next as we head towards the VES Awards!

In this issue, our cover feature spends time with that creepy and kooky clan, The Addams Family. We go behind the scenes of Stranger Things, Ad Astra and Ford v Ferrari, and take a look at the season’s most-anticipated film and TV titles. We share insights into Disney’s juggernaut adaptations of beloved animation to live-action films and get the inside scoop on facial capture in Avengers: Endgame. We talk light fields, AR/VR trends and starting a VFX studio; survey top-notch VFX and animation schools; profile Anne Kolbe, Warner Bros.’ celebrated Senior VP of Visual Effects, and Jen Underdahl, VFX pro of the Marvel Cinematic Universe; and we share the VES Section happenings in Washington State – and much more!

Thank you for being a part of the global VFX Voice community. We’re proud to be the definitive authority on all things VFX. Got an idea for a future story? Contact us at publisher@vfxvoice.com. Also, be sure to check us out online at VFXVoice.com for exclusive stories between print issues, and get timely updates by following us on Twitter at @VFXSociety.

Mike Chambers, Chair, VES Board of Directors

Eric Roth, VES Executive Director

2 • VFXVOICE.COM FALL 2019



[ CONTENTS ]

FEATURES
8 FILM: STARTING A VFX STUDIO – Four new VFX studios share their startup stories.
14 TECH & TOOLS: FACIAL CAPTURE – ILM/Disney’s Anyma system bows in Avengers: Endgame.
18 TV: STRANGER THINGS, SEASON 3 – Digital gains traction on Netflix’s hit sci-fi series.
24 TECH & TOOLS: LIGHT FIELDS – How light fields might impact VFX and filmmaking.
30 COVER: THE ADDAMS FAMILY – Behind the scenes of Cinesite’s latest animated showcase.
36 PROFILE: JEN UNDERDAHL – Master builder of Marvel’s franchise productions.
40 FILM: AD ASTRA – How effects created moonscape and action in the desert.
46 PROFILE: ANNE KOLBE – Striving to create a rich visual legacy for Warner Bros.
50 FOCUS: VFX FILM SCHOOLS – An in-depth survey of leading VFX and animation schools.
60 FILM: TRANSLATING DISNEY – Taking 2D animated classics into the photoreal 3D world.
66 FILM: FORD V FERRARI – Utilizing digital to enhance a fast-moving period piece.
72 VR/AR TRENDS: CYCLES – A home has a life of its own in Disney’s first-ever VR film.
76 FILM AND TV: FALL VFX – A breakdown of Fall’s top VFX films and TV shows.
82 TV: AMERICAN GODS, SEASON 2 – Mesmerizing visuals elevate the Starz/Amazon fantasy.
86 TV: CHERNOBYL – Effects help recreate a horrific event for HBO series.
88 VFX VAULT: APOCALYPSE NOW – Still the gold standard for massive practical explosions.

DEPARTMENTS
2 EXECUTIVE NOTE
94 VES SECTION – WASHINGTON
95 VES NEWS
96 FINAL FRAME – THE ADDAMS FAMILY

ON THE COVER: The Addams Family (Image copyright © 2019 MGM. The Addams Family™ Tee and Charles Addams Foundation. All rights reserved.)




FALL 2019 • VOL. 3, NO. 4

MULTIPLE WINNER OF THE 2018 FOLIO: EDDIE & OZZIE AWARDS

VFXVOICE

Visit us online at vfxvoice.com

PUBLISHER Jim McCullaugh publisher@vfxvoice.com

EDITOR Ed Ochs editor@vfxvoice.com
CREATIVE Alpanian Design Group alan@alpanian.com
ADVERTISING VFXVoiceAds@gmail.com
SUPERVISOR Nancy Ward
CONTRIBUTING WRITERS Ian Failes, Naomi Goldman, Trevor Hogg, Kevin H. Martin, Chris McGowan, Chris McKittrick, Barbara Robertson
ADVISORY COMMITTEE Rob Bredow; Mike Chambers; Neil Corbould; Debbie Denise; Paul Franklin; David Johnson; Jim Morris, VES; Dennis Muren, ASC, VES; Sam Nicholson, ASC; Eric Roth

VISUAL EFFECTS SOCIETY
Eric Roth, Executive Director

VES BOARD OF DIRECTORS
OFFICERS Mike Chambers, Chair; Jeffrey A. Okun, VES, 1st Vice Chair; Lisa Cooke, 2nd Vice Chair; Brooke Lyndon-Stanford, Treasurer; Rita Cahill, Secretary
DIRECTORS Brooke Breton; Kathryn Brillhart; Colin Campbell; Bob Coleman; Dayne Cowan; Kim Davidson; Rose Duignan; Richard Edlund, VES; Bryan Grill; Dennis Hoffman; Pam Hogarth; Jeff Kleiser; Suresh Kondareddy; Kim Lavery, VES; Tim McGovern; Emma Clifton Perry; Scott Ross; Jim Rygiel; Tim Sassoon; Lisa Sepp-Wilson; Katie Stetson; David Tanaka; Richard Winn Taylor II, VES; Cat Thelia; Joe Weidenbach
ALTERNATES Andrew Bly, Gavin Graham, Charlie Iturriaga, Andres Martinez, Dan Schrecker

Tom Atkin, Founder
Allen Battino, VES Logo Design

Visual Effects Society
5805 Sepulveda Blvd., Suite 620
Sherman Oaks, CA 91411
Phone: (818) 981-7861
visualeffectssociety.com

VES STAFF
Nancy Ward, Program & Development Director
Chris McKittrick, Director of Operations
Ben Schneider, Director of Membership Services
Jeff Casper, Manager of Media & Graphics
Colleen Kelly, Office Manager
Callie C. Miller, Global Coordinator
Jennifer Cabrera, Administrative Assistant
P.J. Schumacher, Controller
Naomi Goldman, Public Relations

Follow us on social media

VFX Voice is published quarterly by the Visual Effects Society.
Subscriptions: U.S. $50; Canada/Mexico $60; all other countries $70 a year. See vfxvoice.com
Advertising: Rate card upon request from publisher@vfxvoice.com or advertising@vfxvoice.com
Comments: Write us at comments@vfxvoice.com
Postmaster: Send address change to VES, 5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411.
Copyright © 2019 The Visual Effects Society. Printed in the U.S.A.




FILM

HOW TO START A VFX STUDIO By IAN FAILES

As with any business, getting a new visual effects studio off the ground can be a monumental effort. The tasks of hiring talent, setting up pipelines, bidding for work, and managing the intricacies of VFX production are not trivial ones. That said, the wider availability of accessible VFX software and collaboration tools has arguably made getting started as a studio and delivering shots easier than ever before. VFX Voice asked the founders of four relatively new visual effects studios – CVD VFX, Mavericks VFX, Outpost VFX and Future Associate – how they began as independent operations, how they took up often unexpected opportunities, what hurdles they had to overcome to get their studios going, and what advice they had for others who might be looking to start their own VFX company.

GOING OUT ALONE

TOP: Inside Outpost VFX’s Bournemouth studio. (Photo courtesy of Outpost VFX)


The one constant among the studios VFX Voice spoke to is that they all began as startups by visual effects supervisors who had already gained experience elsewhere.

Brendan Taylor, for example, started Mavericks VFX in Toronto in 2011 after working for several years with MR. X. The TV show Transporter: The Series was intended to be Taylor’s entry into independent VFX supervising, but at the same time he also engaged a small group of freelance compositors as an in-house VFX team. Ultimately, that arrangement segued into a fully-fledged visual effects studio. The Visual Effects Supervisor admits that the early days of Mavericks were some of the hardest, since it was about keeping work coming in and running the day-to-day life of a company. Things certainly started small: “We sublet a space to begin with, and one of the first things I got was a projector and Foundry’s Hiero, because I knew we had to review shots and couldn’t just do that on a computer monitor.”

As projects rolled in, such as The Hundred-Foot Journey, The Light Between Oceans, A Dog’s Purpose and several television shows, Mavericks VFX grew. This, however, necessitated expanding office space and bringing on artists as full-time employees, with all the related expenses. Taylor eventually sought a Canadian small business loan around 2017. “That’s when I realized I was all-in,” says Taylor. “But it meant, for us, we could take on bigger jobs.”

Looking back at how Mavericks has evolved, Taylor notes that its growth was incredibly organic. He knew the visual effects industry well, but planning for change has always been challenging. In the early days, Mavericks spent only minimal time crafting a VFX pipeline, something Taylor says he wishes they had invested more in. “When we were just compositing, it was fine,” states Taylor. “The real challenge came when we introduced 3D. Any inefficiencies we had with 30 artists were going to be doubled with 60 artists. And one of my senior artists shared with me that experienced VFX artists are very concerned about the pipeline of the company they go to. If it is an inefficient pipeline, and they have to work harder at doing small tasks, they can’t spend as much time working on their craft, and they hate it.”

Asked what his main piece of advice would be to others thinking of starting a VFX studio, Taylor passes on advice he received from another visual effects industry member – that he should straight away engage a great team of lawyers. “It has really helped me out,” says Taylor. “When you get into the legal and business affairs of DreamWorks and Disney, it can be daunting and scary. We could have saved a bit of money by going with a smaller law firm, but there’s something to be said for the letterhead. So when Warner Bros. or Disney gets a letter from our lawyers, they know them. I don’t know anything about law, and I don’t presume to, I just want to do great work. But all these things have complicated contracts which you need to understand.”

The lawyers help, too, with production issues, such as tax credits. In Toronto, Mavericks VFX has the benefit of access to Ontario’s visual effects-related tax credits. While Taylor says he stayed in the city for family reasons, he can see the benefit of starting up operations in other Canadian cities such as Vancouver or Montreal to take advantage of credits there. Still, he notes, one of the benefits of being a certain size and in one location is having a closer relationship with all the artists at the studio. “Once you get past say 50 people, the role of the management gets different, and the role of the founder gets very different,” suggests Taylor. “A lot of people came on here for the family vibe that the company has. I’m worried if we do get bigger or bring on a general manager or something like that, it could lose that vibe. But we’ll see what happens.”

TOP: Outpost’s U.K. facility has a 150-seat capacity, while the new Montreal studio has an 84-seat capacity. (Photo courtesy of Outpost VFX) MIDDLE: Duncan McWilliam, founder and CEO of Outpost VFX. (Photo courtesy of Outpost VFX) BOTTOM: Outpost’s CG train from the television series The ABC Murders. (Image copyright © 2018 BBC)






‘UNINTENDED GROWTH’

TOP: CVD VFX founder Chris Van Dyck and crew members soak up the new mural painted at the CVD office. (Photo courtesy of CVD VFX)
BOTTOM: CVD VFX’s office space in Vancouver. ‘CVD’ originally represented the founder’s name, but within the company CVD now stands for ‘connect, visualize and deliver.’ (Photo courtesy of CVD VFX)

Fellow Canadian Chris Van Dyck is another Visual Effects Supervisor who had built up experience at studios like ILM, Frantic Films, Method Studios and others, as well as working as a VFX educator, before starting his own outfit, CVD VFX, in 2015. “We were having our second child at the time, and I thought this was a great time to stay at home and work from home,” Van Dyck recalls. “I started putting feelers out and before I knew it I had a Canadian feature that was about 40 shots.” Van Dyck brought on colleagues and some students he had been teaching, and soon had a group of five people working together.

Projects quickly piled up, and Van Dyck did not have any dedicated space. “I had to react to suddenly having three projects, so I went to the school I was teaching at, Lost Boys Studios, and they were moving at the exact same time I was hoping to set up. They had about half of the floor upstairs that wasn’t being used, so we had that.”

“I started calling it ‘unintended growth,’” adds Van Dyck. “But with each opportunity that kept coming, I responded with, ‘Well, we can bring on one more person,’ or ‘We can just get a few more computers.’ And now we’re over 25 people.”

Van Dyck had prior experience, too, having started up a VFX studio back in 2008. At that time, he says, it was approached as more of a fun venture, “but it was never enough to really sustain the life of having a family and really pushing the envelope. I learned a lot in those two years about being a good producer. So, with CVD, I made sure that we always had positive cashflow.”

As CVD started working on more projects (such as Dark Matter, The Magicians and A Series of Unfortunate Events), Van Dyck says some of the obvious challenges became upgrading things like servers, as well as ensuring he hired the right people. One suggestion he has for other entrepreneurs is to not hire anyone who ‘wears two hats’ at the studio. “I wish I had also hired more production people sooner,” admits Van Dyck. “I kept trying to do that work myself, but it’s definitely something I’d recommend to people, to get a really solid VFX coordinator right away.”

Looking to the future, Van Dyck says he is considering whether to expand his VFX operations, and whether his experience in training could lead to a facility focused on education. “I taught for eight years, so I’d love to spin off an aspect of my business that would have an education angle.”

TOP: Office evolution: CVD’s original workspace, which was a small area sublet from Lost Boys Studios in Vancouver. (Photo courtesy of CVD VFX)
MIDDLE: A shot from A Series of Unfortunate Events, a series worked on by CVD VFX. (Image copyright © 2018 Netflix.)
BOTTOM: Before and after frames from CVD VFX’s work for 12 Strong. (Image copyright © 2018 Amazon Studios.)

STARTING A U.K. STUDIO, BUT NOT IN SOHO

In the U.K., a majority of visual effects studios are in and around the Soho district of London. But back in 2012, one VFX artist looked to locate a new studio away from the city. “I saw an opportunity to support TV and indie film productions by locating our studio in a less expensive location, with close proximity to the two universities of Bournemouth,” outlines Outpost VFX founder and CEO Duncan McWilliam. “The initial ‘unique selling proposition’ was to try to get more of our clients’ budget on screen and less in the pockets of landlords, while also working closely with the next generation of VFX artists. Equally, our very different location to London provided an alternative to draw experienced industry veterans to Bournemouth for a different kind of lifestyle while still working on high-end projects.”

Adding to the point of difference of being outside London, Outpost also offers their employees what they are calling ‘Life Time’ and ‘Extra Time.’ “These are unlimited holidays and payment for anti-social working hours,” explains McWilliam. “They reflect the huge effort our team puts in to getting the work done, and while work can mess with your life plans, it is good to know your days off are not being counted and you can flexibly take that time back.”

Outpost’s projects for television and film include Nocturnal Animals, 47 Meters Down, Jason Bourne, Watchmen, Black Mirror and Catherine the Great. In recent times, the studio has also expanded to offices in Montreal and Singapore, and now has more than 140 employees. That has meant, along with the U.K. operation, a required understanding of tax incentives and subsidies in different parts of the world. “One of our non-executive board members, Tony Camilleri, was a prominent CFO in the VFX industry and as such we had a good idea of how to structure things, hence choosing Montreal as an Outpost location,” notes McWilliam. “Since then we have had a huge amount of support from Montreal International and Invest Quebec, the government bodies for assisting overseas companies in setting up in Montreal. Equally, our lawyers and our accountants were instrumental in providing the advice we needed to ensure we are operating correctly to be incentive compliant.”

TOP: Outpost’s U.K. facility has a 150-seat capacity, while the new Montreal studio has an 84-seat capacity. (Photo courtesy of Outpost VFX)
MIDDLE: Duncan McWilliam, founder and CEO of Outpost VFX. (Photo courtesy of Outpost VFX)
BOTTOM: Outpost’s CG train from the television series The ABC Murders. (Image copyright © 2018 BBC)

THE BIRTH OF A BRAND-NEW STUDIO

Sydney-based VFX studio Future Associate is a brand-new operation, launched in 2018 by Visual Effects Supervisor Lindsay Adams. Having worked for several studios around the world, Adams found himself starting his own after being given the opportunity to work on the pilot for the HBO series Watchmen that his then-employer was not interested in taking on. Adams quit that job and worked on a shoot for Watchmen in the U.S. He returned to Sydney and crewed up for the shots, later being awarded more of the show. “While we were working on Watchmen,” says Adams, “it was announced that Method Sydney was closing its doors and laying off 200 staff. Since then we have had access to more local talent than we could have hoped for, so as the work continued to present itself I have kept taking it on, knowing we have the artists in Sydney who need the work.”

Although Future Associate has had a steady flow of projects, Adams notes that there have been several hurdles to starting his own operation, including security and studio audits. “I had to become an expert in what you can and can’t do if you want work from major studios,” Adams says. “In the past we have had teams of people to manage this for us, but with my own studio I’ve had to learn every detail and hit the briefs with short-term staff to meet our clients’ security requirements.”

Other aspects Adams has had to consider are space (the studio quickly grew to 20, which meant that a new location was needed), understanding commercial leases and payment terms, and the building up of a technology pipeline. “We don’t have IT staff. We have a group of talented artists and hard-working coordinators that make it happen. We run the cables ourselves, we learned how to be systems administrators, we build servers, we Google the crap out of everything.”

For Adams, the fast growth has actually been surprising. “I thought being a new and small company would be a tough sell, but it turns out small VFX companies are precisely what a lot of customers are looking for. There’s never been a better time to start a studio. I wish more artists out there had the courage to do it too. There is a huge opportunity right now for artists who have a vision of what a next generation studio could be. The VFX community would be in a much better position if artists ran their own studios rather than relying on big corporations who don’t quite understand the intricacies of the business.”

TOP: A Mavericks VFX artist at work. (Photo courtesy of Mavericks VFX)
MIDDLE: The Future Associate team after inspecting their under-construction office space in Sydney. (Photo courtesy of Future Associate)
BOTTOM LEFT: The original plate for the fire scene in the first episode of season 3 of The Handmaid’s Tale. (Image copyright © 2019 Hulu)
BOTTOM RIGHT: The final shot after compositing by Mavericks. (Image copyright © 2019 Hulu)



TECH & TOOLS

ILM AND DISNEY TEAM UP FOR CANNY FACIAL CAPTURES IN AVENGERS: ENDGAME By BARBARA ROBERTSON

Actors performing while balancing head helmets fitted with cameras pointed at their faces may be an odd sight, but it’s a common one on film productions when those actors play characters that become CG. The motion-capture gear allows the “CG characters” to interact with live-action actors on set, directors to direct them, the director of photography to light them, and camera operators to frame them. It has helped make the integration of CG characters into live-action films seamless, and it’s pushed forward the path toward creating believable digital humans. But it’s still awkward. Moving one step closer to the point when actors don’t have to wear silly pajamas and head cameras, ILM and Disney Research have worked together to make a markerless, high-fidelity, performance-capture system production-worthy. First prototyped in 2015 by Disney Research, the system they call Anyma has evolved, but had not been used for a film until ILM implemented it to create “Smart Hulk” in Avengers: Endgame. “I think it’s a revolution,” says Thabo Beeler, principal research scientist at Disney Research. “You can capture facial performances and preserve all the skin sliding with witness cameras. It gives the actor freedom to move around.”

“It’s the next evolution in getting closer to mixing digital and actor performances,” says Russell Earl, Visual Effects Supervisor at ILM on Avengers: Endgame. “The fidelity is pretty amazing.” You can see the result on CG Hulk’s face as the character looks and performs with the sensitivity and humor of actor Mark Ruffalo. ILM created 80% of the Smart Hulk shots, according to the studio, using the Anyma technology for the first time (Framestore did the hangar sequence for the time-travel testing). Doing so meant developing and modifying in-house tools during postproduction to accommodate Anyma’s high-fidelity data. “It was a great leap of faith,” Earl says, “but when we first saw the data, it was ‘Oh my gosh, this is great.’”

ANATOMICALLY-INSPIRED MODEL

All images copyright © 2019 Marvel Studios TOP: ILM joined with Disney Research to use the new Anyma facial-capture system for Hulk in Avengers: Endgame. BOTTOM: Anyma is a markerless tracker that’s able to capture pore-level facial detail.


To set the stage for Anyma, the team first did a facial scan of Ruffalo using Disney Research’s Medusa system to measure and describe his face. Unlike other performance-capture systems, Anyma needs only about 20 shapes – not FACS-based phonemes and expressions that try to activate muscles but, instead, a few extreme positions – for example, everything lifted, or everything compressed, or both eyebrows at once. The system used those shapes from Medusa to automatically build a digital puppet that could be driven by Anyma. That is, the scanned data was integrated and fit to an underlying skull. The system fit the skull, jaw and eyes to Ruffalo’s Medusa scan, based on forensic measurements of a typical male his age with his BMI [Body Mass Index], to create a digital puppet. “It doesn’t have to be anatomically correct,” Beeler says. “It just provides constraints for later on. The specialty of this puppet is that it has a notion of the underlying skull and the skin thickness. The skin thickness measurements indicate where the skull could be. Same for the jaw and other bony structures. Looking at just the skin is limiting – it doesn’t do well in performance capture.”

That insight – thinking of the face as an anatomical structure rather than a shell – is one part of Anyma’s success. “The secret sauce is the anatomically-inspired model,” Beeler says. “Not anatomical muscle simulation. Anyma has a notion of the underlying bone structure and the tissue between the skin and bone. It’s data driven.”
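Beeler’s point about skin thickness can be illustrated with a small, hypothetical sketch (not the actual Anyma code): given a skin vertex, its outward normal and a local soft-tissue thickness estimate, the underlying bone surface is constrained to lie roughly at the vertex pushed inward along that normal.

```python
import numpy as np

def bone_constraints(skin_verts, normals, thickness):
    """Estimate where the underlying bone surface could be: each skin
    vertex offset inward along its (unit) normal by the local
    soft-tissue thickness. skin_verts, normals: (N, 3) arrays;
    thickness: (N,). Returns (N, 3) constraint points."""
    unit = normals / np.linalg.norm(normals, axis=1, keepdims=True)
    return skin_verts - unit * thickness[:, None]

# Toy example: two vertices on a forehead-like surface
skin = np.array([[0.0, 0.0, 1.0], [0.0, 1.0, 1.0]])
n = np.array([[0.0, 0.0, 2.0], [0.0, 0.0, 1.0]])  # outward normals
t = np.array([0.4, 0.6])                          # tissue thickness
print(bone_constraints(skin, n, t))
# → [[0. 0. 0.6], [0. 1. 0.4]]
```

The constraint points don’t need to be anatomically exact – as Beeler notes, they only need to bound where the skull “could be” so the solver stays plausible.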

“I think it’s a revolution. You can capture facial performances and preserve all the skin sliding with witness cameras. It gives the actor freedom to move around.” —Thabo Beeler, Principal Research Scientist, Disney Research

TOP: Animators were able to maintain Mark Ruffalo’s performance with more accuracy than on previous films. BOTTOM: ILM was responsible for 80% of Hulk shots in Avengers: Endgame.

DEFORMATION AND TRANSFORMATION

Part two of the secret sauce is that the researchers separated deformation and transformation – that is, they separated motion from deformation. They did this by dividing the face into small patches of approximately one-by-one centimeter. “If you look at a small patch on a face, it doesn’t do all that much,” Beeler says. “It stretches a bit (and) forms wrinkles and patterns. As a whole, the face can do much more; a small patch is not that complex. So Anyma chops up the face into a thousand small patches. Then, for each of those patches it builds a deformation space. And it learns how that patch can be formed from the input shapes.” The problem with that approach is that the resulting digital face could do anything – rip apart, blow up. To avoid that, the second thing Anyma learns is the relationship of the tiny skin patches to the underlying bone structure – in other words, how the eyebrow skin, for example, relates to the bone. “If you take those two things – the patches and the relationship of the bone to the skin surface – you can put them together into a face that is realistic,” Beeler says. “You can create a digital model that preserves the anatomical plausibility of a face. You can also meaningfully extrapolate what a particular face can do from the shapes you feed into the system.”

Although the team had captured Ruffalo’s performance using head cameras, he could have been photographed with a single witness camera. The system creates the digital facial shape, synthesizes an image from that estimate, and optimizes the difference between it and the actor’s face using optical flow until there is a good convergence. The result is an accurate per-frame mesh of high-resolution geometry, a frame-by-frame reproduction of Ruffalo’s facial performance. “But it is not hooked up in a way that an animator can play with it,” Beeler says. “We know where each point moves over time, but it’s difficult to put that onto a rig.” That’s where ILM comes into the picture.

“It’s the next evolution in getting closer to mixing digital and actor performances. The fidelity is pretty amazing.” —Russell Earl, Visual Effects Supervisor, ILM

TOP: Questions like, “How red would the inside of Hulk’s mouth be?” had to be answered and ideated on. BOTTOM: The animators had to match the Ruffalo mesh created from Anyma to Hulk’s face.

APPLYING THE DATA

“The Anyma solve provided us with an accurate mesh of Mark Ruffalo’s face moving,” Earl says. “We had to take that mesh and apply it to Hulk’s face. We had to put the output of Anyma into a useful state. We knew we had to raise the bar to make sure the results were as accurate and highly detailed as we could get. So we rebuilt our entire system of in-house retargeting.”

In the past, the artists running the retargeting solvers might use corrective shapes to adjust the result of captured data transferred from a digital model of an actor’s face onto the targeted CG character’s face. “But when you correct subjectively, you get to the point where a character like Smart Hulk doesn’t look like him anymore,” Earl says. “You can get off model quickly. What we were trying to do with the new retargeting tool was get as close as we could to a clean solve.”

Modelers had built Smart Hulk after looking at the scan data for Ruffalo, the shapes generated by Medusa for the facial library, and artwork for Smart Hulk. The retargeting system worked to apply data to that model in somewhat the same way as Anyma. “It looks at the mesh, applies it to Smart Hulk’s facial library broken into shapes similar to Ruffalo’s, and tries to hit the same deformations as Ruffalo,” Earl says. “It’s a learning system. It looks at the Hulk face library, at what curves it needs to drive those shapes.”

The goal for the retargeting system and the artists using it is to produce Maya scene files for animators to use. “We get controls at a gross level and then more finite detail,” Earl says. “That finite detail is what gives us a better result. In the past, because we didn’t have that finite detail, we had to go outside and drive a change with shapes. We’d lose nuance and detail. Now, we keep as much solve data as we can every step of the way. We don’t get rid of it; we creatively adjust it. We can dial it in and out. It is the perfect combination.” In addition, animators had deformers that could add subtleties to the facial performance – for sticky lips, a little more skin slide, pucker, and so forth.

“It was late in the show when we had all the pieces together,” Earl says. “We tried to use the system as we made the changes, and ended up solving shots multiple times. But by the time we got the last shots, everything was in place. We had a system built and running smoothly. We could do a thousand shots. And the show came to an end.”

Even though Anyma did its calculations in parallel – 1,000 cores could solve 1,000 frames – at first the solves needed to run overnight. Initially, retargeting took days – the crew was rebuilding the system during production – but Anyma got faster, retargeting took only a day, and the system worked. It was worth it. “When we first saw the result we thought, ‘OK, we’ve made the right choice,’” Earl says. “We knew Smart Hulk could be onscreen in close-ups, in daylight, and in scenes with other actors, and he would have to hold up to that scrutiny. The key thing for us was making sure we could capture Mark Ruffalo’s facial performance with his body – that it didn’t require a separate ADR later. We had him on set with the other actors and captured that natural performance where he’s there in the moment.” Earl adds, “I think we’ll get to the point where actors don’t have to wear silly pajamas and head cameras. It’s coming.”

“I think we’ll get to the point where actors don’t have to wear silly pajamas and head cameras. It’s coming.” —Russell Earl, Visual Effects Supervisor, ILM

TOP: Hulk’s glasses proved especially challenging for the animation team.
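The “clean solve” Earl describes – fitting facial-library shape weights so the character mesh reproduces the captured deformation, rather than hand-correcting toward it – can be sketched, very loosely, as a least-squares problem. This is a hypothetical illustration of the general idea, not ILM’s solver:

```python
import numpy as np

def solve_shape_weights(blendshape_deltas, captured_delta):
    """Least-squares fit of facial-library weights.
    blendshape_deltas: (n_shapes, n_verts*3) displacement of each
    library shape from the neutral pose; captured_delta: (n_verts*3,)
    displacement of the captured mesh. Returns per-shape weights
    that best reproduce the captured deformation."""
    w, *_ = np.linalg.lstsq(blendshape_deltas.T, captured_delta, rcond=None)
    return w

# Toy library: 2 shapes over 2 vertices (flattened to 6 coordinates)
deltas = np.array([[1.0, 0.0, 0.0, 0.0, 0.0, 0.0],
                   [0.0, 0.0, 0.0, 0.0, 1.0, 0.0]])
captured = 0.5 * deltas[0] + 0.25 * deltas[1]
print(solve_shape_weights(deltas, captured))  # → [0.5, 0.25]
```

Keeping the solve data and “creatively adjusting” the weights, rather than replacing them with subjective corrective shapes, is what lets the result stay on model.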





TV

CHASING RUSSIANS AND MONSTERS IN STRANGER THINGS SEASON 3 By KEVIN H. MARTIN

All images copyright © 2019 Netflix


In its first two highly popular seasons, the Netflix series Stranger Things has managed to brilliantly reinterpret familiar tropes, ranging from outcast schoolkids pulling off unlikely triumphs to government conspiracies carried out in plain sight. Taking inspiration from trademark Spielbergian films (Poltergeist and E.T., to name but two examples), plus the works of novelist Stephen King and genre-meister John Carpenter among many other works of horror, the writer/director team of the Duffer Brothers is the singular driving force behind the series. Taking place in the summer of 1985, season 3 finds the youthful leads again dealing with an otherworldly menace, while also logging significant time at the town’s Starcourt Mall.

The Brothers were initially enamored of in-camera solutions, but that view expanded to encompass a greater reliance on postproduction as they went through a first-season learning curve. “I wasn’t here that first year, but the plan initially had been to feature nearly all practical effects,” says Senior VFX Supervisor Paul Graff, a four-time Emmy winner for The Triangle, John Adams, Boardwalk Empire and Black Sails, who oversaw work on both seasons 2 and 3. “As it turned out, the Brothers became less enthusiastic about everything achieved going that route, and the digital approach gained more and more traction. We’ve continued along these lines, but I still find it important to have something practical on-set during live-action shooting that gets the physical idea of the thing across.” Graff goes on to observe, “The Duffers do reference other shows and other movies, but this is a unique little tale. Solutions never just come off the shelf for them. The story usually stays in flux until a few weeks before shooting.
In my experience, it is very wise to approach the VFX sections with a maximum of flexibility, because one never knows how a shooting day is going to play out, taking into account inspirations that turn up on the day as we shoot with the kids.”

While Netflix affords Stranger Things a healthy budget, Graff admits the effects are done at a price point below the sky’s-the-limit Game of Thrones level. “Making an effort to remain frugal is often just being smart with your up-front planning, such as working out details with storyboards. Then there’s the matter of conveying the presence of something not actually there to those out onstage. If you have the money to do very tech-heavy lifting, you can project a pre-rendered creature into the camera so the director and operators can see it, but we didn’t have the time to previs and generate in advance like that. Last season we made a 3D-printed stand-in prop for our pollywog creature, which was used both as a lighting reference and an aid for the child actors. But this season, we’ve got something the size of a T-Rex invading a shopping mall, so there was no way that 3D-printing approach was going to work for us this time out,” he laughs.

TOP LEFT AND TOP RIGHT: To provide accurate lighting data, stuntman Ken Barefield wore a large silver helmet while standing in for a rampaging incarnation of the Demogorgon. Rodeo FX created the creature animation. BOTTOM: The Russian device fires up. Returning Senior VFX Supervisor Paul Graff called upon multiple vendors for the new season, with Spin VFX and Scanline dividing the work involving the massive Soviet-built interdimensional tunneling device.

In order to visualize and puppeteer the monster’s head, special effects initially constructed a zeppelin-shaped creature shell. “But it weighed a hundred pounds, so it wasn’t very practical having something like that way up in the air,” Graff recalls. “So I bought the lightest large object I could think of – a blow-up beach ball. Taping that to the end of a 20-foot boom pole let us puppeteer the ‘head’ enough to provide an eyeline for the actors while also giving camera operators a shot at framing for and tracking the creature’s movements.”

Planning sessions concerning VFX – especially the monsters – involved production designer Chris Trujillo, senior concept illustrator Michael Maher and the Duffers sitting down with Graff. “Last season’s Shadow Monster was kind of ethereal, a particle creature, [manifested] by the Mind Flayer,” explains Graff. “Having now identified Eleven as his opponent, he returns this season in a much more corporeal way to deal with her. Everybody really liked the Duffers’ idea of leveraging off 1982’s The Thing, so that helped form a backbone for the whole season. The team’s initial deep exploration was all about visualizing this incarnation of the creature. VFX-wise, it allows us to play with some real weight and, consequently, a different new feel for our creature animation, including specularity and moisture.”

The big menace from season 1, the Demogorgon, is also back, but seen in a new stage of its growth. “Last season featured a different, earlier phase of life for the creature.
We called them stages one through four, from the pollywog to the adolescent, stage-four Demodog. This time out, we indicate the final evolutionary step from season 1’s Demogorgon to season 2’s Demodog. Having all these new aspects to the creatures meant it was going to be a really nice playground for us and our vendors. One of the biggest parts of this job is picking the right collaborators, because not every house is good at all things. We got very lucky this season, having Rodeo FX do the creature work.”

On-set supervisory work was most often about going with the flow rather than saying “no.” “There was definitely a very attentive, creative and collaborative approach to shooting, and I think the show benefited from that,” states Graff. “Lack of energy and lack of interactivity can kill a scene, or work against creating the proper emotional response, so it can be better having to deal with a messy plate than to have a perfectly lit one that is too tame. Stranger Things has a lot of trademark visual elements, one of which is that when shit hits the fan and things get crazy, there are a lot of flickering lights. Because of the random aspect that comes with all that flickering, you can’t do multiple passes and get them to match, so we tried to get as much as we could in one, which meant getting our lighting reference during the same take.”

Graff’s solution: “I came up with the idea of having a stunt guy wearing this giant silver ball helmet while standing in for an incarnation of the monster that was 6 or 7 feet tall. This helmet let us see the variance in brightness of the lights from frame to frame. I told assistant stunt coordinator Ken Barefield that I needed him to really be that monster, conveying all that evil energy. ‘I need you to roar – and you can’t let yourself be intimidated by the fact you’re wearing a ridiculous red spandex suit with a giant silver ball helmet.’ He told me, ‘Don’t worry, Paul, I’ve got this – I’ll deliver for you’ – and he sure did. The lights flicker and go out, and when they come back on, this guy is screaming at the top of his lungs, charging like a bull down the hallway. It looked completely absurd, and yet at the same time it was really cool – one of the most amazing moments of the year for me.”

The other major effects thrust revolved around a massive underground machine constructed by Russian agents, operating right under the feet of the unaware locals. Utilizing hard-surface modeling and elaborate lighting effects, the work was divided between Spin VFX (based out of Toronto, and now also in Atlanta, where the series is shot) and Munich’s Scanline (Crafty Apes, Alchemy 24, Vitality Visual Effects and RISE also contributed to season 3’s workload). Described by Graff as a cross between a jet-engine turbine and a ray gun, the device attempts to force open a new rift to the Upside Down. While production’s art department could have built the Russian machine, the turbine would not have been functional. “So we suggested they build only a skeleton,” notes Graff, “and let us take it the rest of the way with our vendors. That frame was enough to provide a sense of what the shape would be if it were fully built, but you could still see through it.”

Production shot practical light passes to provide an interactive element to serve as a basis for animation, which featured heavy electrical flicker and a powerful beam emitted by the machine. “We built a Christmas tree-like metal scaffold, spiked with little tracking dots and featuring large silver and gray tracking balls to reflect the lighting every few feet,” says Graff.
“That let us track the camera motion and changing light without slowing down or otherwise impacting the shoot. Then, for the end of the sequence when the lights really get going, special effects built this cylindrical rotating rig with lights in it and mirrors inside and out. This complicated device had a Viper [automobile] motor that could spin at any desired speed, which helped us capture really amazing plates with all this chaotic lighting.” The season climaxes with the episode “The Battle of Starcourt,” which gives evidence to the idea that in the 1980s, everybody, even monsters, went to the mall. “One thing that really saved us there was choosing very early on to address potential approaches to the big showdown with the monster,” Graff reveals. “The Duffers liked the T-Rex chasing the car in the first Jurassic Park, and in the pitch documents for the season, the monster was rumbling through the Hawkins Fourth of July parade.” Graff developed the idea of using the necessary animation development by Rodeo for pitching certain key shots to the Duffers. “I thought it would be a way to get a leg up on this,” he relates. “We could use these for testing approaches to the animation. It was like throwing a bunch of darts at the board – even if just one of these caught their eyes, it might stick and get written into the finale. We generated some nice grayscale views of the monster tearing through Hawkins, tossing cars around like they were footballs, with one crashing into the marquee of the town movie theater.”

“Everybody really liked the Duffers’ idea of leveraging off 1982’s The Thing, so that helped form a backbone for the whole season. The team’s initial deep exploration was all about visualizing this incarnation of the creature. VFX-wise, it allows us to play with some real weight and, consequently, a different new feel for our creature animation, including specularity and moisture.” —Paul Graff, Senior VFX Supervisor

OPPOSITE TOP TWO: Before and after. Mayhem is unleashed beneath town as the Russian plan to exploit the Upside Down gets turned topsy-turvy. Animated turbine and electricity effects feature prominently throughout, building on practical light passes shot by production. OPPOSITE BOTTOM TWO: Before and after. The town’s sheriff becomes engaged in a lifeand-death struggle below Hawkins. Only a skeletal version of the turbine machine was built on stage, with the rest created digitally. THIS PAGE TOP TWO: Before and after. Beach balls provided a lightweight, easily positionable reference for eyelines when shooting live-action that would feature a creature to be added in post.

FALL 2019 VFXVOICE.COM • 21


TV

to previs and generate in advance like that. Last season we made a 3D-printed stand-in prop for our pollywog creature, which was used both as a lighting reference and an aid for the child actors. But this season, we’ve got something the size of a T-Rex invading a shopping mall, so there was no way that 3D printing approach was going to work for us this time out,” he laughs. In order to visualize and puppeteer the monster’s head, special effects initially constructed a zeppelin-shaped creature shell. “But it weighed a hundred pounds, so it wasn’t very practical having something like that way up in the air,” Graff recalls. “So I bought the lightest large object I could think of – a blow-up beach ball. Taping that to the end of a 20-foot boom pole let us puppeteer the ‘head’ enough to provide an eyeline for the actors while also giving camera operators a shot at framing for and tracking the creature’s movements.” Planning sessions concerning VFX – especially the monsters – involved production designer Chris Trujillo, senior concept illustrator Michael Maher and the Duffers sitting down with Graff. “Last season’s Shadow Monster was kind of ethereal, a particle creature, [manifested] by the Mind Flayer,” explains Graff. “Having now identified Eleven as his opponent, he returns this season in a much more corporeal way to deal with her. Everybody really liked the Duffers’ idea of leveraging off 1982’s The Thing, so that helped form a backbone for the whole season. The team’s initial deep exploration was all about visualizing this incarnation of the creature. VFX-wise, it allows us to play with some real weight and, consequently, a different new feel for our creature animation, including specularity and moisture.” The big menace from season 1, the Demogorgon, is also back, but seen in a new stage of its growth. “Last season featured a different, earlier phase of life for the creature. 
We called them stages one through four, from the pollywog to the adolescent, stage-four Demodog. This time out, we indicate the final evolutionary step from season 1’s Demogorgon to season 2’s Demodog. Having all these new aspects to the creatures meant it was going to be a really nice playground for us and our vendors. One of the biggest parts of this job is picking the right collaborators, because not every house is good at all things. We got very lucky this season, having Rodeo FX do the creature work.” On-set supervisory work was most often about going with the flow rather than saying “no.” “There was definitely a very attentive, creative and collaborative approach to shooting, and I think the show benefited from that,” states Graff. “Lack of energy and lack of interactivity can kill a scene, or work against creating the proper emotional response, so it can be better having to deal with a messy plate than to have a perfectly lit one that is too tame. Stranger Things has a lot of trademark visual elements, one of which is that when shit hits the fan and things get crazy, there are a lot of flickering lights. Because of the random aspect that comes with all that flickering, you can’t do multiple passes and get them to match, so we tried to get as much as we could in one, which meant getting our lighting reference during the same take.” Graff’s solution: “I came up with the idea of having a stunt guy wearing this giant silver ball helmet while standing in for an incarnation of the monster that was 6 or 7 feet tall. This helmet let us see the variance in brightness of the lights from frame to frame. I told assistant stunt coordinator Ken Barefield that I needed him to really

20 • VFXVOICE.COM FALL 2019

be that monster, conveying all that evil energy. ‘I need you to roar – and you can’t let yourself be intimidated by the fact you’re wearing a ridiculous red spandex suit with a giant silver ball helmet.’ He told me, ‘Don’t worry, Paul, I’ve got this – I’ll deliver for you’ – and then he sure did. The lights flicker and go out, and when they come back on, this guy is screaming at the top of his lungs, charging like a bull down the hallway. It looked completely absurd, and yet at the same time, it was really cool, one of the most amazing moments of the year for me.” The other major effects thrust revolved around a massive underground machine constructed by Russian agents, operating right under the feet of the unaware locals. Utilizing hard-surface modeling and elaborate lighting effects, the work was divided between Spin VFX (based out of Toronto, now also in Atlanta, where the series is shot) and Munich’s Scanline (Crafty Apes, Alchemy 24, Vitality Visual Effects and RISE also contributed to season 3’s workload). Described by Graff as a cross between a jet-engine turbine and a ray gun, the device attempts to force open a new rift to the Upside Down. While production’s art department could have built the Russian machine, the turbine would not have been functional. “So we suggested they build only a skeleton,” notes Graff, “and let us take it the rest of the way with our vendors. That frame was enough to provide a sense of what the shape would be if it were fully built, but you could still see through it.” Production shot practical light passes to provide an interactive element to serve as a basis for animation, which featured heavy electrical flicker and a powerful beam emitted by the machine. “We built a Christmas tree-like metal scaffold, spiked with little tracking dots and featuring large silver and gray tracking balls to reflect the lighting every few feet,” says Graff. 
“That let us track the camera motion and changing light without slowing down or otherwise impacting the shoot. Then, for the end of the sequence when the lights really get going, special effects built this cylindrical rotating rig with lights in it and mirrors inside and out. This complicated device had a Viper [automobile] motor that could spin at any desired speed, which helped us capture really amazing plates with all this chaotic lighting.” The season climaxes with the episode “The Battle of Starcourt,” which lends credence to the idea that in the 1980s, everybody, even monsters, went to the mall. “One thing that really saved us there was choosing very early on to address potential approaches to the big showdown with the monster,” Graff reveals. “The Duffers liked the T-Rex chasing the car in the first Jurassic Park, and in the pitch documents for the season, the monster was rumbling through the Hawkins Fourth of July parade.” Graff proposed using the necessary animation development by Rodeo to pitch certain key shots to the Duffers. “I thought it would be a way to get a leg up on this,” he relates. “We could use these for testing approaches to the animation. It was like throwing a bunch of darts at the board – even if just one of these caught their eyes, it might stick and get written into the finale. We generated some nice grayscale views of the monster tearing through Hawkins, tossing cars around like they were footballs, with one crashing into the marquee of the town movie theater.”

“Everybody really liked the Duffers’ idea of leveraging off 1982’s The Thing, so that helped form a backbone for the whole season. The team’s initial deep exploration was all about visualizing this incarnation of the creature. VFX-wise, it allows us to play with some real weight and, consequently, a different new feel for our creature animation, including specularity and moisture.” —Paul Graff, Senior VFX Supervisor

OPPOSITE TOP TWO: Before and after. Mayhem is unleashed beneath town as the Russian plan to exploit the Upside Down gets turned topsy-turvy. Animated turbine and electricity effects feature prominently throughout, building on practical light passes shot by production. OPPOSITE BOTTOM TWO: Before and after. The town’s sheriff becomes engaged in a life-and-death struggle below Hawkins. Only a skeletal version of the turbine machine was built on stage, with the rest created digitally. THIS PAGE TOP TWO: Before and after. Beach balls provided a lightweight, easily positionable reference for eyelines when shooting live-action that would feature a creature to be added in post.

FALL 2019 VFXVOICE.COM • 21


TV

“This time out, we indicate the final evolutionary step from season 1’s Demogorgon to season 2’s Demodog. Having all these new aspects to the creatures meant it was going to be a really nice playground for us and our vendors. One of the biggest parts of this job is picking the right collaborators, because not every house is good at all things. We got very lucky this season, having Rodeo FX do the creature work.” —Paul Graff, Senior VFX Supervisor

Ultimately, none of these wound up onscreen. “Even so, it was proof of concept for whether we could pull off something very feature-sized,” Graff notes, “and so it did inform how they wrote things, giving the Brothers new ideas about how things could play out after seeing the scale of our testing. Perhaps most importantly, this work gave them more of a feel for the character of the monster. I talked with the head of Rodeo the other day, and we both were saying this was the best way things could have worked out, because that fairly limited investment of resources and time got us to the point where we knew how the monster moved, and had it down cold when executing the finals.”

The whole run of the series has been shot on RED cameras, and for season 3, director of photography Tim Ives began shooting on Leitz Thalia lenses in 8K via the RED DSMC2, which employs the Vista Vision-sized Monstro sensor. “The Duffers like shooting with a 15% overscan,” says Graff, “as it can be a great lifesaver if they choose to reframe or stabilize a shot. So we worked with a center extraction slightly over 7K before entering Ultra HD. Our workflow for the season used ACES 1.03. We built our color pipeline around RED’s wide-gamut RGB. 
Then there’s the show LUT and CDLs for each shot – our vendors deliver QuickTimes with the LUTs applied, so everybody sees things in the same way – which, in the DI, our phenomenal [Deluxe/EFILM digital intermediate] colorist Skip Kimball kept fairly punchy and contrasty.” Just a couple months before the season debuted, Graff remained extremely optimistic about the future of the series. “We’re pretty stoked about how things on our end came out,” he states, “but the bottom line is since there are only two Duffers, even with them seemingly working 24/7, it all kind of revolves around how fast they can turn these great stories out. Whatever we can do to keep them on schedule while hopefully providing some additional inspiration is only going to help it along towards being another big win for us all.”
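The per-shot CDLs Graff mentions follow the ASC CDL form – a slope, offset and power per channel, plus a saturation term. A minimal sketch of that transfer function (the grade values here are illustrative, not the show’s):

```python
import numpy as np

# ASC CDL transfer function: out = (in * slope + offset) ** power,
# followed by a Rec. 709-weighted saturation adjustment.
REC709_LUMA = np.array([0.2126, 0.7152, 0.0722])

def apply_cdl(rgb, slope, offset, power, saturation=1.0):
    """Apply an ASC CDL grade to linear RGB values in [0, 1]."""
    rgb = np.asarray(rgb, dtype=np.float64)
    graded = np.clip(rgb * slope + offset, 0.0, None) ** power
    luma = (graded * REC709_LUMA).sum(axis=-1, keepdims=True)
    return luma + saturation * (graded - luma)

# Hypothetical "punchy and contrasty" grade applied to mid-gray.
pixel = np.array([0.18, 0.18, 0.18])
graded = apply_cdl(pixel, slope=1.2, offset=-0.02, power=0.9, saturation=1.1)
```

In a real pipeline these slope/offset/power/saturation numbers travel with each shot (e.g. as a .cdl or .ccc file), so every vendor and the DI suite can reproduce the same look.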

TOP TWO: Before and after. Conspicuous consumption in action as the Demogorgon feasts near the food court. BOTTOM TWO: Before and after. The mall battle climaxes with a pyrotechnic display worthy of the season’s Fourth of July setting.

22 • VFXVOICE.COM FALL 2019



TECH & TOOLS

LIGHT FIELDS AND THE FUTURE OF VFX By IAN FAILES

TOP LEFT: Unfolding, which shows a musician from different points of view, was developed by Filmakademie Baden-Württemberg. (Image source: Saarland University)

Can light fields change visual effects? The answer to that question might still be uncertain, but it is possible they are already changing how immersive experiences are captured and presented to viewers. What impact light fields will have on VFX is a question that has been asked several times in recent years, as an increasing number of academic researchers and companies experiment in the light fields area. The possibilities include new ways of acquiring images and new approaches to compositing (although reports of the death of greenscreen thanks to light fields may be slightly exaggerated). VFX and CG artists, and researchers working in VR and AR, have certainly already found that light fields open up the degrees of freedom a user may experience in immersive experiences. VFX Voice canvassed a few experts in the field to find out about some of the latest research being done in light fields to see how it might impact visual effects and filmmaking in the near future.

FIRST, WHAT ARE LIGHT FIELDS?

TOP LEFT: The light-field camera built by Saarland University that was then used to make Unfolding, as part of the SAUCE EU Research and Innovation project. (Image source: Saarland University) TOP RIGHT: Close-up on the light-field rig used to make Unfolding. (Image source: Saarland University)

24 • VFXVOICE.COM FALL 2019

Think of a light field as all the light that goes through a particular area or volume of space. How that quickly becomes important in filmmaking and visual effects is in relation to cameras. A typical camera captures rays of light that enter through its lens. But to capture light fields, multiple lenses generally need to be used. This ‘array’ of cameras or lenses ultimately gives the viewer a number of vantage points. In processing that imagery – which can be from scores of vantage points – you can ascertain details about the intensity of light, its angular direction and more. It also means things can be done with the captured light field that cannot typically be done with 2D images or video.

WHAT ARE RESEARCHERS DOING WITH LIGHT FIELDS?
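One classic demonstration of what a captured light field allows – and a staple of light-field research – is synthetic-aperture refocusing: shift each camera’s image in proportion to its offset from the center of the array, then average, and the scene snaps into focus at a chosen depth. A toy sketch on a synthetic light field (array size, image size and disparity are all illustrative):

```python
import numpy as np

def refocus(light_field, alpha):
    """Synthetic-aperture refocus of a (U, V, H, W) light field.

    light_field holds a U x V grid of grayscale sub-aperture images.
    alpha picks the focal plane: each view is shifted in proportion
    to its offset from the array center, then all views are averaged,
    so points at the matching depth align while everything else blurs.
    """
    U, V, H, W = light_field.shape
    cu, cv = (U - 1) / 2, (V - 1) / 2
    out = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            dy = int(round(alpha * (u - cu)))
            dx = int(round(alpha * (v - cv)))
            out += np.roll(light_field[u, v], shift=(dy, dx), axis=(0, 1))
    return out / (U * V)

# Toy light field: one bright point with one pixel of disparity per view.
lf = np.zeros((3, 3, 9, 9))
for u in range(3):
    for v in range(3):
        lf[u, v, 4 + (u - 1), 4 + (v - 1)] = 1.0

sharp = refocus(lf, alpha=-1.0)     # shifts cancel the disparity; point realigns
defocused = refocus(lf, alpha=0.0)  # plain average; energy spreads across views
```

Sweeping alpha moves the focal plane through the scene after the fact – something no single 2D exposure permits.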

Dr. Joachim Keinert, Head of Group Computational Imaging at the Fraunhofer Institute for Integrated Circuits in Germany, has been leading a team in the building of multi-camera arrays for post-production and virtual reality applications. Keinert sees light fields as a possible key contributor to the world of visual effects. “By capturing multiple perspectives from a scene, light fields preserve the spatial structure of a scene,” he says. “Light fields allow us to generate the correct perspective of a scene for any arbitrary virtual camera position. Compared to other technologies for 3D reconstruction, light fields excel by their large number of perspectives and hence by their capability to handle occlusions. Occlusions occur when one object is hiding another one while trying to compute a virtual camera position. This is a challenging problem that can be particularly well handled with light fields.” “Moreover,” continues Keinert, “they can preserve view-dependent appearances, as for instance occurring for specular objects. If your eyes fix on a certain point of such a specular object while moving your head, you will observe that the color and the luminance of this point changes. Since light fields capture a scene from many perspectives, they can record such effects. Finally, you can compute very good depth maps because of the large number of captured perspectives.” Keinert’s team also sees light fields as a promising way of adding photorealism to VR and AR applications. “Light fields can provide the correct scene perspectives without resorting to any meshing,” notes Keinert. “By these means, they are particularly well suited for live or real-time workflows and applications where reality should be reproduced as faithfully as possible.” A Fraunhofer IIS-developed plugin called Realception allows for playback of light-field-captured data. 
Realception for Nuke provides editing and processing features to handle light field data sets, while Realception for Unreal provides a shader that allows embedding the processed light field data set into a CG

TOP RIGHT: Light-field camera array used at Fraunhofer IIS. (Image source: Fraunhofer IIS) BELOW: Light-field capture inside of Nuke. (Image source: Fraunhofer IIS) BOTTOM: The user interface for Fraunhofer IIS’ Realception Unreal Engine plugin. (Image source: Fraunhofer IIS)

FALL 2019 VFXVOICE.COM • 25



TECH & TOOLS

“The tools for the generation of depth maps evolve very fast. Light fields make the task simpler, compared to, for example, the portrait mode of modern smart phones, and offer a significant number of rays to choose from. Hence, even objects partially occluded can be made visible, or look through, hazy or turbid foreground.” —Professor Thorsten Herfet, Saarland University

TOP: Viewing captured light fields with Realception. (Image source: Fraunhofer IIS) BOTTOM: The makeshift light-field array camera used by Google to capture the flight deck of the Space Shuttle Discovery. (Image source: Google)

26 • VFXVOICE.COM FALL 2019

environment. “Both plugins are heavily used in our internal research,” says Keinert, “because they provide the flexibility to adapt the algorithm parameters to the captured footage.” Keinert believes the larger the camera array, the more useful light fields can be to the visual effects industry. “This, however, requires massive capture systems, huge data volumes and long computation times,” he says. “To solve these challenges, we need to come up with algorithms that can reconstruct the light fields from a sparse sampling with highest possible quality. The challenge is to make them perfectly and smoothly fit into today’s available film and virtual reality workflows.” Meanwhile, at Saarland University in Germany, Professor Thorsten Herfet and his team have been part of the EU research and innovation project called SAUCE, which is aimed at re-using digital assets in production. As part of that project, a trailer about a musician, called Unfolding, was filmed using a bespoke light field camera in multiple configurations to test and demonstrate the research. Out of that trailer, Herfet noted the group had come to several findings about the use of light fields in filmmaking. These include, he says, the ability to achieve “flexible depth of field beyond the physics of a single lens. This as well refers to free positioning and partially free of the focal plane as to apertures not practically possible with a single lens.” Herfet adds that depth-based image rendering is something that also looks promising. “The tools for the generation of depth maps evolve very fast. Light fields make the task simpler, compared to, for example, the portrait mode of modern smart phones, and offer a significant number of rays to choose from. Hence, even objects partially occluded can be made visible, or look through, hazy or turbid foreground.” Still, right now the light field camera developed for Unfolding requires some time to set up and calibrate. 
Herfet would like aspects such as focus and aperture to be controlled electronically and jointly for all cameras, for example. “I would tend to say that – comparable to current technology – the array itself could be fixed and solidly mounted, so that different variants can be used for different shooting and purposes. The back end, servers and storage, can then be connected to each of those rig heads, so that consistent usage of monitoring, processing and storage is ensured.”
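The back end of servers and storage Herfet describes has to absorb formidable data rates. A back-of-the-envelope estimate for a hypothetical rig (camera count, resolution, bit depth and frame rate are all assumed figures, not Saarland’s):

```python
# Uncompressed data rate for a hypothetical light-field capture rig.
# All parameters are illustrative assumptions.
cameras = 16
width, height = 4096, 2160   # 4K per camera
bits_per_pixel = 3 * 10      # 10-bit RGB
fps = 30

bits_per_second = cameras * width * height * bits_per_pixel * fps
gigabytes_per_second = bits_per_second / 8 / 1e9
print(f"{gigabytes_per_second:.1f} GB/s uncompressed")  # ~15.9 GB/s
```

Even a short take at such rates lands in the terabytes, which is why the sparse-capture-plus-reconstruction algorithms Keinert mentions are an active research goal.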

LIGHT FIELDS IN ACTION

Several companies have engaged in light field research, directly and indirectly in relation to visual effects. Among those is Light Field Lab, Inc., which is aiming to use light fields in the generation of holographic displays, including by looking for new ways to handle the vast amounts of data produced. “The physics involved in holographic display and data transmission are extremely complex,” notes Light Field Lab CEO and co-founder Jon Karafin. “Historically, the large data footprint required for rasterized light field images has been a significant barrier to entry for many emerging technologies. At Light Field Lab, we are developing real-time holographic rendering technologies to enable the future distribution of light field content to every device over commercial networks.” “In support of this vision,” adds Karafin, “we have co-founded the standards body IDEA (Immersive Digital Experiences Alliance) alongside CableLabs, Charter Communications, OTOY, Visby and a growing list of other industry experts, with the goal of providing a royalty-free immersive data format for efficient streaming of holographic content across next-generation networks.” Karafin is confident that light fields will be an industry standard for future productions. “Today, there are already dozens of studios creating light field and volumetric assets for virtual as well as live-action scene reconstruction. Additionally, recent productions including First Man leveraged 2D LED video wall displays on set to directly render virtual backgrounds for in-camera live-action capture without a greenscreen. In the future, we believe there is great potential to incorporate holographic video walls in production to directly project real environments that are indistinguishable from a physical location.” Paul Debevec, VES, is very familiar to many in visual effects based on his research into high dynamic range imagery, image-based lighting and the Light Stages. 
Now, as Senior Staff Engineer at GoogleVR, Debevec has been responsible for helping to research light fields and their application to virtual reality. That culminated in the VR experience “Welcome to Light Fields,” available as an app on Steam VR for HTC Vive, Oculus Rift and Windows Mixed Reality. The idea of the piece was to enable a viewer to experience several panoramic light field still photographs and environments. In a presentation at SIGGRAPH 2018 in Vancouver called “The Making of Welcome to Light Fields VR,” Google team members noted that most VR experiences “offer at most omnidirectional stereo rendering, allowing a user to see stereo in all directions (though not when looking up and down, and only with the head held level) but not to move around in the scene. To address this, we developed an inside-out spherical light field capture system designed to be relatively easy to use and efficient to operate.” One of Google’s light field experiences in “Welcome to Light Fields” involved the flight deck of the Space Shuttle Discovery, captured with a specialized rig: a GoPro Odyssey Jump camera modified to carry its 16 cameras on a spinning arm. The explorable volume of space in the experience was, for

“Even with synthetic imaging now, blending those pixels against the pixels behind is a very complex anti-aliasing problem. And the answer, up to this point, is to use higher resolution! Which of course is more computation. But what’s going on in terms of extracting that information from photographic extractions of light fields, that remains an unsolved problem.” —John Berton Jr., Visual Effects Supervisor

TOP: Concept art of room-scale holographic display from Light Field Lab, Inc. (Image source: Light Field Lab, Inc.) BOTTOM: Light Field Lab’s vision of a special venue holographic display with their light-field technology. (Image source: Light Field Lab, Inc.)

FALL 2019 VFXVOICE.COM • 27


TECH & TOOLS

“The tools for the generation of depth maps evolve very fast. Light fields make the task simpler, compared to, for example, the portrait mode of modern smart phones, and offer a significant number of rays to choose from. Hence, even objects partially occluded can be made visible, or look through, hazy or turbid foreground.” —Professor Thorsten Herfet, Saarland University

TOP: Viewing capture light fields with Realception. (Image source: Fraunhofer IIS) BOTTOM: The make-shift light-field array camera used by Google to capture the flight deck of the Space Shuttle Discovery. (Image source: Google)

26 • VFXVOICE.COM FALL 2019

environment. “Both plugins are heavily used in our internal research,” says Keinert, “because they provide the flexibility to adapt the algorithm parameters to the captured footage.” Keinert believes the larger the camera array, the more useful light fields can be to the visual effects industry. “This however requires massive capture systems, huge data volumes and long computation times, he says. “To solve these challenges, we need to come up with algorithms that can reconstruct the light fields from a sparse sampling with highest possible quality. The challenge is to make them perfectly and smoothly fit into today’s available film and virtual reality workflows.” Meanwhile, at Saarland University in Germany, Professor Thorsten Herfet and his team have been part of the EU research and innovation project called SAUCE, which is aimed at re-using digital assets in production. As part of that project, a trailer about a musician, called Unfolding, was filmed using a bespoke light field camera in multiple configurations to test and demonstrate the research. Out of that trailer, Herfet noted the group had come to several findings about the use of light fields in filmmaking. These include, he says, the ability to achieve “flexible depth of field beyond the physics of a single lens. This as well refers to free positioning and partially free of the focal plane as to apertures not practically possible with a single lens.” Herfet adds that depth-based image rendering is something that also looks promising. “The tools for the generation of depth maps evolve very fast. Light fields make the task simpler, compared to, for example, the portrait mode of modern smart phones, and offer a significant number of rays to choose from. Hence, even objects partially occluded can be made visible, or look through, hazy or turbid foreground.” Still, right now the light field camera developed for Unfolding requires some time to set up and calibrate. 
Herfet would like aspects such as focus and aperture to be controlled electronically and jointly for all cameras, for example. “I would tend to say that – comparable to current technology – the array itself could be fixed and solidly mounted, so that different variants can be used for different shooting and purposes. The back end, servers and storage, can then be connected to each of those rig heads, so that consistent usage of monitoring, processing and storage is ensured.”
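Herfet’s “flexible depth of field beyond the physics of a single lens” is commonly demonstrated with shift-and-add refocusing: each view from a camera array is shifted in proportion to its baseline and the results are averaged, which synthesizes an aperture as wide as the whole rig and lets the focal plane be chosen after the shoot. A minimal sketch, assuming a small planar array of pre-rectified views (the function name, array geometry and `focus` parameter are illustrative, not taken from the SAUCE project):

```python
import numpy as np

def refocus(images, offsets, focus):
    """Shift-and-add refocusing for a planar camera array.

    images  : list of HxWx3 float arrays, one per camera view
    offsets : list of (dx, dy) camera positions on the array plane,
              in baseline units relative to a reference camera
    focus   : disparity scale; larger values pull the synthetic
              focal plane closer to the rig
    """
    acc = np.zeros_like(images[0], dtype=np.float64)
    for img, (dx, dy) in zip(images, offsets):
        # Shift each view in proportion to its baseline, scaled by
        # the desired focal disparity, then accumulate.
        shift_x = int(round(focus * dx))
        shift_y = int(round(focus * dy))
        acc += np.roll(img, (shift_y, shift_x), axis=(0, 1))
    return acc / len(images)
```

Points whose disparity matches `focus` align across the shifted views and stay sharp; everything else is averaged into blur over an aperture as wide as the array, and changing `focus` after capture repositions the focal plane with no physical lens involved.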

LIGHT FIELDS IN ACTION

Several companies have engaged in light field research, directly and indirectly in relation to visual effects. Among those is Light Field Lab, Inc., which is aiming to use light fields in the generation of holographic displays, including by looking for new ways to handle the vast amounts of data produced. “The physics involved in holographic display and data transmission are extremely complex,” notes Light Field Lab CEO and co-founder Jon Karafin. “Historically, the large data footprint required for rasterized light field images has been a significant barrier to entry for many emerging technologies. At Light Field Lab, we are developing real-time holographic rendering technologies to enable the future distribution of light field content to every device over commercial networks. “In support of this vision,” adds Karafin, “we have co-founded the standards body IDEA (Immersive Digital Experiences Alliance) alongside CableLabs, Charter Communications, OTOY, Visby and a growing list of other industry experts, with the goal of providing a royalty-free immersive data format for efficient streaming of holographic content across next-generation networks.” Karafin is confident that light fields will be an industry standard for future productions. “Today, there are already dozens of studios creating light field and volumetric assets for virtual as well as live-action scene reconstruction. Additionally, recent productions including First Man leveraged 2D LED video wall displays on set to directly render virtual backgrounds for in-camera live-action capture without a greenscreen. In the future, we believe there is great potential to incorporate holographic video walls in production to directly project real environments that are indistinguishable from a physical location.”

Paul Debevec, VES, is very familiar to many in visual effects based on his research into high dynamic range imagery, image-based lighting and the Light Stages.
Now, as Senior Staff Engineer at GoogleVR, Debevec has been responsible for helping to research light fields and their application to virtual reality. That culminated in the VR experience “Welcome to Light Fields,” available as an app on Steam VR for HTC Vive, Oculus Rift and Windows Mixed Reality. The idea of the piece was to enable a viewer to experience several panoramic light field still photographs and environments. In a presentation at SIGGRAPH 2018 in Vancouver called “The Making of Welcome to Light Fields VR,” Google team members noted that most VR experiences “offer at most omnidirectional stereo rendering, allowing a user to see stereo in all directions (though not when looking up and down, and only with the head held level) but not to move around in the scene. To address this, we developed an inside-out spherical light field capture system designed to be relatively easy to use and efficient to operate.” One of Google’s light field experiences in “Welcome to Light Fields” involved the flight deck of the Space Shuttle Discovery, captured with a specialized camera rig – a GoPro Odyssey Jump camera modified to place its 16 cameras on a spinning arm. The explorable volume of space in the experience was, for

“Even with synthetic imaging now, blending those pixels against the pixels behind is a very complex anti-aliasing problem. And the answer, up to this point, is to use higher resolution! Which of course is more computation. But what’s going on in terms of extracting that information from photographic extractions of light fields, that remains an unsolved problem.” —John Berton Jr., Visual Effects Supervisor

TOP: Concept art of room-scale holographic display from Light Field Lab, Inc. (Image source: Light Field Lab, Inc.) BOTTOM: Light Field Lab’s vision of a special venue holographic display with their light-field technology. (Image source: Light Field Lab, Inc.)



TECH & TOOLS

the user, one in which they could move with full six degrees of freedom (6DOF). “Welcome to Light Fields” was also a showcase for efficient rendering of real-time light field VR experiences. The team has since worked on an improved capture rig, fully synthetic scenes with light fields and light field video.

BACK TO VFX, WHAT IMPACT MIGHT LIGHT FIELDS HAVE?

TOP: The lens array camera used by Lytro for its Hallelujah VR experience released in 2017. (Image source: Lytro)

“Today, there are already dozens of studios creating light field and volumetric assets for virtual as well as live-action scene reconstruction. Additionally, recent productions including First Man leveraged 2D LED video wall displays on set to directly render virtual backgrounds for in-camera live-action capture without a greenscreen. In the future, we believe there is great potential to incorporate holographic video walls in production to directly project real environments that are indistinguishable from a physical location.” —Jon Karafin, CEO and Co-founder, Light Field Lab


There are other companies working in the light field space, too, such as Creal3D and reportedly Magic Leap, that are hoping light fields will aid in more mixed reality experiences, possibly to the point that the technology is reduced to wearable glasses rather than goggles. Indeed, much of the light fields research seems to be applicable more to immersive experiences, although a few short years ago it was touted as a technology that, thanks to depth compositing, may make greenscreens obsolete. But that future has not really hit, so far. Experienced Visual Effects Supervisor John Berton Jr., who worked at Lytro for a period before the light field company shut down operations, says depth compositing was an area he and his colleagues had been looking at while at the company. “One of the biggest problems with creating light fields, if you really want good depth information for every pixel, you have to have more than one layer of depth,” observes Berton. “This is the classic problem for so-called depth compositing versus deep compositing. The first layer of information from a light field that you’re going to be able to get from photography is more akin to depth where you have a single value of z-depth per pixel. “One of the problems that we were trying to solve at Lytro was that we can’t do anti-aliasing because we don’t know what the pixel behind it is. So we started developing ways of figuring out how can we stack into each pixel – like true deep compositing – layers of pixel depth values. You can do it if you have a synthetic image, but if you’re working with live-action photography, which is what we were trying to do, then it’s a whole different problem. But even with synthetic imaging now, blending those pixels against the pixels behind is a very complex anti-aliasing problem. And the answer, up to this point, is to use higher resolution! Which of course is more computation.
But what’s going on in terms of extracting that information from photographic extractions of light fields, that remains an unsolved problem.” Berton says more research in machine learning or artificial intelligence relating to computational photography and light fields may well help with extracting the right kind of depth information. “Finding the edges, that’s the hard part. And making those edges extremely fine is also difficult because there’s just too much noise in the picture. However, if you have someone who can go in and show you where that line is supposed to be and you get enough information of that nature, then you can use it to train a network to assist the computation of photography with finding that edge. It’s a really, really interesting prospect.”
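The depth-versus-deep distinction Berton draws can be sketched per pixel: depth compositing keeps one z value and makes a hard front-or-back choice, while deep compositing keeps a list of (z, color, alpha) samples and merges them front to back with the standard “over” operator. A toy single-channel illustration (the sample layout is hypothetical, not any particular deep-image format):

```python
def depth_composite(fg, bg):
    """Single z per pixel: a hard front/back decision, the source of
    the edge aliasing Berton describes. fg and bg are (color, z)."""
    fg_color, fg_z = fg
    bg_color, bg_z = bg
    return fg_color if fg_z < bg_z else bg_color

def deep_composite(samples):
    """Deep compositing: many (z, color, alpha) samples per pixel,
    merged front-to-back with the standard 'over' operator."""
    color = 0.0
    transmittance = 1.0
    for z, c, a in sorted(samples):  # nearest sample first
        color += transmittance * a * c
        transmittance *= (1.0 - a)
    return color
```

For a 50%-coverage edge sample over an opaque background, `deep_composite([(1.0, 1.0, 0.5), (2.0, 0.2, 1.0)])` blends to 0.6 (0.5×1.0 + 0.5×0.2), while the single-z version must snap entirely to foreground or background – the hard edge that shows up as aliasing.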



COVER

CINESITE LEAPS INTO ANIMATED FEATURES WITH THE ADDAMS FAMILY By IAN FAILES

At first, it might seem obvious that a visual effects studio could easily transition into an animation studio. Many of the tools and techniques used in production are the same, and the artist roles, such as riggers or lighters, can also be similar. Studios like Sony Pictures Imageworks, Animal Logic and Mac Guff have made the move into animated features, and several others have dabbled in the area, too. But the path can be a challenging one. Not only does it require a re-thinking of ‘shot production,’ but there are also questions about working as a service provider or generating your own IP. And then there are technological, pipeline and talent issues to consider. One studio that has also made the leap into animated features, in addition to its visual effects offerings, is Cinesite. The studio has already delivered a number of projects and is now showcasing its latest animation wares with The Addams Family, directed by Conrad Vernon and Greg Tiernan, and released by MGM. To go behind the scenes of the journey, VFX Voice spoke to Cinesite key creatives and artists on The Addams Family.

IT ALL STARTED WITH A SPACEMAN SHORT

TOP: A portrait of the Addams family. (Image copyright © 2019 MGM. The Addams Family™ Tee and Charles Addams Foundation. All rights reserved.)


Cinesite began on this road towards animated features after decades as a visual effects studio under parent company Kodak. The studio is now independent, with a presence in London, Montreal and Vancouver (via its merger with Image Engine and acquisition of Nitrogen Studios) and in Munich and Berlin (having acquired Trixter). Visual effects is still a large part of Cinesite’s work, but the push into animation began with Beans in 2013, a short CG film about a group of astronauts who encounter an alien

“With visual effects, you shoot the plates, of course, and then you put the effects and the characters into the plates. Whereas here, particularly with the richness of the Addams family [members], we are rendering everything, and in their own environments. So you’re getting the correct lighting and you get the correct shadows and interaction of the characters.” —Neil Eskuri, Visual Effects Supervisor, Cinesite

TOP: Wednesday Addams meets Parker Needler in The Addams Family. (Image copyright © 2019 MGM. The Addams Family™ Tee and Charles Addams Foundation. All rights reserved.) BOTTOM: Cinesite Visual Effects Supervisor Neil Eskuri and Head of Lighting Laura Brousseau. Both artists worked on The Addams Family. (Image courtesy of Cinesite)

on the moon. “With Beans we were trying to develop our showreel,” says Cinesite Director of Animation Eamonn Butler. “Typically, what most studios would do is just a few tests or do some free work on shows. We said, ‘Well, let’s do something that rises above that, has a narrative, and can stand on its own feet.’” Beans ended up with around 50 million views across various platforms. Buoyed by the success of the short, and looking to develop animated films at less cost than some of the other bigger studios around, Cinesite – then with only a London office – started up a presence in Montreal and began hiring animation development personnel. One of those was Dave Rosenbaum as Cinesite’s Chief Creative Officer, who had previously worked at Illumination and DreamWorks Animation. “When I joined DreamWorks, it was a new company,” states





“When we came [to Montreal] we were the only house doing feature animation. Now there are a few more, which is actually good because it means there’s more of an industry. But you still need to invest in the long term, and that means giving artists a chance to develop their skills over multiple projects, not just one. Or offering them developmental opportunities like becoming a lead or supervisor. We’ve managed to do that.” —Eamonn Butler, Director of Animation, Cinesite

Rosenbaum. “We were asking, ‘What are the new trends in animation? How can you do it better and smarter and more efficiently?’ And that was the same challenge when we started Illumination. Then, when I came to Cinesite, it was, ‘How can we be even more efficient?’” The intention, therefore, was to make films at lower price points than Pixar or DreamWorks typically make. “One of the reasons for sticking the pin in the map on that particular budgetary area was it makes it easier to raise money,” says Butler. “It’s less of a risk, too. It gets you going quicker. That was really important to us.”

THE CHALLENGES OF TURNING A VFX STUDIO INTO AN ANIMATION STUDIO

With budgetary plans in mind, choosing a location, bidding for jobs, pitching projects, establishing a pipeline and recruiting the right talent have all been challenges for Cinesite’s move into feature animation. Although there are certainly crossovers between visual effects and animation, Butler and Rosenbaum caution that the management of VFX versus the management of animated features differs greatly. “Feature animation has to be leaner,” notes Butler. “We don’t look at it as a cost per shot, we look at it as an overall thing. It’s very hard to implement that in visual effects, even though a lot of people want to do it that way. It becomes shot-based there. “Now, shot-based costs are much higher because of that,” continues Butler. “You’re protecting and defending big days on shots because you’re a service company. Whereas in feature animation, we tend to wrap it up like it’s our own movie.” Finding talent and building talent has been one of Cinesite’s major challenges in ramping up in animation. Partly, that is due to the growth of Montreal as a VFX and animation hub, where the recruitment of artists is now quite competitive. “When we came here,” recounts Butler, “we were the only house doing feature animation. Now there are a few more, which is actually good because it means there’s more of an industry. But you still need to invest in the long term, and that means giving artists a chance to develop their skills over multiple projects, not just one. Or offering them developmental opportunities like becoming a lead or supervisor. We’ve managed to do that.”

MAKING THE ADDAMS FAMILY

The Addams Family animated feature is based on the original comics by cartoonist Charles Addams about a strange family deeply entrenched in the macabre who seem to be unaware of their odd fit in society. It includes all the well-known characters – Gomez and Morticia Addams, their children Wednesday and Pugsley, Uncle Fester and Grandmama, the butler Lurch, the disembodied hand Thing and Gomez’s Cousin Itt. Realized previously as different TV series and films, this animated version brings the family into the 21st century. The film was principally made at Cinesite’s Vancouver studio, formerly Nitrogen Studios, although scenes were ultimately shared between Vancouver and Montreal. “The film is around 83 minutes, and Vancouver is doing 66 minutes of the film, with


Montreal delivering 17 minutes,” says Cinesite Visual Effects Supervisor Neil Eskuri. “Montreal did a lot of the asset builds and the initial lookdev, and then that got transferred over to Vancouver.” The studio looked to the original single-panel The Addams Family comics as initial inspiration in order to turn that 2D world into a 3D animated one. “One of the interesting things for us, at first, was just seeing how those comics became character designs and then became 3D models,” says Cinesite Head of Lighting Laura Brousseau. “We took the essence of the comic and story, as well as the design, and brought it to life in this 3D world.” Brousseau and Eskuri say the characters in the film are instantly recognizable – even in their newly animated form – as those from The Addams Family. However, there were things that naturally had to be adapted for the characters in their 3D representations. “For example,” says Brousseau, “one of the things with Morticia, because she’s so sort of naturally skeletal, was that we had to be very careful about how we lit her so that she didn’t get too gaunt or too skeletal looking. We took inspiration from the film noir lighting style on Anjelica Huston in the live-action films where she has this very soft strip of lighting across her eyes in a lot of shots.” Cinesite moved to Foundry’s Katana for lighting on the show, with the idea of introducing a consistent approach to how shots and scenes would be managed. “We had long wanted to bring Katana into the studio,” says Brousseau. “So we saw that as a great opportunity, knowing that we were going to have a high volume of work to do and that we really wanted our artists to focus their time on lighting and artistic things.” Katana – which handled renders out of RenderMan in Vancouver and out of Arnold in Montreal – enabled Cinesite to design lighting across multiple scenes. “Everything’s built in for you to do those things that we need to do in every shot,” points out Brousseau.
“We basically would set up recipes for every scene. And that has meant we spend less time troubleshooting things – ‘Why isn’t my override working? Why is my render pass broken?’ – and we get to spend more time focusing on lighting and making each shot better.” For Cinesite’s artists, The Addams Family was an opportunity to demonstrate just how far the studio has come in terms of animation, in addition to what it has already achieved in visual effects. “With visual effects,” notes Eskuri, “you shoot the plates, of course, and then you put the effects and the characters into the plates. Whereas here, particularly with the richness of the Addams family [members], we are rendering everything, and in their own environments. So you’re getting the correct lighting and you get the correct shadows and interaction of the characters.” For the Vancouver studio, in particular, production on the film has been an extension to what was already becoming a significant amount of animation production in the region. “There’s always new people joining the industry here and coming over from visual effects,” suggests Brousseau. “So you sort of have this continual joy of getting to work with the same people and really seasoned Vancouver artists, and then having new people coming in.” “Vancouver has grown dramatically over the last 10 years,” adds

OPPOSITE TOP: Inside Cinesite Vancouver with Head of Lighting Laura Brousseau as she reviews The Addams Family shots. (Image courtesy of Cinesite) OPPOSITE BOTTOM: An artist at Cinesite Vancouver works on a shot from The Addams Family. The studio is the result of Cinesite’s acquisition of Nitrogen Studios. (Image courtesy Cinesite) TOP: Cinesite Vancouver’s screening room. (Image courtesy of Cinesite) BOTTOM: Cinesite Chief Creative Officer and co-director of Riverdance, Dave Rosenbaum. (Image courtesy of Cinesite)





Eskuri. “Everybody works with everybody else. And that’s nice, because if you’re here, you have a lot of possibilities of other productions that are ramping up.”

WHAT CINESITE IS DOING NEXT

With ambitions to create its own animated content, Cinesite started first as a services company. It has so far worked on The Star, Gnome Alone, The Addams Family and the short, SuperRoach. At the peak of production on The Star, Cinesite in Montreal had 650 artists. Now the studio has the Vancouver office as well, with capacity for between 350 and 500 artists. New projects in production or pre-production across the Vancouver and Montreal locations include Riverdance, Extinct, Princess Awesome and Harold Lloyd. That latter project is based on the life of the actor, comedian and stunt performer who appeared in numerous silent comedies in the early days of cinema. “Harold is one of cinema’s most iconic filmmakers,” reflects Rosenbaum. “He in many ways is the father of comedy. His comedic timing was unprecedented at the time, and comedians today still emulate their styles after him. Having worked at many different studios, you start noticing that so many of the animators are referencing silent films, because those performances had to be exaggerated and you had to convey a story without words. We were watching them reference these icons, like Harold Lloyd, and we were thinking, ‘Why aren’t we just making a movie about him, starring him?’” The studio approached Sue Lloyd, Harold’s granddaughter, who owns the rights and the film library, and managed to option all 200 of Lloyd’s silent films. Then they hired documentarian Leslie Iwerks to find a narrative across these movies. “So we’ve actually constructed an animatic that we can deliver to animators that will be 100% Harold Lloyd,” says Rosenbaum. “He will be performing in every shot, the shot setups, everything that he’s done, except it’ll be a completely new experience. It will take place in 1920s Hollywood. It will be an ode to Hollywood. 
We have some very exciting things planned for it, including we’re going to do it with no dialogue.” The Harold Lloyd film is also part of Cinesite’s plan to have a very diverse slate of films. “We don’t want to be locked into one model,” outlines Butler. “We want to go after the projects and hit the demographics that are underserved, and the projects that aren’t given enough attention. We think that’s where we’ll separate ourselves. We’re not going to separate ourselves by doing a knockoff of what the huge studios are doing. So we need to think of things that are not only creatively interesting stories, but creatively interesting ways to make the actual film.”

TOP TO BOTTOM: Cinesite Director of Animation and co-director of Riverdance, Eamonn Butler. (Image courtesy of Cinesite) A still from Beans, Cinesite’s first animated short. (Image courtesy of Cinesite) Cinesite worked on animation production for Sony Pictures Animation’s The Star. (Image copyright © 2017 Sony Pictures Animation) Another of Cinesite’s projects was Gnome Alone, directed by Peter Lepeniotis. (Image copyright © 2017 3QU Media)

34 • VFXVOICE.COM FALL 2019



PROFILE

JEN UNDERDAHL: FROM MODELMAKER TO MARVEL MAESTRO By TREVOR HOGG

Images courtesy of Jen Underdahl TOP: Jen Underdahl, Vice President of Visual Effects and Stereo, Marvel Studios. (Photo: Jose Armengol)


Marvel Studios Vice President of Visual Effects and Stereo Jen Underdahl was born in Pullman, Washington, and lived in Portland and Okinawa before her family settled in Seattle where she attended high school. Underdahl played basketball, volleyball and rowed throughout high school and into college at UCLA. A favorite pastime was watching Ray Harryhausen movies, such as Jason and the Argonauts and Clash of the Titans, on television. “I was mesmerized by the battling skeletons, Bubo the Owl, Medusa, the Kraken being petrified by Perseus, and Pegasus. But I also liked the Sinbad movies too and remember repeat matinee showings of Sinbad and the Eye of the Tiger as well as Clash. Seemed like movies stayed in theaters longer back then because I felt like my summers were somewhat defined by those two releases.” Underdahl’s early film influences were defined by the science fiction and action genres where special effects helped to drive the storytelling. “Like every other kid growing up in the late ’70s and early ’80s, my mind was blown away by Star Wars, Planet of the Apes, Blade Runner, Raiders of the Lost Ark, and everything else Steven Spielberg. “In 1982, I spent a year in Okinawa, and we had Casablanca and The Wizard of Oz on VHS. Needless to say, I can recite both beginning to end!” The feature debut of Ethan and Joel Coen was a revelation for her. “I was a freshman in high school when I saw Blood Simple, and after I saw that film something fundamentally shifted. I started seeing the medium as something more complicated and interesting. In Seattle, there was no shortage of art-house theaters and I would make my way down to the Market Theater whenever I could. The 4th Man [a Dutch film], Twist and Shout [a Danish film] and Blue Velvet are some that come immediately to mind and which had a lasting impact.” After college, Underdahl taught English for five years, at the same time cultivating a fascination with how art represents and contextualizes the human experience. 
“Growing up outside of Los Angeles you don’t really ever think of filmmaking as something you can actually do as a career, particularly since I grew up in a world where putting food on the table was the primary objective. “I was 30 and did not want to continue in education as a career. I finally found the courage to pursue something I wanted to do instead of something I had to do. What I wanted to do was to build things. I didn’t know what that was going to look like, but I wanted to sculpt, weld, mold, hammer and paint. When I called an acquaintance, who was a special effects tech, to ask how one could get into doing what she did, she brought me out to PA on a couple of commercials. From there, it was making sure I kicked ass on every job so I could get the next gig and so on. Though I already had the aptitude and years of experience working with power tools, I learned a ton about being on set and working under production pressures while working in practical effects. “The official break into the visual effects industry came when I ran into long-time friend Nancy Bernstein at a sushi restaurant downtown,” recalls Underdahl. “She was running Digital

Domain at the time, and their miniature shop was crewing up for some big sequences on The Day After Tomorrow. Nancy asked if I wanted my name in the hat, and I leapt at the chance. It was on that show where I really started to learn a lot more about materials and fabrication, and worked with brilliant artists who were kind enough to teach me the nuances of the job. I loved it. Every minute of it, even the motion-control shoots.” With the rise in the quality of CGI, the modelmaker shifted her focus and became a digital effects coordinator at Digital Domain. “A team of about 10 artists had been working for several months on the building collapse beat in Stealth. We fabricated enough pieces to create two versions of the 30-foot replica to be assembled on top of a 30-foot platform so that once the bomb was set off, the building would implode and drop into a crater created by the explosion.” After seeing the end result during a monthlies screening, the producers decided to do the shot entirely CG. “It didn’t take much of a leap to see that the flexibility for filmmakers and the speed at which large-scale effects could be done [digitally] were only going to increase as time went on.” Underdahl’s introduction to digital filmmaking came through working with Clint Eastwood’s frequent collaborator, Michael Owens (Hereafter), and accelerated with Lana and Lilly Wachowski (The Matrix). “Flags of Our Fathers and Letters from Iwo Jima needed to be photorealistic while Speed Racer was meant to be super arty and trippy. I got pretty excited seeing the breadth of applications of the medium. “During the final delivery of Speed Racer, the Wachowskis and their VFX team would review dailies from other vendors like ILM, Sony Pictures Imageworks, BUF and Rising Sun Pictures in our screening room at Digital Domain. 
Seeing how much of the work was done elsewhere and observing the creative relationship between the directors and their VFX team, I started to realize I didn’t want to be a vendor and only see a slice, I wanted to see how the whole pie is made.” After leaving Digital Domain, Underdahl was the Lead Visual Effects Coordinator on Percy Jackson & The Olympians: The Lightning Thief, where she met the Second Unit Visual Effects Supervisor Christopher Townsend (Ninja Assassin). “Chris and I formed a really solid working relationship on that show. When Chris got hired by Marvel Studios, he asked if I wanted to join him and his producer, Mark Soper, as the production manager. We finished the visual effects for Captain America: The First Avenger together.” A trio of Marvel Studios executives are responsible for the establishment and success of the Marvel Cinematic Universe: President Kevin Feige, Co-President Louis D’Esposito and Executive Vice President of Physical Production Victoria Alonso. “They are true filmmakers,” remarks Underdahl. “Because so many of us have worked in their system for so long now, and the workflow is so well established, we can just knuckle down and get the job done, which allows all of us to do great work.” The projects keep on getting bigger in scope. “Captain America: Civil War was a giant one. It was often referred to as ‘Avengers 2.5.’ But when Avengers: Infinity War and

“I was 30 and did not want to continue in education as a career. I finally found the courage to pursue something I wanted to do instead of something I had to do. What I wanted to do was to build things. I didn’t know what that was going to look like, but I wanted to sculpt, weld, mold, hammer and paint.” —Jen Underdahl, Vice President of Visual Effects and Stereo, Marvel Studios

TOP: Critical to the success of Avengers: Infinity War and Avengers: Endgame was translating the motion-capture performance of Josh Brolin onto Thanos. Work on Thanos was divided between Weta Digital and Digital Domain. (Image courtesy of Marvel Studios) BOTTOM: The entire Leipzig/Halle Airport in Germany was digitally recreated for Captain America: Civil War. (Image courtesy of Marvel Studios)






“Seeing how much of the work was done elsewhere and observing the creative relationship between the directors and their VFX team, I started to realize I didn’t want to be a vendor and only see a slice, I wanted to see how the whole pie is made.” —Jen Underdahl, Vice President of Visual Effects and Stereo, Marvel Studios Avengers: Endgame rolled around, the difference between something like that and The First Avenger is almost like the distance between the Wachowskis and Clint Eastwood projects. We had gone from photoreal, grounded VFX to intensely complex digital character work, and creating worlds in which the Infinity Stones and their derivative magic exist. Fortunately, the growth of the VFX industry over that time period has allowed us to keep up with the growth in vision and innovation of Marvel Studios’ storytelling.” “We have an infrastructure at Marvel that is robust and an IT department that manages all of our dataflow and storage. We were in the very nascent stages of that on The First Avenger,” observes Underdahl. “It felt a bit like punch cards compared to where we are now. Victoria Alonso and Danielle Costa [Vice President of Visual Effects] have been huge in making sure that, in the final weeks of delivery, getting images in front of the filmmakers remains the focus and not the white-knuckle process of worrying about storage space or transfer speeds. “Other efficiencies have also come from being at Marvel for so many years. For instance, we knew the need to get going early on Thanos, The Black Order and Smart Hulk for Infinity War and Endgame. Questions about how we are going to achieve those characters needed to be answered before we started rolling cameras.” Underdahl points out that it is important to be familiar with all aspects of filmmaking. “Because visual effects are the last stop. You have to know the history of the decision-making preceding it so you can best service the imagery. 
“I am lucky to have worked on the films I have at Marvel, and with the lionhearted vendors who have helped us achieve them because each film has pushed us all to deliver what would seemingly be impossible.”

Underdahl has to accommodate last-minute changes that are common with Marvel Studios productions. “Sometimes it’s not 100% what the filmmakers want, but it’s a ‘yes.’ When tasks come past the eleventh hour, you know you have brought on the biggest and best teams to do the most complicated work. The pressure is enormous but also thrilling.” The biggest challenges have been orchestrating the production of Avengers: Infinity War and Avengers: Endgame. “I am driven to be part of something extraordinary and Marvel is that place. I like the pressure and the challenge of, ‘How on earth are we going to do this?’ Then working with talented people to realize it. There is never a moment when you want to push back on the creativity, because that’s where the coolest stuff is made.”

TOP: ILM was tasked with crashing three Helicarriers into each other for the climactic sequence in Captain America: The Winter Soldier. (Image courtesy of Marvel Studios) MIDDLE LEFT: Getting notes from Miniature Supervisor at Cinema Production Services Mike Joyce on the progression of destruction on the house in Zathura (2005) for motion-control passes. MIDDLE RIGHT: On the Brooklyn chase set for Captain America: The First Avenger with Production Manager Sam Breckman and Visual Effects Supervisor Lisa Marra. BOTTOM: Underdahl dressing the set between takes for the London opening sequence on Peter Pan (2003), distributed by Universal Pictures.

TOP: Underdahl on the deck of the collapsing building miniature constructed for Stealth (2005). MIDDLE: At the Oscars for Captain America: The Winter Soldier. BOTTOM: Lola VFX was critical in producing a weakling version of Steve Rogers (Chris Evans) for Captain America: The First Avenger. Skinny Steve had to be produced without the aid of motion-control cameras so as not to interfere with the shooting style of director Joe Johnston. (Image courtesy of Marvel Studios)






FILM

VFX CREWS SHIP OUT ON SPACE ODYSSEY AD ASTRA By KEVIN H. MARTIN

Images copyright (c) 2019 Twentieth Century Fox TOP AND OPPOSITE TOP: McBride (Brad Pitt) and company depart from their base on their lunar sojourn to a launch facility on the other side of the terminator line. Method’s lunar surface and production’s desert footage were composited with MPC’s digital matte painting of the expansive construct.


On its planet-hopping voyage ‘up-river’ to the ends of our own star system, filmmaker James Gray’s Ad Astra, the Fox, New Regency and Plan B release from Disney, follows Army Corps of Engineers’ Roy McBride (Brad Pitt) on a journey into the infinite darkness chasing down his long-lost father (Tommy Lee Jones), last seen in the vicinity of Neptune. To realize his vision, Gray drew on the efforts of several visual effects vendors, including MPC, Weta Digital, Method Studios, MR. X, Vitality VFX, Brainstorm Digital, Atomic Fiction, capital T, Territory Studio and Shade VFX, with Halon providing previs and postvis. MR. X was tasked with the film’s big opening, which takes place in low Earth orbit. “We handled those shots seen in the trailer with that big antenna 80,000 feet up,” MR. X VFX Supervisor Olaf Wendt explains. “McBride is working on it when a mysterious surge damages the structure, knocking him off. Production’s art department generated quite a few concept illustrations for the antenna, a small section of which was created as a full-size set piece that Brad could be seen climbing. “We spent a while conceptualizing the core components, using the International Space Station as one of our touchstones, which is reflected in the materials selected,” continues Wendt. “While the antenna is supposed to be a fairly new structure, things weather from sunlight up there differently than on Earth, with ablation patterns differing due to particle winds from the sun impacting structures. Delving into that aspect while getting close to something that could be built with future tech were principal focuses for us.” Gray’s dedication to a gritty realism aims, in some ways, to go beyond the 2001: A Space Odyssey benchmark and groundbreaking sci-fi/space films since. “His vision required a different

“[Director James Gray] was very clear about having this not look like just the next space picture. Keeping it grounded meant being brave and photochemical in look. There’s a very earthy feel to his work, even before Lost City of Z. If you watch The Yards, it is just steeped in an analog feel. Another element to juggle here is the heightened aspect of a cinematic image, which can be a big factor.” —Olaf Wendt, Visual Effects Supervisor, MR. X

MIDDLE AND BOTTOM: Method addressed the numerous cuts featuring visor reflections. Production switched from tinted visors to clear ones during the shoot, so selective tinting was required to create suitable reflection elements on the visors. Method’s visor replacement combined reflection elements from the live-action shoot as well as their CG structures.

feel,” allows Wendt. “He was very clear about having this not look like just the next space picture. Keeping it grounded meant being brave and photochemical in look. There’s a very earthy feel to his work, even before Lost City of Z. If you watch The Yards, it is just steeped in an analog feel. Another element to juggle here is the heightened aspect of a cinematic image, which can be a big factor.” After being dispatched on his mission, McBride’s first stop is the moon, where he comes under attack from pirates while en route to his next launch vehicle. Method Studios handled the VFX for this sequence, which featured principal photography shot on location at Dumont Dunes in the Mojave Desert. “Fortunately for me, this grueling shoot was supervised by another Method Visual Effects Supervisor, Aidan Fraser,” says Compositing Supervisor Jedediah Smith. [Method’s Ryan Tudhope also supervised part of the shoot before leaving for another project.] “I heard many stories of buckets of ice being used to keep actors and cameras cooled down





FILM

TOP TWO: Method utilized terrain replacement for many shots like the one seen here. BOTTOM TWO: Once combined, the stereo camera rig plates created an ultra-contrast look appropriate for the moon, which Method then enhanced with terrain models that replaced the too-terrestrial grounds seen in the plates.


in the desert heat! Director of photography Hoyte Van Hoytema had an interesting idea to make the photography of the desert look more like the moonscape. He used a stereo rig when shooting many of the wide shots, employing an Arri 435 film camera along with a modified Arri Alexa recording in infrared. When those were married together we got a very striking and stark look that strongly evoked the photography we all know from the NASA Apollo missions.”

Ad Astra isn’t the first film to use infrared shooting to help realize an otherworldly look. That honor goes to 1980’s Galaxina, shot by cinematographer Dean Cundey, which filmed location exteriors on Kodak’s since-discontinued Ektachrome infrared stock. Providing enhanced contrast and delivering bold highlights, Van Hoytema’s digital infrared approach delivered a controllable vision of the extreme conditions on our airless satellite. “Some shots only utilized the infrared camera,” continues Smith. “For those, we had to colorize the footage so it would all cut together, rotoscoping every different material on the rovers, from the metal foil to their red hubcaps.”

Prep for the lunar sequence commenced with a study of NASA’s Apollo archive photos on Flickr. “We spent a long time trying to analyze what makes the moon look so strange and alien,” Smith recalls. “The texture and complexity of the surface is something we studied closely. No air means no wind, which means no erosion. Features on the surface are formed by eons of meteor impacts. To mimic this look we created a base model with the main features of our terrain, then developed a procedural lookdev system to create craters of varying sizes [in] high-resolution detail and a random scattering of rocks. We used Substance Designer and Houdini to drive displacement in our Katana/RenderMan lookdev, lighting and rendering pipeline. Partway through the show, the client added a few wider aerial shots to help make the action clearer.
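Method built its crater system procedurally in Substance Designer and Houdini; purely as an illustration of the general idea — eons of random impacts with a heavy-tailed size distribution stamped onto a heightfield — here is a minimal Python sketch. All function names, the bowl/rim profile and the distribution parameters are invented for this example, not Method's actual setup.

```python
import math
import random

def crater_profile(d, ):
    """Height contribution of one crater at normalized distance d from
    its center: a simple bowl with a raised rim, zero beyond ~1.3 radii."""
    if d > 1.3:
        return 0.0
    if d < 1.0:
        return -(1.0 - d * d)          # bowl interior digs below grade
    return 0.6 * (1.3 - d) / 0.3       # raised rim falling off to zero

def scatter_craters(size, count, seed=7):
    """Stamp `count` craters with power-law radii onto a size x size
    heightfield, mimicking random meteor impacts of all scales."""
    random.seed(seed)
    height = [[0.0] * size for _ in range(size)]
    for _ in range(count):
        cx, cy = random.uniform(0, size), random.uniform(0, size)
        # Heavy-tailed radii: many small impacts, a few large ones.
        r = min(2.0 * random.paretovariate(1.8), size / 4)
        lo_x, hi_x = max(0, int(cx - 1.3 * r)), min(size, int(cx + 1.3 * r) + 1)
        lo_y, hi_y = max(0, int(cy - 1.3 * r)), min(size, int(cy + 1.3 * r) + 1)
        for y in range(lo_y, hi_y):
            for x in range(lo_x, hi_x):
                d = math.hypot(x - cx, y - cy) / r
                height[y][x] += r * crater_profile(d)
    return height

field = scatter_craters(128, 400)
```

In a production pipeline a heightfield like this would drive displacement at render time rather than being evaluated in pure Python, but the layering logic — a base terrain plus scattered, size-varying impact stamps — is the same.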
Those full CG shots were a great opportunity to show off our moon surface environment. We animated rovers, astronauts, and also simulated dust trails and tire tracks, to help tell the story of the pursuit.” During the chase, a support rover becomes disabled from gunfire, while another explodes after taking a hit to its engine. “That was a full CG shot involving rigid-body simulation for the vehicle, plus simulations for all the dust and debris kicked up on the surface and the simulated engine explosion,” Smith reveals. “Later in the sequence, missiles are called in to put an end to the remaining pirate rovers. We created a full CG shot showing two missiles flying overhead, leading to a massive explosion. One of our FX artists had found reference footage simulating a meteor impact in a vacuum. The parabolic shape and behavior of debris were very strange compared to what we’re used to seeing on Earth. We showed it to our client, Visual Effects Producer Allen Maris, and he loved it, so we had a really good target to work towards.” McBride’s rover takes a hit from the pirates, sending it spinning out of control. “The lower lunar gravity would make traction more difficult,” explains Smith, “so the rover goes into a spin before plunging into a giant crater. We got to be really creative with the layout here, placing the rover so the crater wall was behind it in the

“The [pirate] rover goes into a spin before plunging into a giant crater. We got to be really creative with the layout here, placing the rover so the crater wall was behind it in the background, which let us play the drop moment visually as it falls inside, while seeing the pirates stop along the lip of the crater.” —Jedediah Smith, Compositing Supervisor, Method Studios

background, which let us play the drop moment visually as it falls inside, while seeing the pirates stop along the lip of the crater.” The trek out of the crater reveals the terminator line separating day and night on the moon. “Once they are in shadow, we play things realistically dark, only seeing the console lights, the play of headlights on the ground and the starfield overhead,” says Smith. “Then they reach the second spaceport that houses the Cepheus, the ship that will take them on the next leg of their journey. We took the plate shot on location at a military base and added a large spaceport with solar panels and supporting structures.”

The other huge challenge in the sequence involved the visors of the spacesuited combatants. “These helmets were very much in the style of NASA’s Apollo program, with visors like mirror balls that reflect everything,” states Smith. “While they were shooting closer views on stage, tinted visors were used that would color the reflections while making the faces within look more gold and green. But when they realized this impacted being able to see the talent clearly, a switch was made to clear visors, so those shots required us to treat everything to get that gold-mirror look.”

That proved to be a multi-step process. “First, we had to treat what was seen inside the visor,” Smith remarks, “making the faces darker and greener, and then add reflections for the lunar terrain environment. In some instances, that meant animating other characters who were in proximity.
This required a massive rotoscoping effort to separate all the layers, then combine everything in a photorealistic way. All of the stage shots were shot on black with a single distant key light to try to mimic the lighting conditions on the moon. While it did help with realism in the end, it was a huge amount of effort in roto and compositing to get the edges looking nice over the bright background moon surface that we added to these shots.”

After leaving the moon, Pitt’s ship encounters the Vesta, an abandoned vessel. “They suit up and spacewalk over to it, which production shot on stage against black, so again there was a huge amount of rotoscope,” reports Smith. “The client was very interested in realism, and the plates they shot reflected that, with a really harsh, strong single key light. We ended up cheating a bit, adding some fill to keep it from being totally black. The comping challenge was addressing the edge treatment with finesse, plus incorporating our CG Vesta – which was blended with production’s practical hatch element – as a reflection on the visors.”

The ending sequence again involved MR. X, working in

TOP TWO: Method Studios augmented the location shoot, extracting distance-attenuating atmosphere from the plates and adding set extensions based on imagery from the Apollo moon missions, plus a variety of animation enhancements ranging from dust to vehicle explosions. BOTTOM TWO: Some of the vehicular mayhem was accomplished practically on location, then augmented with CG lunar terrain and animated weapons fire.






“[The pirate rover hit by gunfire] was a full CG shot involving rigid-body simulation for the vehicle, plus simulations for all the dust and debris kicked up on the surface and the simulated engine explosion. Later in the sequence, missiles are called in to put an end to the remaining pirate rovers. We created a full CG shot showing two missiles flying overhead, leading to a massive explosion.” —Jedediah Smith, Visual Effects Producer

TOP TWO: A closer view of the action reveals how the infrared pass creates a more pronounced sense of contrast, which when combined with the 35mm film plate, conveys lunar verisimilitude. BOTTOM TWO: McBride’s rover spins out and sails over the lip of a large crater. The out-of-control rotation was facilitated through the use of a motion base rig.


collaboration with MPC. “Our toolset has undergone some refinement in how we handle the sharing of elements, which is what happened here with MPC,” notes Wendt. “Asset sharing between VFX companies has become the norm. While there are efforts to come up with standardized scene description and shader languages, it’s still quite a bit of work to move assets from one company to another and have them look the same. Color is much less of an issue these days, as we have very mature tools in that area.”

A number of shots feature a nuclear blast taking place in space. “We saw some shaky 16mm footage of nuclear tests done high up in the atmosphere from decades back,” Wendt recalls. “This took place well above the cruising altitude of jets, and had a very distinct and unique look that we kept in mind while generating rather elaborate gas-dynamic simulations to get a sense of naturalism. James was after a very stark effect, so it felt a little weird at first, as he felt we had gone a little too sci-fi for him. The toughest development on that process was in the way the layers were applied.”

MR. X simulated the blast with individual layers rendered separately. “We began with those conventional gas dynamics, then broke the blast up with other elements, trying to get specific desired filaments,” Wendt elaborates. “With each phase separated, we found success when recombining them in an unconventional way, evolving the final as a kind of frozen look we introduced into the gas sim, which worked for us and also doesn’t look like anything I’ve seen before. We made an effort to duplicate and retain the fluctuations from that real blast, because they gave us interesting churning patterns, but that involved figuring out how to keep them going long enough to register, plus treating them in a style appropriate to the picture, which called for a certain elegance.
While we’re trying to keep things realistic, we are also matching to something art-directed and lit to create a heightened emotional experience. The black hole in Interstellar is a great example of something [with] a very solid scientific basis, but is also quite beautiful and appropriate to that particular story.”

Wendt considers Ad Astra to be “an ideal project for us, as it allows us to focus our efforts, as opposed to a show with several thousand shots. I think the division of labor on this project allowed the houses to develop and refine looks that serviced the filmmaker’s needs very specifically and, I hope, successfully.”
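Rendering each phase of the blast as its own layer and recombining the layers downstream is, at heart, ordinary element compositing with per-layer control. As a loose, purely illustrative analogy — not MR. X's pipeline; the layer names, weights and the screen-blend choice here are invented — a layered recombination can be sketched as:

```python
def screen(a, b):
    """Screen blend: brightens without clipping past 1.0, a common way
    to stack additive elements such as glows and filaments."""
    return 1.0 - (1.0 - a) * (1.0 - b)

def composite_layers(layers):
    """Recombine separately rendered effect passes (values in 0..1).
    Each entry is (image, weight); per-layer weights let one phase --
    core, filaments, debris -- be dialed in independently."""
    rows, cols = len(layers[0][0]), len(layers[0][0][0])
    out = [[0.0] * cols for _ in range(rows)]
    for image, weight in layers:
        for y, row in enumerate(image):
            for x, v in enumerate(row):
                out[y][x] = screen(out[y][x], max(0.0, min(1.0, v * weight)))
    return out

# Hypothetical 2x2 passes for a blast: a hot core plus faint filaments.
core      = [[0.9, 0.2], [0.2, 0.0]]
filaments = [[0.1, 0.5], [0.5, 0.1]]
final = composite_layers([(core, 1.0), (filaments, 0.6)])
```

The point of the separation is exactly what Wendt describes: once each phase lives in its own pass, a single layer can be frozen, reweighted or retimed without resimulating the whole event.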



PROFILE

ANNE KOLBE: PLAYING A LEADING ROLE IN MAKING CINEMATIC HISTORY By TREVOR HOGG

All images copyright © Warner Bros. Entertainment


Whether conjuring the Lasso of Truth with the help of LED lights in Wonder Woman, creating the ancient underwater civilization of Atlantis in Aquaman, or producing photorealistic digital doubles that can withstand close-up shots in Batman v Superman: Dawn of Justice, Warner Bros. Pictures Executive Vice President, Visual Effects, Anne Kolbe has played a significant role in making these cinematic moments a reality. Over her 15 years as a Hollywood studio executive, Kolbe has collaborated with Christopher Nolan (Dunkirk), Zack Snyder (Man of Steel), Guy Ritchie (Sherlock Holmes), Clint Eastwood (Hereafter), Steven Spielberg (Ready Player One), Ridley Scott (Body of Lies), Ben Affleck (The Town) and George Miller (Mad Max: Fury Road). “Each of those filmmakers has a different need for how they work, and we’re adaptable to their processes as well as respectful of what they bring to the studio,” states Kolbe.

Pursuing a career in visual effects was not an obvious choice for Kolbe, who graduated from UCLA with a BA in History and comes from a family where finance is the occupation of choice. “I worked as a PA on productions for commercials and TV movies and craft services, from being on a talent desk at CAA to working in a non-profit at a family foundation. I’ve gone on such different paths and I ended up at R/GA L.A. back in the mid-1990s. The visual effects industry was in its infantile stage. You were constantly learning and having to adapt to change. There were these new configurations of people and technology. It’s constantly evolving, moving, challenging, and you’re never allowed to rest and sit on your laurels. I found that was something that suited my personality. I like the technical and financial aspects while the creative side fits with my background from college.”

Kolbe developed an affinity for comedies growing up in Diablo, California. “I ‘broke’ into the movie business as a post-production clerk for Warner Bros.,” chuckles Kolbe.
“I first became interested in the production side when I had the opportunity to be a VFX coordinator on Dark Territory [after leaving R/GA L.A.].” Freelance work led to a fateful encounter with a Warner Bros. executive who temporarily left his position to produce Looney Tunes: Back in Action. “I was the Visual Effects Producer on that movie, and when Chris DeFaria finished producing it, Warner Bros. brought him in to oversee the visual effects department, and I came in with him at that time.”

Much of the learning process stemmed from on-the-job training, which taught Kolbe the importance of being adaptable, as well as the need to read, listen and watch everything. “Mentorship has come up more in the last 10 years. There weren’t many mentors when I was starting in the business, especially when you were freelance. Nowadays there is more attention towards mentorship, which is fantastic. You can take certain college or university courses to learn it, but at the end of the day you have to be working on production to understand what the job is, the demands, and know how to handle certain situations.”

The visual effects industry has changed completely from when Kolbe worked as a visual effects production coordinator on Dead Man, directed by Jim Jarmusch. “Before it was just a ‘post process.’ Now we’re an integral part in how we make the film. We deal with

“The driving component for us is to make sure that we have the right creative team with the correct solution for the film and the filmmakers. Tax incentives are driving the business and the global workforce. It’s a factor in the equation of how we put the movies together.” —Anne Kolbe, Executive Vice President, Visual Effects, Warner Bros. Pictures

everything from greenlight content presentations for the studio to overseeing over 80% of the content of all of our films, to delivering not only the 3D but also content for ancillary markets.” The advancement in digital tools has enabled CG solutions for safety concerns and logistical issues regarding locations. “The budgets have gone up correspondingly to handle that content, and that has created a massive global business structure to be able to handle it. The driving component for us is to make sure that we have the right creative team with the correct solution for the film and the filmmakers. Tax incentives are driving the business and the global workforce. It’s a factor in the equation of how we put the movies together. We’ll run metrics, like 15 or 20, on how we’re going to do and award a film.”

In order to be successful in creating visual effects, it is essential to properly pair the visual effects supervisors and producers with the filmmakers, as was the case with John ‘DJ’ DesJardin and Zack Snyder. “DJ and Zack are so similar,” notes Kolbe. “That was back on Watchmen. DJ was an up-and-coming supervisor. He was passionate and knew the comic book genre. They work so well together. We do that a lot. A key component of helping our films get done well is making sure that the relationship is solid between the supervisors and directors.”

Another aspect of the job is having brainstorming sessions with visual effects vendors. During the making of 300: Rise of an

OPPOSITE TOP: Anne Kolbe. OPPOSITE BOTTOM: Aerial blimp footage was utilized to reconstruct London during World War I for Wonder Woman. TOP: A thought-to-be-extinct Megalodon gets brought back to life for The Meg. BOTTOM: Black Manta (Yahya Abdul-Mateen II) pursues Aquaman/Arthur Curry (Jason Momoa) in one of the few location scenes in Aquaman, set in Sicily.






“We have to figure out an approach that helps us get to content that looks amazing and is more efficient, but also preserves what is the most expensive part of the process, which is the artists’ time.” —Anne Kolbe, Executive Vice President, Visual Effects, Warner Bros. Pictures

TOP LEFT: Newt Scamander (Eddie Redmayne) gets acquainted with a baby Niffler in Fantastic Beasts: The Crimes of Grindelwald. TOP RIGHT: King Ricou (Djimon Hounsou) is accompanied by Queen Rina (Natalia Safran) and Fisherman Princess (Sophia Forrest) in one of the 2,300 visual effects shots featured in Aquaman. BOTTOM LEFT: Special effects created a shaker rig to vibrate the shield as if it was being hit by several machine-gun rounds a second as Wonder Woman races across No Man’s Land. BOTTOM RIGHT: Wonder Woman/Diana Prince (Gal Gadot) battling Ares on an airfield in Wonder Woman.


Empire, Scanline VFX met with Kolbe to discuss how to shoot the water action dry-for-wet in front of a greenscreen and to develop the look of the four different battle sequences. “That one had a bunch of challenges: budget, schedule and location. What I do, especially when there’s a film that’s particularly challenging, is go to a company that might have a software solution or a creative and technical solution, bring them in and partner with them to figure out a holistic workflow plan on the film that could achieve the objective. I enjoy working with a company like Scanline VFX because we have a good relationship in trying to think outside of the box.”

A balancing act takes place between the financial and creative requirements of a project. “It is tricky, but at the end of the day we have to make a really good movie,” observes Kolbe. “The number one focus is to figure out with the filmmakers how we’re going to do that. We have gotten to be good at it, but that is never easy. There’s never one solution to any problem. We’re always looking at multiple ways to handle a problem or a scenario that we foresee

coming up. You are constantly having to make sure that you’re keeping your priorities in line, but also understand that you have to be able to handle shifts and changes.” Keeping an open mind is critical. “I might ask questions that are completely off the wall, but I always want to challenge our visual effects supervisors and producers to try to think of a better way to get the content.”

Post-production schedules are becoming shorter. “You can use technology to help you solve some of the problems. At the end of the day, humans can only do so much in a day, but technology is always advancing. We have to figure out an approach that helps us get to content that looks amazing and is more efficient, but also preserves what is the most expensive part of the process, which is the artists’ time.”

“Our industry will have nothing but more growth in the future, whether it’s augmented reality, virtual reality, or whatever that next thing is,” believes Kolbe. “It’s all going to be part of creating visual content across not only those different areas, but also different platforms that are derivative of the film content.”

The job of managing the visual effects department for Warner Bros. is constantly evolving. “The most interesting part is not only the amount of content we touch but also the complexity of it. When I look across my desk and I have 35 movies I’m in some way, shape or form involved in, there are different challenges in each and every one of them. Ten years ago, we didn’t have to deal with this type of complexity in figuring out how to make the movies, and it’s not only that, but also the financial structure of how we’re going to be able to deal with it.”

States Kolbe, “For me, I like challenges and being part of figuring out solutions. Whenever there is a situation where a film goes well or suddenly has to deal with a challenge that we have to overcome to get a film into the theatre, that is where I get the most satisfaction. Just knowing that we were able to do a good job against all of the obstacles and put out a great-looking movie to the best of our abilities. Our department takes a lot of pride in what we do and the quality of the work that comes out of the studio. That, to me, is what drives me and keeps me going to work every day.”

TOP LEFT: King Orm (Patrick Wilson) meets with King Nereus (Dolph Lundgren) to discuss the fate of Atlantis in Aquaman. TOP RIGHT: Arthur Curry (Jason Momoa) travels with Mera (Amber Heard), who has hydrokinetic and telepathic powers, in Aquaman. MIDDLE: An adult Niffler, attracted to shiny objects, has the ability to locate treasure and cause mayhem if kept indoors in Fantastic Beasts: The Crimes of Grindelwald. BOTTOM: A significant Kolbe collaborator and contributor to Warner Bros.’ success with the Harry Potter and Fantastic Beasts franchises has been Oscar-winning production designer Stuart Craig.

FALL 2019 VFXVOICE.COM • 49


PROFILE

“We have to figure out an approach that helps us get to content that looks amazing and is more efficient, but also preserves what is the most expensive part of the process, which is the artists’ time.” —Anne Kolbe, Executive Vice President, Visual Effects, Warner Bros. Pictures

TOP LEFT: Newt Scamander (Eddie Redmayne) gets acquainted with a baby Niffler in Fantastic Beasts: The Crimes of Grindelwald. TOP RIGHT: King Ricou (Djimon Hounsou) is accompanied by Queen Rina (Natalia Safran) and the Fisherman Princess (Sophia Forrest) in one of the 2,300 visual effects shots featured in Aquaman. BOTTOM LEFT: Special effects created a shaker rig to vibrate the shield as if it was being hit by several machine-gun rounds a second as Wonder Woman races across No Man’s Land. BOTTOM RIGHT: Wonder Woman/Diana Prince (Gal Gadot) battling Ares on an airfield in Wonder Woman.




FOCUS

VFX AND ANIMATION SCHOOLS: KEEPING UP WITH THE ESSENTIALS By CHRIS MCGOWAN

TOP: Ringling College students and teachers in the greenscreen room on the Motion Design floor. (Photo courtesy of Ringling College of Art and Design)


Leading VFX and animation schools these days must ensure that students graduate “studio-ready,” says co-owner and director Ria Bénard of Lost Boys Studios in Canada. “With more and more demand for complicated visual effects shots, studios have much less time to train new artists, so they need the graduate to hit the ground running, transition into studio work faster and more efficiently.” Ron Honn, Florida State University Visual Effects Filmmaker in Residence, adds, “The biggest, and I find most rewarding, challenge for preparing students for work in the industry is preparing a curriculum that is software agnostic – to find the essential technologies that all VFX artists need to know.” To keep up with the essentials, schools are always in “learning mode,” to use Honn’s phrase, with many faculty members also working in the industry. As students acquire such skills, they also learn how to connect with potential employers via networking, mentoring, showcases, recruitment events, joint projects with studios and other means. The best schools, out of a few hundred operating globally, teach or facilitate all of the above and more. Following is a sampling of leading VFX programs. This is not intended to be a complete listing. RINGLING COLLEGE OF ART AND DESIGN, Sarasota, Florida www.ringling.edu Degrees: Motion Design, Computer Animation, Virtual Reality Development, Game Art, Film, and Photography & Imaging. Number of students: 870 VR/AR: Ringling launched a Virtual Reality Development major, a four-year BFA degree program, in the fall of 2018. In it, students learn to design, create and analyze immersive experiences in various industries. Keeping up with technology: “We keep up with technology by making sure we have faculty that are active in the industry. Their professional experience is brought into the classroom. As full-time faculty members, we value collaboration between professors and work to integrate new technology into our flexible curriculum,” says Ed Cheetham, Head of the Department of Motion Design. Connecting: “In addition to the vast connections brought to our students through the working faculty, the department brings in industry leaders to talk and engage with our students. Every year it holds a conference called FutureProof – five days of presentations, workshops, talks and networking events. In addition, each year over 100 recruiters come to campus to present, meet and interview students.” Challenges: “Although VFX is such a broad field encompassing so many different skillsets within this discipline, ultimately the biggest challenge comes down to helping students develop a critical and aesthetic eye,” continues Cheetham. “Motion Design Professor and VFX artist Dante Rinaldi uses a phrase that sums up the challenges for his students: ‘You can’t fix what you don’t see.’” RISD (RHODE ISLAND SCHOOL OF DESIGN), Providence, Rhode Island www.risd.edu Degrees: BFA Film/Animation/Video. Number of students: 143 Connecting: “RISD’s Career Center offers a comprehensive program of one-on-one counseling, portfolio reviews, seminars and other targeted events aimed at helping undergraduates, graduate students and alumni translate their creativity into meaningful and rewarding careers,” notes Sheri Wills, Film/Animation/Video Department Head. Comments: “We develop the artist as a whole, knowing that our graduates need the flexibility to make time-based work that continually pushes the cultural dialogue forward. Through a rigorous examination of both technique and concept, we teach students to closely and critically consider every aspect of their work, the work of their peers, and contemporary and historic works,” says Wills. Students regularly analyze and solve technical and aesthetic problems, and benefit from exposure to critical review, film festivals and visiting artists and specialists.

“We develop the artist as a whole, knowing that our graduates need the flexibility to make time-based work that continually pushes the cultural dialogue forward.” —Sheri Wills, Film/Animation/Video Department Head, Rhode Island School of Design

TOP: Students at the John C. Hench Division of Animation and Digital Arts, which is part of the USC School of Cinematic Arts (SCA). (Photo courtesy of USC) BOTTOM: Students work on a project for the Film/ Animation/Video department at Rhode Island School of Design, a private, nonprofit college founded in 1877. (Photo courtesy of Rhode Island School of Design)






JOHN C. HENCH DIVISION OF ANIMATION AND DIGITAL ARTS, USC SCHOOL OF CINEMATIC ARTS (SCA), Los Angeles, California www.cinema.usc.edu/animation/index.cfm Degrees: BA and MFA in Animation & Digital Arts. Number of students: 120 VR/AR: Classes in Immersive Media and a lab where students can collaborate with others across SCA divisions as well as across schools. Keeping up with technology: “Many of our professors are engaged in top productions outside of USC. For example, John Brennan, who teaches Motion Capture, worked on the recent remakes of The Jungle Book and The Lion King. We have other top industry professionals teaching master classes – all are working for major Hollywood studios,” notes Professor Teresa Cheng, chair of the John C. Hench Division. This keeps students in touch with tech trends. “Many technology companies are also connected with us to introduce and demonstrate their latest innovations.” Connecting: “We have an annual event called Studio Day. This year, 33 studios/companies came on campus for full-day portfolio reviews with 80 students. At SCA, we also have staff members whose jobs are to connect all students to potential internships and first jobs. Based on our own industry contacts, which include ex-colleagues and outreach people, we are often notified directly about open positions. SCA also actively features industry speakers on a regular basis and hosts an annual event called Talent Week.” Challenges: “The landscape of the whole film industry is changing rapidly, so keeping up with what is happening outside of our campus is essential. Our focus is on content making, which is always in demand and forms the foundation for all types of production. Tools and methods come and go, but developing strengths in storytelling is critical for our students’ careers.”

TOP AND BOTTOM: Ninety-nine percent of SCAD’s alumni are employed within 10 months of graduation, according to Jeff Light, Chair of Visual Effects, Savannah College of Art and Design. (Photos courtesy of Savannah College of Art and Design)


SCAD (THE SAVANNAH COLLEGE OF ART AND DESIGN), Savannah, Georgia; Atlanta, Georgia; Hong Kong; Lacoste, France www.scad.edu Degrees: BFA, MA and MFA in both animation and visual effects degree programs. Number of students: 200 undergraduates and 51 graduates (Visual Effects degree program for fall 2018/spring 2019). VR/AR: SCAD’s new Immersive Reality degree program launched this year. Keeping up with technology: “We send faculty members to innovative conferences like SIGGRAPH, Unreal Academy, and GDC to research and network about evolving technology,” says Jeff Light, SCAD Chair of Visual Effects. “In addition, we have at least two alumni mentors that visit our classes multiple times throughout the academic year to share their experience with the latest industry tools and methods.” Connecting: SCAD president and founder Paula Wallace and the SCAD faculty put an emphasis on “bringing leading industry professionals to SCAD’s global locations to interact with the students through master classes, portfolio reviews, recruitment opportunities, signature events, and the university’s SCADPro initiative,” according to Light. “SCADPro is a design shop and innovation studio that generates business solutions for the world’s most influential brands.” New in 2019: “By the end of last year, it was abundantly clear that the visual effects industry is rapidly moving into the area of rendering in real-time. Previously, this seemed to be more in the purview of game design, but the clients we worked with in VFX increasingly demonstrated a need for rendering shots with Unreal Engine or Unity. The work our students produce is recognized as stunning in the area of photorealism and dynamic simulation, and the next level is to achieve that caliber of imagery in real-time, especially for AR/VR applications.” ANIMATIONSINSTITUT DER FILMAKADEMIE BADEN-WÜRTTEMBERG, Ludwigsburg, Germany www.animationsinstitut.de/en Degrees: Animationsinstitut offers basic studies and project-based/postgraduate studies in the fields of Animation and Interactive Media. Number of students: In basic studies, 12 students are admitted in each academic year. In project studies/postgraduate studies up to 16 students can participate. VR/AR: VR, AR, 360° and other XR techniques are an essential part of the curriculum. Keeping up with technology: “Training follows a practice-oriented and project-based approach and is always state of the art,” says Professor Andreas Hykade, Director of Animationsinstitut. This includes giving students the opportunity to implement 360° and VR projects. Hykade adds, “We only work with guest lecturers who work in the industry, which ensures that industry standards are taught. The Research and Development department of Animationsinstitut consists of a multidisciplinary team dedicated to the creation of innovative tools and technologies.
In addition to commercially available tools, Animationsinstitut relies on its own developments like VPET (Virtual Production Editing Tools) or the Facial Animation Toolset for Autodesk Maya.” Connecting: “National and international professionals from the film and media industry pass on their knowledge and their experience to the students in seminars, mentorships and workshops. Many of them offer students the possibility to work for current projects during their internship semester,” says Hykade. Challenges: “To provide the knowledge for our students to be able to develop their own IPs.” New in 2019: Over the years, R&D at Animationsinstitut has explored a wide variety of topics that have been crucial in media productions of today and tomorrow. The current projects have a strong focus on Digital Actors, Virtual Production, Immersion, and the potential of Virtual and Augmented Reality in narrative entertainment productions.

TOP AND MIDDLE: The R&D department at Animationsinstitut der Filmakademie Baden-Württemberg has developed student projects with Foundry, DNEG, NCAM, Trixter, Ikinema and Stargate Studios, among others. (Photos courtesy of Animationsinstitut der Filmakademie Baden-Württemberg) BOTTOM: MA Digital Effects students use a Panasonic VariCam in a greenscreen tank, in a promotional shot used for a Panasonic case study. The National Centre for Computer Animation at Bournemouth University was the first educational institution in the U.K. to use Panasonic’s VariCam digital video camera. (Photo courtesy of Bournemouth University)

NATIONAL CENTRE FOR COMPUTER ANIMATION, BOURNEMOUTH UNIVERSITY, Bournemouth, U.K. www.bournemouth.ac.uk/about/our-faculties/faculty-media-communication/our-departments/national-centre-computer-animation Degrees: BA Computer Animation Technical Arts, BA Computer Animation Art and Design, BA Visual Effects, MA 3D Computer Animation, MA Digital Effects, MSc Computer Animation and Visual Effects. Number of students: About 300 undergraduate, 80 master’s and 30 doctoral students (in VFX/animation studies). Keeping up with technology: “We are fortunate to host one of the most vibrant and active research centers within the technical areas of Computer Graphics and associated digital arts in the world,” observes Richard Southern, Co-Head of the National Centre for Computer Animation. “As an example, the drive in the industry towards automation facilitated by innovations in AI has been mirrored in our own internationally recognized research.” Connecting: “There are several opportunities for students to connect, such as the BFX Festival, Masterclass (a credit-bearing undergraduate unit in which students engage in an industry-set and mentored project), access to industry mentors, weekly visiting speakers, and degree shows for each graduating cohort.” Challenges: “We recognize that there is a general convergence between the disciplines of games, VR, AR and film, which will shake up the animation education sector in the coming decade – although our department is certainly ready for these impending challenges.”

TOP AND BOTTOM: Gnomon’s course topics run the gamut from traditional art to character and creature animation, modeling and texturing, rigging, visual effects animation, game art, compositing, scripting and programming for visual effects, lighting and rendering, and more. (Photos courtesy of Gnomon)


GNOMON, Hollywood, California www.gnomon.edu Degrees: Bachelor of Fine Arts in Digital Production. Gnomon additionally offers an accredited two-year Certificate Program, which covers foundational 3D skills before allowing students to emphasize in further areas of study: Character & Creature Animation, Modeling & Texturing, Visual Effects Animation, Games or 3D Generalist. Number of students: Approximately 450 students at Gnomon study VFX, animation or game art. Online: Gnomon offers a selection of online courses available to residents in California and those outside of the United States. VR/AR: Most of Gnomon’s 3D software courses teach skills that are applicable to creating VR and AR content. The campus features a VR lab for students looking to explore these mediums in a dedicated space. Keeping up with technology: “Having a constant flow of industry professionals teaching our classes, visiting campus for special events and evaluating our curriculum gives us deep insight into the trends and advancements in production pipelines,” notes Shannon Wiggins, Director of Placement and Alumni Relations. Connecting: “To help prepare students for the industry, we regularly host employer days, where studio representatives are invited to meet with other students, review their reels and provide feedback. Through studio engagement, networking activities, and industry-related events showcasing the latest artistic and CG techniques, students at Gnomon gain a broad understanding of

the operational characteristics of different studios, insight into the current job market, and comprehensive knowledge of how to navigate a career in the digital production industries.” Challenges: “Giving the students a perspective on how they can use the raw skills they are learning in a creative context and to appreciate the staggering possibilities they are inheriting as CG artists,” comments Beau Jansen, VFX Educational Lead. Further comments: “Gnomon has a dual-customer philosophy – we’re not just educating students, we’re supplying the VFX and animation industry with qualified graduates. There’s a sense of responsibility there,” comments Brian Bradford, Executive Director of Enrollment Management. SVA (SCHOOL OF VISUAL ARTS), New York, New York www.sva.edu Degrees: MFA in Computer Arts, BFA Computer Art, Computer Animation and Visual Effects, and BFA Animation. Number of students: 100 graduate students. Keeping up with technology: “The faculty body consists of adjunct instructors who are all working professionals – all work in the industry and teach part time at SVA MFA Computer Arts. This helps the program stay on top of technology that the industry is utilizing,” notes Hsiang Chin Moe, Director of Operations, MFA Computer Arts. Select collaborations: SVA students have worked with Plymptoons Studio to transform alumnus Bill Plympton’s traditional 2D animation into a VR experience. SVA hosted an event with Epic Games’ Unreal Engine in 2018. Cartoon Network offered a short residency to work on-site on a project for OK K.O.! Let’s Be Heroes, a series created by alumnus Ian Jones-Quartey. Connecting: “SVA has a one-year-long required class, Digital Art Seminar, where it invites guest lecturers to speak about their work. Students get to meet them and learn about their experiences. SVA hosts an annual career fair where students get to meet employers for job/internship opportunities.
It also hosts many events and meetings for organizations like the Visual Effects Society, Women in Animation, and the N.Y. Users Groups for Maya and Houdini. Additionally, SVA’s student and alumni film and animation showcases (SVA Premieres Hollywood and After School Special in New York City) are opportunities for students to meet professionals and working alumni.” Challenges: “There is a careful balance in ensuring students have the skills to step into a studio and work on someone else’s project while also ensuring they are developing their own individual creative vision to give the world something new and original.”

“We recognize that there is a general convergence between the disciplines of games, VR, AR and film, which will shake up the animation education sector in the coming decade – although our department is certainly ready for these impending challenges.” —Richard Southern, Co-Head of the Department of National Centre for Computer Animation, Bournemouth University

TOP AND BOTTOM: At the School of Visual Arts in New York, the entire faculty body consists of adjunct instructors who are working professionals in the industry teaching part-time at SVA MFA Computer Arts. (Photos courtesy of School of Visual Arts)

VANCOUVER FILM SCHOOL (VFS), Vancouver, Canada www.vfs.edu Degrees: One-Year Diplomas in Animation Concept Art, 3D Animation & VFX, and Classical Animation. Number of students: Over 350 per year. VR/AR: Eight-month program in VR/AR Design and Development. Keeping up with technology: VFS has a continuous dialogue

with industry partners and its faculty is always testing new software or production solutions with the students. “As a private, one-year program model we can pivot very quickly on new tech,” says Colin Giles, Head of Faculty of the Animation & VFX Program. Connecting: “We have a staff of over 90 industry-experienced professionals working with our students – the majority of whom are currently working in industry positions. Our mentorship program is a formal way of students working directly with experts in their chosen field. We also have industry nights where graduating students showcase their work to industry pros and recruiters.” Goals: “Our goal is to provide a place for students to train their artistic voice and allow for experimentation, while training for all the skills they need to enter the industry.” New in 2019: “We have developed a fully integrated relationship with Beyond Capture – an in-house private company providing groundbreaking performance and motion capture to industry and our students.” THINK TANK TRAINING CENTRE, Vancouver, Canada www.tttc.ca Degrees: Think Tank offers no “degrees.” “We find the work is what the studios really want to see. Degrees are great and all, but in our industry having the degree helps with visas, etc., but not much else,” says Think Tank co-founder Scott Thompson. Number of students: 80-90 students. Online: “We are expanding our online school into new career paths. We look forward to adding Compositing, Matte Painting, Coding for Games, and Character Animation to the online training platform.” Keeping up with technology: “Think Tank is often the first school to implement new software worldwide. We were the world Beta school for Mudbox and Marvelous Designer. As far as I know we were among the very first to teach Mari, ZBrush and Substance Designer across the globe,” notes Thompson. Connecting: “We have 36 instructors from the industry. They are an excellent conduit for employment. 
Students also attend many studio tours we arrange and studios regularly visit Think Tank looking for talent.” Challenges: “Staying current and on top. There is a lot of competition these days. More than 400 schools worldwide.”

TOP AND MIDDLE: The Vancouver Film School strives to ensure that its students are prepared for the long term by being adaptable problem solvers who can navigate a constantly shifting and evolving entertainment industry. (Photos courtesy of Vancouver Film School) BOTTOM: Greenscreen filming at Lost Boys Studios with owners Mark and Ria Bénard. (Photo courtesy of Lost Boys Studios)


LOST BOYS STUDIOS, Vancouver, Canada; Montreal, Canada www.lostboys-studios.com Degrees: Five specialized, career-focused programs. Vancouver: Advanced Visual Effects Compositing (12 months), Effects Technical Director (12 months) and Digital Lighting Artist (four months). Montreal: Effects Artist (five months) and Compositing Artist (five months). Number of students: Around 50 in the Vancouver location and 17 in the Montreal startup. Select collaborations: “When a studio needs training in a specialized area, they come to us to collaborate on designing a program

to meet their needs,” says co-owner and co-director Ria Bénard. “When Sony was in need of Katana lighters, they approached us knowing we are the preeminent, specialized training center. This is how our Digital Lighting Artist certificate program was born.” Keeping up with technology: “We constantly tweak our systems to reflect current industry standards. We meet with leads and supervisors as well as working alumni to ensure we are always relevant,” says founder and co-director Mark Bénard. “We consult with studios who collaborate to keep our programs [up-to-date]. The input they provide during reel reviews, screenings, tours and presentations allows our programs to be in a state of continual evolution,” adds Ria Bénard. Connecting: “We are fortunate to have been teaching students since 2006, with experienced graduates still actively working in the industry and happy to help new Lost Boys graduates,” says Compositing Instructor Andrew Zeller. “Lost Boys attends events and career fairs with students, such as SIGGRAPH, Spark FX Conferences, the Vancouver Digital Entertainment Career Fairs, and more. Lost Boys also runs a public online job board (vfxvancouver.com) used by major studios throughout Vancouver, Canada.” Further comments: “We keep our class size small (around 8 to 12 students per class) so our instructors can get to know every student and help them individually improve,” says Ria Bénard. NEXTGEN SKILLS ACADEMY, London, England, and nine other U.K. locations www.nextgenskillsacademy.com Degrees: NextGen is an academy consisting of member institutions. Its network of Further Education (FE) colleges offers pre-university courses (“Level 3”). Students study for the AIM Awards Level 3 Extended Diploma in Games, Animation and VFX Skills. Courses are full-time and last two years. Number of students: 500 registered this year across 10 FE colleges, with locations around the U.K. More colleges are coming on board for the 2019-2020 academic year. 
VR/AR: Its FE colleges are offering VR/AR options as part of the Higher National courses and NextGen facilitates industry briefs and links to ensure the courses are relevant. Keeping up with technology: “We meet with our Employer Steering Group quarterly. The group includes Sony Interactive Entertainment, Ubisoft Reflections, Blue Zoo, Framestore, DNEG, The Imaginarium, Centroid 3D, Jagex, The Mill and Playground Games. We work closely with them to ensure our course is always teaching the latest skills and industry practices,” says Phil Attfield, VFX and Animation Partnership Director. Connecting: “Students work to real industry briefs with feedback and regularly visit studios. We also facilitate industry placements, host industry pop-up labs at colleges and organize the Graduate Showcase where students present their work to leading employers from the industry,” says Attfield. Challenges: “Adapting course content to the shifting needs of employers as their pipelines and procedures adjust to take

TOP: Think Tank has a Jules Verne-inspired décor and is building an airplane in its lobby. (Photo courtesy of Think Tank Training Centre) BOTTOM: The University of Hertfordshire’s student works have been shortlisted in over 140 animation and film festivals worldwide, according to Martin Bowman, Joint Animation Program Leader. (Photo courtesy of the University of Hertfordshire)

advantage of new technology.” New in 2019: “We have been building and promoting the progression route from Level 3 to VFX Apprenticeships, opening up opportunities at more studios for apprenticeship placements. We have also been offering support for the introduction of Higher National courses at our network of FE colleges.”

“We’re not just educating students, we’re supplying the VFX and animation industry with qualified graduates. There’s a sense of responsibility there.” —Brian Bradford, Executive Director of Enrollment Management, Gnomon

UNIVERSITY OF HERTFORDSHIRE, Hertfordshire, England www.herts.ac.uk Degrees: BA (Honors) Visual Effects for Film and Television, BA (Honors) 3D Computer Animation and Modelling, BA (Honors) 3D Games Art and Design, and BA (Honors) 2D Animation and Character for Digital Media. Number of students: About 387 across all four degrees, including Games Art and Animation postgraduate students studying at the MA level. VR/AR: “We teach VR skills in the Games Art and Design degree and cover 360-degree video compositing in our VFX degree,” says Martin Bowman, Joint Animation Program Leader. Keeping up with technology: “Our staff are all industry trained and constantly updating their skills. They keep in touch with current developments in the industry either via their own contacts or via our alumni, many of whom are in senior positions in companies and are happy to advise the course on the way the industry is changing. We also take part in software beta programs to ensure we have an understanding of how software and pipelines will change over the next few years,” says Bowman. Connecting: The university’s “Animation Exposé” event in May, now in its 14th year, showcases outstanding student films from the Digital Animation courses and is attended by VFX and animation industry representatives. Challenges: “Learning the wide range of skills required to teach a fast-changing, intensive subject at an industry level of quality. 
And then working out how to explain that to students so that they can understand these skills without any prior experience. Helping a student go from zero to hero in only three years!”

TOP: Escape Studios at Pearson College continuously has speakers visiting to share breakdowns of projects and give career tips. (Photo courtesy of Pearson College) MIDDLE AND BOTTOM: At Florida State, the visual effects program mirrors the visual effects industry to simulate a real-world experience. The program aims to ensure that students are exposed to the latest trends and technologies that drive the industry. (Photos courtesy of Florida State University)


ESCAPE STUDIOS, PEARSON COLLEGE, London, England www.pearsoncollegelondon.ac.uk/escape-studios.html Number of students: 213 undergraduate VFX and Animation students currently enrolled. Keeping up with technology: Online forums, MeetUps, software demos, professional talks, conferences and peer-learning opportunities. Escape Studios engages with the industry

often to hear about new projects studios have completed and their future tech objectives, and sometimes collaborates to narrow the gap between academic training and the real world. Connecting: Escape Studios continuously has speakers visiting to share breakdowns of projects and give career tips. It also has industry mentors giving feedback to students on their projects. It encourages them to contact recruiters in their final stages of study to enquire about available positions and to promote a strong online presence. And part of the undergraduate assessment process includes industry feedback. Goals: Escape Studios usually looks for instructors with experience in senior artist positions or lead artist roles, as they will have more people-management skills and exposure to training junior artists on the job. COLLEGE OF MOTION PICTURE ARTS, FLORIDA STATE UNIVERSITY, Tallahassee, Florida www.film.fsu.edu Degrees: The College of Motion Picture Arts offers two BFA majors, in Production and in Animation & Digital Arts, and an MFA in Production. Number of students: Around 30 students admitted annually who study VFX and/or animation in the above departments. Keeping up with technology: “We stay on top of emerging tech as a school through our faculty, who are working filmmakers with ties to industry,” says Jason Maurer, Animation Filmmaker in Residence. Connecting: “The policy and practice of our faculty is an ongoing process of reaching out to industry colleagues on a regular basis, introducing our students to them for mentoring, and engaging them within the community upon – and even before – graduation. Faculty visit N.Y. and L.A. as often as possible to connect with post and VFX facilities, asking what they’re looking for in new artists, and using the opportunity to talk about our program and our students,” says Ron Honn, Visual Effects Filmmaker in Residence. 
Challenges: “I jokingly explain to my students that my job is to prepare them for ‘wilderness survival’ with basic tools – the five or eight scaffolding concepts you absolutely must know in order to critically describe and plan for executing VFX,” notes Honn. New in 2019: Honn adds, “The advent of the new Torchlight Center in 2019 means, among other things, a new program incorporating AR/VR into pre-vis and virtual production, using new software that allows a filmmaker to map a space and place production assets and even actors and CG into it for the purposes of planning and shooting elements like camera angles and blocking.” Comments: “It’s all about connection. As artists, we are doing work to connect with someone on an emotional level. Teaching how to design a performance, an edit, a character, a shot that truly connects with the audience – that is a challenge. Mostly because so many people think that it’s all about knowing just the software, or only having to be good at tech or the tools. The best students are the ones who embrace connection and realize that the tool is the means through which connection happens and not the end,” says Maurer.

Other Notable VFX Schools
More highly-ranked colleges that offer degrees and/or programs in VFX and/or digital animation:
Academy of Art University, San Francisco, California www.academyart.edu
ArtFX, Montpellier, France www.artfx.school/en
DigiPen Institute of Technology, Redmond, Washington www.digipen.edu
Digital Animation & Visual Effects (DAVE) School, Orlando, Florida www.daveschool.com
Entertainment Technology Center, Carnegie Mellon University, Pittsburgh, Pennsylvania www.etc.cmu.edu
Gobelins, L’École de L’Image, Paris, France www.gobelins-school.com
Polytechnic Institute, Purdue University, West Lafayette, Indiana www.polytechnic.purdue.edu
Ravensbourne, London, England www.ravensbourne.ac.uk
Sheridan College, Ontario, Canada www.sheridancollege.ca
The University of the Arts, Philadelphia, Pennsylvania www.uarts.edu

“Our focus is on content making, which is always in demand and forms the foundation for all types of production. Tools and methods come and go, but developing strengths in storytelling is critical for our students’ careers.” —Professor Teresa Cheng, Chair of the John C. Hench Division, USC School of Cinematic Arts

FALL 2019 VFXVOICE.COM • 59


FOCUS

“We’re not just educating students, we’re supplying the VFX and animation industry with qualified graduates. There’s a sense of responsibility there.” —Brian Bradford, Executive Director of Enrollment Management, Gnomon

advantage of new technology.” New in 2019: “We have been building and promoting the progression route from Level 3 to VFX Apprenticeships, opening up opportunities at more studios for apprenticeship placements. We have also been offering support for the introduction of Higher National courses at our network of FE colleges.”

UNIVERSITY OF HERTFORDSHIRE, Hertfordshire, England www.herts.ac.uk
Degrees: BA (Honors) Visual Effects for Film and Television, BA (Honors) 3D Computer Animation and Modelling, BA (Honors) 3D Games Art and Design, and BA (Honors) 2D Animation and Character for Digital Media.
Number of students: About 387 across all four degrees, including Games Art and Animation postgraduate students studying at the MA level.
VR/AR: “We teach VR skills in the Games Art and Design degree and cover 360-degree video compositing in our VFX degree,” says Martin Bowman, Joint Animation Program Leader.
Keeping up with technology: “Our staff are all industry trained and constantly updating their skills. They keep in touch with current developments in the industry either via their own contacts or via our alumni, many of whom are in senior positions in companies and are happy to advise the course on the way the industry is changing. We also take part in software beta programs to ensure we have an understanding of how software and pipelines will change over the next few years,” says Bowman.
Connecting: The university’s “Animation Exposé” event in May, now in its 14th year, showcases outstanding student films from the Digital Animation courses and is attended by VFX and animation industry representatives.
Challenges: “Learning the wide range of skills required to teach a fast-changing, intensive subject at an industry level of quality. And then working out how to explain that to students so that they can understand these skills without any prior experience. Helping a student go from zero to hero in only three years!”

TOP: Escape Studios at Pearson College continuously has speakers visiting to share breakdowns of projects and give career tips. (Photo courtesy of Pearson College) MIDDLE AND BOTTOM: At Florida State, the visual effects program mirrors the visual effects industry to simulate a real-world experience. The program aims to ensure that students are exposed to the latest trends and technologies that drive the industry. (Photos courtesy of Florida State University)


ESCAPE STUDIOS, PEARSON COLLEGE, London, England www.pearsoncollegelondon.ac.uk/escape-studios.html
Number of students: 213 undergraduate VFX and Animation students currently enrolled.
Keeping up with technology: Online forums, MeetUps, software demos, professional talks, conferences and peer learning opportunities keep staff up to date. Escape Studios engages with the industry often to hear about new projects studios have completed and their future tech objectives, and sometimes collaborates to narrow the gap between academic training and the real world.
Connecting: Escape Studios continuously has speakers visiting to share breakdowns of projects and give career tips. It also has industry mentors giving feedback to students on their projects. It encourages students in their final stages of study to contact recruiters to enquire about available positions and to promote a strong online presence. And part of the undergraduate assessment process includes industry feedback.
Goals: Escape Studios usually looks for instructors with experience in senior or lead artist roles, as they will have more people-management skills and exposure to training junior artists on the job.

COLLEGE OF MOTION PICTURE ARTS, FLORIDA STATE UNIVERSITY, Tallahassee, Florida www.film.fsu.edu
Degrees: The College of Motion Picture Arts offers two BFA majors, in Production and in Animation & Digital Arts, and an MFA in Production.
Number of students: Around 30 students admitted annually who study VFX and/or animation in the above departments.
Keeping up with technology: “We stay on top of emerging tech as a school through our faculty, who are working filmmakers with ties to industry,” says Jason Maurer, Animation Filmmaker in Residence.
Connecting: “The policy and practice of our faculty is an ongoing process of reaching out to industry colleagues on a regular basis, introducing our students to them for mentoring, and engaging them within the community upon – and even before – graduation. Faculty visit N.Y. and L.A. as often as possible to connect with post and VFX facilities, asking what they’re looking for in new artists, and using the opportunity to talk about our program and our students,” says Ron Honn, Visual Effects Filmmaker in Residence.
Challenges: “I jokingly explain to my students that my job is to prepare them for ‘wilderness survival’ with basic tools – the five or eight scaffolding concepts you absolutely must know in order to critically describe and plan for executing VFX,” notes Honn.
New in 2019: Honn adds, “The advent of the new Torchlight Center in 2019 means, among other things, a new program incorporating AR/VR into pre-vis and virtual production, using new software that allows a filmmaker to map a space and place production assets and even actors and CG into it for the purposes of planning and shooting elements like camera angles and blocking.”
Comments: “It’s all about connection. As artists, we are doing work to connect with someone on an emotional level. Teaching how to design a performance, an edit, a character, a shot that truly connects with the audience – that is a challenge. Mostly because so many people think that it’s all about knowing just the software, or only having to be good at tech or the tools. The best students are the ones who embrace connection and realize that the tool is the means through which connection happens and not the end,” says Maurer.

Other Notable VFX Schools
More highly-ranked colleges that offer degrees and/or programs in VFX and/or digital animation:
Academy of Art University, San Francisco, California – www.academyart.edu
ArtFX, Montpellier, France – www.artfx.school/en
DigiPen Institute of Technology, Redmond, Washington – www.digipen.edu
Digital Animation & Visual Effects (DAVE) School, Orlando, Florida – www.daveschool.com
Entertainment Technology Center, Carnegie Mellon University, Pittsburgh, Pennsylvania – www.etc.cmu.edu
Gobelins, L’École de L’Image, Paris, France – www.gobelins-school.com
Polytechnic Institute, Purdue University, West Lafayette, Indiana – www.polytechnic.purdue.edu
Ravensbourne, London, England – www.ravensbourne.ac.uk
Sheridan College, Ontario, Canada – www.sheridancollege.ca
The University of the Arts, Philadelphia, Pennsylvania – www.uarts.edu

“Our focus is on content making, which is always in demand and forms the foundation for all types of production. Tools and methods come and go, but developing strengths in storytelling is critical for our students’ careers.” —Professor Teresa Cheng, Chair of the John C. Hench Division, USC School of Cinematic Arts



FILM

TRANSLATING DISNEY’S 2D ANIMATED FEATURES INTO LIVE-ACTION By IAN FAILES

You may have noticed that 2019 has been an epic year for Disney films based on 2D animated classics. The Lion King, Aladdin and Dumbo are among the major films from the studio that have brought the original animated films into a new dimension, either as live-action or fully photoreal features. With more such films in the works from Disney, including Maleficent: Mistress of Evil, Lady and the Tramp and Cruella – all of which likely have a large CG component – visual effects studios are being called upon to help translate what had previously been seen only as ‘flat’ animation into enriched characters and worlds.

So, what does that involve, exactly? What are the challenges of taking often highly expressive 2D animated characters into the photoreal 3D world? Do VFX studios need to specifically retain certain parts of the 2D performance and the film? VFX Voice asked artists from The Lion King, Aladdin and Dumbo about how the translation process worked from their point of view.

TURNING A 2D CHARACTER INTO 3D

All images copyright © 2019 Disney Enterprises, Inc. TOP: Mena Massoud as Aladdin in the Guy Ritchie film. Although many of the same events happen as in the 1992 animated version, this new live-action movie had to be grounded much more in reality.


One of the first questions faced by visual effects teams on these 2D to ‘live-action’ films is, ‘How will the main characters be realized?’ Visual effects studios regularly work from production designs, concept art and filmmaker notes. It’s a collaborative effort to bring these characters to the screen, and in the case of characters such as Simba, Genie or Dumbo, audiences also know them very well. That means that any major departures from the original have to be thought through carefully.

In the case of 2019 Aladdin’s Genie (voiced by Will Smith, who also drove the performance of the character via facial capture in the Guy Ritchie film), ILM was tasked with crafting the character in his blue form completely in CG. A predominant trait of the 2D character in the original film from 1992 was that Genie (voiced by Robin Williams) often changed forms. “In 2D animation, one of the easiest things to do is change a shape,” notes ILM Animation Supervisor Steve Aplin, who says in 3D it is usually harder to go ‘off-model’ and do so many shape changes. “The way we approached it in the new movie was not so much shape changing, but costume changing. We did do experiments with stretching limbs, but we found we came away from the physicality, which you need for a live-action film. We couldn’t quite push Genie around and deform him as much, although we do it in some places.”

Similarly, the character of Abu, a capuchin monkey, has extreme expressions and movement in the original Aladdin. But in the new film, ILM felt that Abu had to appear much more photoreal, so that the audience would ‘buy it’ and not question whether it was a real capuchin. “Abu started off probably a bit more human, a bit more character-ful,” states ILM Visual Effects Supervisor Mike Mulholland. “And then he gradually steered toward some rules that we established, which were basically that every performance, every action, needed to be based on a real-world action. We found we couldn’t replicate what was in the 2D animation, and so then we had to draw on other sources, like real-world footage, to get something that was believable within our new environment for the film. A lot of it ended up being about finding the right reference and beginning to stick a performance together using real monkey footage as a guide.”

ACTING FROM ‘LIVE-ACTION’ ANIMALS

A similar challenge lay ahead for the visual effects team from MPC on Tim Burton’s Dumbo, a re-imagining of the 1941 film. “You’ve got this highly emotive 2D cartoon animation that you have to re-realize in a photorealistic way, using the technology we have,” says Production Supervisor Richard Stammers, who hails from MPC. “Once you start taking that hugely expressive 2D animation into the photoreal world, you’re suddenly becoming a lot more limited in the range of expressions you have. So getting to that point where you can actually still capture the feeling of the original cartoon can be hard.”

For MPC, that ultimately meant that their baby elephant was – compared to a real elephant – quite different. He remains a caricature in the new film, but, as Stammers notes, one “feeling as real as possible.” The CG Dumbo retained a cartoon-quality cute factor and was somewhat unusually proportioned. To alleviate the mismatch between that and the real world, the filmmakers also crafted a slightly expressionistic world for him to live in. “I think that was an important thing, trying to get the film to have this sort of coherent look, while still being photoreal, but not hyper-real,” adds Stammers.

The cute factor – partly achieved with an oversized head and large eyes – initially wasn’t something MPC was going to replicate, but after looking at real baby elephants, which are incredibly wrinkly and hairy, they realized that simply creating a photoreal baby elephant was not going to work. Still, the team did visit zoos and collect a huge amount of reference for Dumbo, but then pulled back on the detail in their CG model. “Tim wanted more of an idealized elephant,” remarks MPC

“The way we approached it in the new [Aladdin] movie was not so much shape changing, but costume changing. We did do experiments with stretching limbs, but we found we came away from the physicality, which you need for a live-action film. We couldn’t quite push Genie around and deform him as much, although we did do it in some places.” —Steve Aplin, Animation Supervisor, ILM

TOP: ILM used Disney Research Zurich’s new solver, Anyma, to help capture Will Smith’s performance as Aladdin’s Genie. BOTTOM: Director Guy Ritchie talks to Naomi Scott, who plays Jasmine, about a scene that heralds the arrival of Aladdin.






The Lion King: Projecting Emotions

TOP: Aladdin’s grand arrival, as Prince Ali. MIDDLE: The live-action film was shot on sound stages in the U.K. and on location in Wadi Rum, Jordan. BOTTOM LEFT: A greenscreen buck was used on set for some scenes in Dumbo in place of the baby elephant. BOTTOM RIGHT: MPC deliberately kept the ‘cute factor’ in realizing their CG elephant, with larger-than-life proportions such as the head and eyes.

Visual Effects Supervisor Patrick Ledda. “So, we essentially ended up creating texture maps almost from scratch, painted by hand, where we removed so much of the real, organic detail to create this idealized version. That made it very difficult – we had to look at the micro-detail of their skin to try and get something photoreal, but without having all of these other imperfections that real elephants have.”

MPC was also behind the characters in The Lion King, and again faced the challenge of translating what were highly expressive characters in the 1994 animated feature to photoreal versions for Jon Favreau’s film. The director and MPC had previously tackled The Jungle Book, which had some similar challenges. For the new Lion King, however, Favreau was “determined that we were even more documentary in feeling than The Jungle Book,” relates Visual Effects Supervisor Adam Valdez from MPC. “He wanted to find a balance in an even more naturalistic way.”

The difficult part of striking that balance came when the characters needed to launch into musical numbers in the Favreau film, since singing is obviously something real animals do not do. In the 2D animated film, the musical numbers were incredibly theatrical. This wasn’t an option for the new Lion King. “What we had to really do,” says Valdez, “was, in everything from camera

While Jon Favreau’s The Lion King utilized a raft of the latest filmmaking and visual effects techniques – including virtual cinematography, VR scouting and real-time rendering tools – in order to be made, one of MPC’s major challenges in ‘selling’ the photoreal characters to the audience was allowing viewers to relate to them even though they were not smiling, frowning or otherwise behaving as they had done in the 1994 2D animated film. Luckily, MPC already had some experience with this for The Jungle Book. “The basic trick we learned on that previous film,” says MPC Visual Effects Supervisor Adam Valdez, “was that people project emotions onto inanimate objects – the weather, animals, other people, all kinds of stuff. It’s something we naturally do, so it’s not hard to believe an animal is having feelings when you see it doing things that you perceive and project onto it as either sad or happy, lazy, slow, tired or hungry. Those basic physical and mental states are really easy to project onto animals, and you just have to find the things that make you connect for whatever intention you have.” Valdez equates that with the many YouTube videos out there of cats or dogs that appear to look guilty when they’ve done something wrong. “You project all the human emotion onto

TOP LEFT: The cleaning sequence was filmed with a black-covered Dumbo stand-in. BOTTOM LEFT: Effects simulations for soap and water made the final shots possible.


TOP RIGHT: Simba the lion cub talks to red-billed hornbill Zazu in Jon Favreau’s The Lion King.

those things,” he says. “That means we as VFX artists can really look through lots of reference, and study our subject, and observe the moments that we think look like focus, hunger, danger, or when an animal is giving you the blank stare of a predator and they look scary. Then we think we might know how an audience might react, and we use it. “It’s kind of like reverse engineered method acting,” suggests Valdez. “And you realize that a little bit of brow movement just right, from a completely human point of view, with the same animation that you would use on a human – is something that you can borrow in some cases. Then it’s a matter of the animation department trying it on and tuning scenes until it just kind of feels right. You’re using the way that humans project how they must be feeling onto other living things to get the emotion.”

TOP RIGHT: MPC’s CG Dumbo model. The character, brought into the 3D world, still exhibited many behaviors from the 2D animated film, but had to ‘exist’ in the real, if slightly stylized, world. BOTTOM RIGHT: The final shot. MPC looked to real elephants for texture reference, but then pulled back on detail slightly to keep Dumbo as somewhat more cartoon-ish.






“[For the new Lion King] what we had to really do was, in everything from camera work and lighting, to the design of the world and set design, just make sure we had understated realism, but also really carefully composed images. We had to allow for light to be dramatic, and allow for the animals to be emotive so that you never felt emotionally disconnected.” —Adam Valdez, Visual Effects Supervisor, MPC

work and lighting, to the design of the world and set design, just make sure we had understated realism, but also really carefully composed images. We had to allow for light to be dramatic, and allow for the animals to be emotive so that you never felt emotionally disconnected.

“So,” adds Valdez, “when the film rose up into a musical passage, it didn’t feel like you were literally cutting from something like a documentary to a Broadway feeling, or something extreme. It’s kind of a delicate tonal balance that starts with acting, I would say, and then goes on to set design and lighting and then ultimately music.”

MEMORABLE MOMENTS FROM 2D, IN LIVE ACTION

TOP LEFT: Mufasa and Simba in Jon Favreau’s The Lion King. The film is completely CG. TOP RIGHT: VR scouting of synthetic sets, virtual cinematography techniques matching what real cameras could achieve, and real-time rendering were just some of the tools used to bring this version of the film to life. BOTTOM: Although the animals in this new Lion King could not be as expressive as they were in the 2D animated 1994 film, MPC did find ways to exploit small behaviors in the characters that would deliver certain emotions.


The live-action re-imaginings of these 2D animated films do not exactly replicate the originals in terms of characters or shot design, but there are times when similarities occur or are necessary. For instance, the magic carpet in the original Aladdin and in the new film are relatively similar. That’s because, suggests ILM’s Aplin, “you’re already stretching the boundaries of reality so much with that carpet. I mean, it’s a living, breathing carpet, so I don’t think audiences are going to judge whether, ‘Oh, does it look like a real carpet, could a carpet do that?’ It’s already out of the box, it’s not real as soon as it starts moving. It works so well in the 2D animated film, so we thought if we could get some of that character into our CG version, then we were going to try to hold on to it.”

For Dumbo, MPC’s Stammers notes there was not a conscious decision from the filmmakers to craft scenes exactly like the original. But he does point to a scene from the 1941 cartoon where Dumbo visits his mother when she is locked up in the train carriage; there, the design and the compositions were made to be similar to those shots. Other than that, animators at MPC often referred to the cartoon for how to pose a baby elephant. “I’d say to the animators,” recalls Stammers, “‘here’s what the cartoon did, and here’s the nearest thing to the reality of a baby elephant doing something similarly cute.’ When we had the model of Dumbo as a sculpture that Tim Burton liked, the first thing we did with that was to start posing it, and using the cartoon as reference.”

Meanwhile, Valdez identifies the gorge scene in the new Lion King, when Mufasa saves Simba, as a key one that saw the filmmakers make significant reference to the original. “It’s a good example of where the animated film is a masterpiece,” Valdez says. “It’s really about rapid cutting and it’s got some of my favorite 2D squash-and-stretch moments in it, if you look at the frames where Mufasa gets hit and tumbles to the ground. He just stretches three times his length, it’s incredible! But half the things in it we couldn’t do because they’re just physically implausible. The way he climbs the cliff, and the way he leaps from nowhere out of a running herd of wildebeest. Some of that ‘super duper’ stuff we don’t do because it breaks the believability somehow. “We had to say, ‘What’s our movie version of that?’ Jon Favreau and the Animation Supervisor Andy Jones broke it down beat by beat, and they really worked on it for many months, analyzing

everything. They’d look back at the original film, saying, ‘Could we do that? Should we do that?’ It was sometimes yes, sometimes no – a very, very iterative process. And that original film is loved by so many people. When you’re constantly concerned about how people are going to take your choices compared to something they love, it’s definitely extra work. A lot of extra work.”

TOP LEFT: Young Simba is revealed in the ‘Circle of Life’ scene from the film, which echoes the original. TOP RIGHT: A grown-up Simba lounges with Timon and Pumbaa. BOTTOM: Scar takes control of the Pride Lands with the help of hyenas.



FILM

[“For the new Lion King] what we had to really do was, in everything from camera work and lighting, to the design of the world and set design, just make sure we had understated realism, but also really carefully composed images. We had to allow for light to be dramatic, and allow for the animals to be emotive so that you never felt emotionally disconnected.” —Adam Valdez, Visual Effects Supervisor, MPC

work and lighting, to the design of the world and set design, just make sure we had understated realism, but also really carefully composed images. We had to allow for light to be dramatic, and allow for the animals to be emotive so that you never felt emotionally disconnected. “So,” adds Valdez, “when the film rose up into a musical passage, it didn’t feel like you were literally cutting from something like a documentary to a Broadway feeling, or something extreme. It’s kind of a delicate tonal balance that starts with acting, I would say, and then goes on to set design and lighting and then ultimately music.”

MEMORABLE MOMENTS FROM 2D, IN LIVE ACTION

TOP LEFT: Mufasa and Simba in Jon Favreau’s The Lion King. The film is completely CG. TOP RIGHT: VR scouting of synthetic sets, virtual cinematography techniques matching what real cameras could achieve, and real-time rendering were just some of the tools used to bring this version of the film to life. BOTTOM: Although the animals in this new Lion King could not be as expressive as they were in the 2D animated 1994 film, MPC did find ways to exploit small behaviors in the characters that would deliver certain emotions.

64 • VFXVOICE.COM FALL 2019

The live-action re-imaginings of these 2D animated films do not exactly replicate the originals in terms of characters or shot design, but there are times when similarities occur or are necessary. For instance, the magic carpets in the original Aladdin and in the new film are relatively similar. That’s because, suggests ILM’s Aplin, “you’re already stretching the boundaries of reality so much with that carpet. I mean, it’s a living, breathing carpet, so I don’t think audiences are going to judge whether, ‘Oh, does it look like a real carpet, could a carpet do that?’ It’s already out of the box, it’s not real as soon as it starts moving. It works so well in the 2D animated film, so we thought if we could get some of that character into our CG version, then we were going to try to hold on to it.” For Dumbo, MPC’s Stammers notes there was not a conscious decision from the filmmakers to craft scenes exactly like the original. But he does point to a scene from the 1941 cartoon in which Dumbo visits his mother while she is locked up in the train carriage; the design and the compositions were made to be similar to those shots. Other than that, animators at MPC often referred to the cartoon for how to pose a baby elephant. “I’d say to the animators,” recalls Stammers, “‘here’s what the cartoon did, and here’s the nearest thing to the reality of a baby elephant doing something similarly cute.’ When we had the model of Dumbo as a sculpture that Tim Burton liked, the first thing we did with that was to start posing it, and using the cartoon as reference.”

Meanwhile, Valdez identifies the gorge scene in the new Lion King, when Mufasa saves Simba, as a key one that saw the filmmakers make significant reference to the original. “It’s a good example of where the animated film is a masterpiece,” Valdez says. “It’s really about rapid cutting and it’s got some of my favorite 2D squash-and-stretch moments in it, if you look at the frames where Mufasa gets hit and tumbles to the ground. He just stretches three times his length, it’s incredible! But half the things in it we couldn’t do because they’re just physically implausible. The way he climbs the cliff, and the way he leaps from nowhere out of a running herd of wildebeest. Some of that ‘super duper’ stuff we don’t do because it breaks the believability somehow. “We had to say, ‘What’s our movie version of that?’ Jon Favreau and the Animation Supervisor Andy Jones broke it down beat by beat, and they really worked on it for many months, analyzing

everything. They’d look back at the original film, saying, ‘Could we do that? Should we do that?’ It was sometimes yes, sometimes no – a very, very iterative process. And that original film is loved by so many people. When you’re constantly concerned about how people are going to take your choices compared to something they love, it’s definitely extra work. A lot of extra work.”

TOP LEFT: Young Simba is revealed in the ‘Circle of Life’ scene from the film, which echoes the original. TOP RIGHT: A grown-up Simba lounges with Timon and Pumbaa. BOTTOM: Scar takes control of the Pride Lands with the help of hyenas.

FALL 2019 VFXVOICE.COM • 65


FILM

“We had to build over 30 cars, so you need the previs to budget this kind of stuff, and to break it down into different rigs and specialty equipment.” —Phedon Papamichael, Cinematographer, ASC, GSC

VFX ON A FAST TRACK WITH FORD V FERRARI By TREVOR HOGG

Images copyright © 2019 Twentieth Century Fox Film Corporation TOP: Director James Mangold (standing, left) and a cameraman behind the ARRI ALEXA LF camera.

66 • VFXVOICE.COM FALL 2019

As much as filmmaker James Mangold (Logan) believes in the old-school mantra of capturing everything in-camera, he is not afraid to utilize digital technology to enhance the storytelling. Such is the case with Ford v Ferrari (or Le Mans ’66 in the U.K.), which depicts the efforts of American automobile designer Carroll Shelby (Matt Damon) and British driver Ken Miles (Christian Bale) to construct the Ford GT40 and beat Ferrari at the 1966 24 Hours of Le Mans race in France. Extensive previs was created before the production actually started. “We had to build over 30 cars, so you need the previs to budget this kind of stuff, and to break it down into different rigs and specialty equipment,” explains cinematographer Phedon Papamichael ASC, GSC (Nebraska), who reunited with Mangold for their fifth collaboration. “Even for the car scenes, we shot the previs piece but always rolled additional footage. We’re looking for something to happen that is less designed and sterile, which becomes a useful moment in telling the story.” The previs was essential in mapping out the Le Mans race, which is supposed to happen over a 24-hour period and encompasses almost an hour of screen time. “There is no single place to shoot the Le Mans track in France,” explains Mangold. “Le Mans doesn’t resemble at all what it was like in the 1960s, which was a series of specific country roads that formed a loop. There is a long straightaway hemmed in by trees that’s called the Mulsanne Straight, plus the grandstand area. Le Mans itself was a Frankenstein’s monster of multiple locations sewn together.”

Locations were situated in Southern California and throughout Georgia. “To complete one lap in our movie you move through essentially four or five locations, and yet the physical relationship of all of the cars has to remain constant for it to be a continuous race. Then you had weather continuity at the same time, because in order to communicate that this is a 24-hour race I felt that we needed to have periods of rain, night, dawn, sunset and broad day.” Visual effects were critical in fixing continuity errors. “[Visual Effects Supervisor] Olivier Dumont (The New Mutants) and his team were so great in assisting or adjusting our shots that we ended up getting them involved in a lot of them. Some of the shots were as simple as adjusting a clock to the right time. There were so many variables appearing in frame at any given moment that could deny the continuity, so there was a lot to deal with.” The Le Mans grandstands were the biggest build both practically and digitally. “I worked closely with production designer François Audouy (Logan), who had already gathered so much research,” states Dumont. “I also went to France to see how the track was for Le Mans.” A whole wall of 40-foot x 40-foot bluescreens was maneuvered from shot to shot. “The art department constructed three of the 12 buildings, and we didn’t have any grandstands on the other side of the track. Every year more bleachers were added, so we had to have fewer of them in 1959 compared to 1966. We had to pay attention to what existed at the time.” Sometimes compromises had to be made. “We shot in Fontana for Daytona,” Dumont says. “Daytona has a big lake in the middle of it, whereas Fontana doesn’t. We pretended that we never see that part of it. We were free to come up with what we thought was good for the movie.” Assisting the production was the creation of singular environments.
“Unlike your average Marvel or Star Wars movie where there are a myriad of locations and worlds, there are not so many of those in Ford v Ferrari,” notes Mangold. “It was more about creating one hyper-real world and allowing every crevice of it to feel real.” The departments were collaborative with one another. “François Audouy was concerned that the graphics be convincingly 1960s,” explains Visual Effects Producer Kathy Siegel (Maze Runner: The Death Cure). “He and a graphic designer took it upon themselves to spec out the whole run of pits with the correct signage so our vendors didn’t have to think what would have been here, or maybe make a mistake with a font that wouldn’t have been available then.” One of the major difficulties was conveying the high speed of the race cars. “If you run a car through a forest, 50 miles per hour looks like 100,” notes second unit director Darrin Prescott (Baby

TOP: Matt Damon, James Mangold and Christian Bale discuss a scene in a pit stop constructed by production designer François Audouy. MIDDLE: Mangold knew that the onscreen chemistry between Matt Damon and Christian Bale would work as both actors have long admired each other. BOTTOM: Christian Bale and James Mangold have to work quickly together as noise restrictions limited the time allowed for nighttime shooting.

FALL 2019 VFXVOICE.COM • 67




FILM

“Unlike your average Marvel or Star Wars movie where there are a myriad of locations and worlds, there are not so many of those in Ford v Ferrari. It was more about creating one hyper-real world and allowing every crevice of it to feel real.” —James Mangold, Director, Ford v Ferrari

“[Visual Effects Supervisor] Olivier Dumont and his team were so great in assisting or adjusting our shots that we ended up getting them involved in a lot of them. Some of the shots were as simple as adjusting a clock to the right time. There were so many variables appearing in frame at any given moment that could deny the continuity, so there was a lot to deal with.” —James Mangold, Director, Ford v Ferrari

TOP LEFT: Bluescreen was utilized to allow for set extensions as well as crowd replication. TOP RIGHT: ARRI ALEXA LF cameras were hard-mounted onto the race cars to provide the driver’s point of view. MIDDLE: Pod cars that contained the actors were driven by stunt drivers during the race scenes. BOTTOM: It was important that the race car interiors be shot outdoors in order to capture the proper lighting conditions.

68 • VFXVOICE.COM FALL 2019

Driver). “If you run a car through a desert at 100 miles per hour, it looks like 50. The racetrack is a wide-open desert, so it gets hard to sell speed.” In order to compensate for the lack of foreground elements, low close-ups were taken of the vehicles with a wide lens as the camera dynamically moved among them. “James didn’t want to make this look like a car commercial. He wanted to keep within the style of the day. Back in the day they didn’t have Russian Arms and all of these great tools that we have now.” What has become the standard for speed on the big screen defies reality. “One of the areas visual effects have almost done a disservice is by always taking things to 10, which is the mantra in most visual effects-heavy popcorn movies,” states Mangold. “The speed in Ford v Ferrari isn’t at a ridiculous level. Part of it was finding ways to denote speed that are in the shot construction, and more about showing the road moving underneath the cars, the speed of the background out of the windows, vibrations, and that the camera is never in a place that it could never be on a car that was really moving that fast.” Precipitation was produced in CG. “We had to shoot with water on the road when we were supposed to have rain,” recalls Dumont. “I decided at one point to forget about the rain because we had to speed up all of the shots, and water was going to get in the way. We had to make sure that we had a good setup to add the rain for a good chunk of the Le Mans race.”
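Prescott’s forest-versus-desert observation follows from simple geometry: on screen, speed reads as angular velocity, and an object passing directly abeam of the camera at lateral distance d sweeps across the frame at roughly v/d. Close foreground therefore sells speed as effectively as actual velocity does. A back-of-envelope sketch (the distances below are illustrative assumptions, not production figures):

```python
import math

def apparent_angular_speed_deg(speed_mph, lateral_distance_m):
    """Angular velocity (degrees/s) of an object directly abeam of
    the camera: omega = v / d, converted from mph to m/s first."""
    speed_ms = speed_mph * 0.44704  # mph -> m/s
    return math.degrees(speed_ms / lateral_distance_m)

# Trees a few meters off a forest road sweep past the lens far
# faster than distant desert scrub at twice the speed.
forest = apparent_angular_speed_deg(50, 5.0)    # close foreground
desert = apparent_angular_speed_deg(100, 20.0)  # sparse, far away
print(f"forest @50 mph: {forest:.0f} deg/s, desert @100 mph: {desert:.0f} deg/s")
```

With these numbers the forest pass at 50 mph sweeps exactly twice as fast across the frame as the desert pass at 100 mph, which is why the wide-open track “gets hard to sell.”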

Re-speeding the imagery was not a simple process. “It’s a lot more complicated than just saying we’re speeding up the shot per se,” explains Mangold. “It’s very often about seeing what elements need to be sped up or what denotes speed, and lastly sound is a huge influence on all of this and how it carries the cuts.” The other big challenge was making Los Angeles and Georgia look like Le Mans. “The vegetation is different,” remarks Dumont. “Trees always feel yellow and warm when you shoot in Los Angeles. Whereas in Le Mans, they’re pine trees, so there’s a blue color that needs to be present there. I was always trying to get that going.” Throughout the imagery, modern elements such as cellphone towers had to be painted out. “For instance, when we shot the GT40 test drive in Los Angeles it was supposed to take place in Dearborn, Michigan, where there were trees all around at the time. We had to paint out a lot of the buildings.” CG extensions augmented the Ford factory set. “We have some full CG shots where we’re looking through a window at the factory. It’s a discussion between Henry Ford II (Tracy Letts) and Shelby. We also had the establishing shot of the Ford headquarters. Inside the factory, the foreground elements were always practical, but we extended the rest.” Numerous car interiors were captured, some of them on a soundstage with a shell of a car placed on an air cushion to get the necessary vibrations. “Even when we were shooting outside, which is always the best way to do it, we had to replace the background because we were shooting in Agua Dulce and were surrounded by hills,” remarks Dumont. “We had to have a Le Mans-style background outside. We had a specially designed car called the Frankenstein, which was just a shell with roll cages. We could put cameras anywhere on it in order to shoot the right angles. This was where the previs was useful.
James worked on a shot list for the second unit to shoot all of the car stunts.” An array of six ALEXA Mini cameras was placed on the vehicle. “On top of the car I installed an Insta360 Pro camera that could grab the full environment at once. I used that for HDRI and the LED panels.” A lot of the location car interiors were done on a biscuit rig (also referred to as a pod car), which is a drivable process trailer. “You’re basically out on location, doing all of your exterior drive-bys and have the actors in the hero car on this rig that drives like a race car,” remarks Prescott. “It’s unbelievable. I remember back on The Bourne Ultimatum we put Matt [Damon] inside this thing called the GO Mobile, and he called it the ‘no acting required vehicle’ because you feel like you’re actually racing around this canyon.”
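The re-speeding Mangold describes is, at its most basic, an index remap of the frame sequence; the craft lies in deciding which elements get remapped and by how much. A toy sketch of that baseline operation, with an in-memory frame list standing in for real footage (no production tool implied):

```python
def retime(frames, speed):
    """Naive constant-rate retime with nearest-frame sampling.
    speed=2.0 halves the clip's duration; speed=0.5 doubles it."""
    n_out = max(1, round(len(frames) / speed))
    return [frames[min(int(i * speed), len(frames) - 1)] for i in range(n_out)]

clip = list(range(24))    # one second of footage at 24 fps
fast = retime(clip, 2.0)  # plays back in half a second
print(fast)               # every second frame: 0, 2, ..., 22
```

A real retime blends or optical-flow-interpolates between frames rather than picking the nearest one, and, as Mangold notes, different elements of the shot may be treated at different rates.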

TOP: Pod cars were critical in being able to minimize the amount of soundstage work required for the race car interiors, as well as to get the necessary racing footage from the perspective of the drivers. MIDDLE: The pod cars allowed for Christian Bale to be catapulted around with the actual G-forces. BOTTOM: Ford v Ferrari marks the fifth time that cinematographer Phedon Papamichael has collaborated with Mangold. One of their biggest challenges with this film was conveying the sense of speed of the cars in wide-open spaces.

FALL 2019 VFXVOICE.COM • 69




FILM

“One of the areas visual effects have almost done a disservice is by always taking things to 10, which is the mantra in most visual effects-heavy popcorn movies. The speed in Ford v Ferrari isn’t at a ridiculous level. Part of it was finding ways to denote speed that are in the shot construction, and more about showing the road moving underneath the cars, the speed of the background out of the windows, vibrations, and that the camera is never in a place that it could never be on a car that was really moving that fast.” —James Mangold, Director, Ford v Ferrari

TOP: Thirty custom-made race cars were created for the production. MIDDLE: The signature Dunlop Bridge had to be recreated for the Le Mans race. BOTTOM: Wide lens close-ups were key in conveying the high speed of the race cars.

70 • VFXVOICE.COM FALL 2019

Vintage Panavision C and T series anamorphic lenses were customized to cover the large sensor of the ARRI ALEXA LF. “We needed numerous lenses because we had a lot of different mounts that were being run simultaneously on the car,” explains Papamichael. “It was important for us that we try to do as little as possible onstage and greenscreen, with the exception of some specific beats that are extreme close-ups on Christian during the final race. Everything is hard-mounted to the car seated on the pod car driven by a stunt driver. Christian is being catapulted around with the actual G-forces going into these curves with all of the interactive light. We were trying to go at the correct speeds where you feel the cheeks moving, and the vibration of the body and camera gets transmitted to the frame. You can’t fake that on the stage.” A big crash occurs at the start of the Le Mans race, which is shown from the point of view of the drivers. Recalls Prescott, “We had this drivable cannon that went 70 miles per hour, which drove with the shell of the car on it, and at a certain point launched the shell like 300 feet. The shell would then hit the ground in front of the Frankenstein.” “We were hoping to get the shell to bounce in-camera,” Prescott continues, “but it never did exactly what we wanted it to do. You start trying to figure out the physics and dynamics of these crashes. There’s just so much going on in milliseconds it’s hard to recreate them a lot of times. In that sequence, we took the best bits of the actual live-action crash, and then Olivier took over, really dialed that thing in and made it land where we wanted it to be.” Critical to completing the visual effects shots during the post-production period was having a quick editorial turnover. “James and our post-production supervisor, Aaron Downing (The Greatest Showman), were helpful with the turnovers,” notes Siegel.
“I had expected a director of James’ caliber and narrative quality to hold onto the shots longer, but he understood that getting shots out would make them better. He was receptive to the idea of sending out look development shots during the shoot and pulling shots with Olivier that would help the vendors to start to build their environments and allowed them to feel comfortable when the shots came in.” A good percentage of look development shots made it into Ford v Ferrari. “We talked with editor Michael McCusker (Walk the Line), Visual Effects Editor John Berri (Deadpool 2) and James about the shots that they loved,” Siegel recounts. “We were able to pull 25 to 30 shots early, which allowed us to show James tests and models while shooting was still taking place. John is skilled at After Effects, so he was helpful doing tests for previews, showing us lineups, giving us temp shot ideas, and translating Michael’s and James’ ideas to us.” Method Studios was the main vendor and looked after Le Mans. Rising Sun Pictures dealt with Daytona, and additional support was provided by The Yard VFX. “The show got a lot bigger,” states Siegel. “We initially bid 621 shots and were at 1,048 at delivery.” Even the gauges on the dashboards became full CG shots in order to have more control of the imagery. “We added a lot of shake and pull out to get the right frequency,” remarks Dumont. “It was mostly a continuity thing between shots in the cut.” A safety issue needed to be addressed with a digital solution. “The stunt drivers were wearing helmets way bigger than the ones that everybody was wearing in the 1960s. It wasn’t just face replacements,” adds Dumont. “We also had to change the helmets in a lot of shots.” Skid marks were added to show how fast the race cars were leaving the pit, “but we also had to remove them from the previous day.” The heaviest simulation was the rain. “You have to analyze the rain. How does that work visually? At a certain distance you see the drops, but after that it becomes misty. You have to look at that to know when you can switch to 2D elements.” Ford v Ferrari covers the full range of visual effects.
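Dumont’s rain analysis, drops readable near the camera and mist beyond, amounts to a distance-based level-of-detail switch between simulated 3D rain and cheaper 2D elements. A hypothetical sketch of such a switch (the threshold distance is an assumption for illustration, not a figure from the production):

```python
def rain_element(distance_m, mist_threshold_m=40.0):
    """Choose a rain representation by distance from camera:
    simulated 3D drops up close, a 2D misty element beyond the
    threshold. The 40 m cutoff is illustrative only."""
    return "3d_drops" if distance_m < mist_threshold_m else "2d_mist"

for d in (5.0, 25.0, 80.0):
    print(f"{d:>5} m -> {rain_element(d)}")
```

In practice the crossover would be tuned shot by shot, exactly the “you have to look at that” judgment Dumont describes.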
“The story is so good and is so well directed that you have to look at it as a whole experience,” believes Dumont. “I always felt the movie was a reverse Saving Private Ryan,” states Mangold. “It ends with a 40-minute race sequence, but the movie builds towards that as the audience follows this group of characters who are brought together when Ford decides to get into racing. It’s what’s missing in a lot of action films these days. We have all of this sound and fury, and these incredible images that visual effects and modern technology can produce, but very often I find the only thing that is holding me to the film is the sound and fury.” Concludes Mangold, “What I can’t wait for is that nexus between the emotional investment the audience has made in the characters and how much more the action means to you. I didn’t want my effects and sound teams showing off so much that it started to cross the boundary into the comic book aesthetic. I wanted this to feel much more influenced and of a [period] piece, like the original Le Mans film with Steve McQueen, or Grand Prix. We put all of our effort into making it feel that everything is 100% real.”

TOP TO BOTTOM: In order to portray the 24-hour time period of Le Mans, Mangold made sure to feature shots of day, night, sunrise, sunset and rain. Natural elements such as rain were added digitally. To complete one lap of Le Mans meant going through four or five locations, and yet the physical relationship of all of the cars had to remain constant for it to be a continuous race. Auto Club Speedway in Fontana, California, doubled as Daytona International Speedway. Elements that denote speed needed to be sped up with sound having a huge influence.

FALL 2019 VFXVOICE.COM • 71


FILM

“One of the areas visual effects have almost done a disservice is by always taking things to 10, which is the mantra in most visual effects-heavy popcorn movies. The speed in Ford v Ferrari isn’t at a ridiculous level. Part of it was finding ways to denote speed that are in the shot construction, and more about showing the road moving underneath the cars, the speed of the background out of the windows, vibrations, and that the camera is never in a place that it could never be on a car that was really moving that fast.” —James Mangold, Director, Ford v Ferrari

TOP: Thirty custom-made race cars were created for the production. MIDDLE: The signature Dunlop Bridge had to be recreated for the Le Mans race. BOTTOM: Wide lens close-ups were key in conveying the high speed of the race cars.

70 • VFXVOICE.COM FALL 2019

Vintage Panavision C and T series anamorphic lenses were customized to cover the large sensor of the ARRI ALEXA LF. “We needed numerous lenses because we had a lot of different mounts that were being run simultaneously on the car,” explains Papamichael. “It was important for us that we try to do as little as possible onstage and greenscreen, with the exception of some specific beats that are extreme close-ups on Christian during the final race. Everything is hard-mounted to the car, seated on the pod car driven by a stunt driver. Christian is being catapulted around with the actual G-forces going into these curves with all of the interactive light. We were trying to go at the correct speeds where you feel the cheeks moving, and the vibration of the body and camera gets transmitted to the frame. You can’t fake that on the stage.” A big crash occurs at the start of the Le Mans race, which is shown from the point of view of the drivers. Recalls Prescott, “We had this drivable cannon that went 70 miles per hour, which drove with the shell of the car on it, and at a certain point launched the shell like 300 feet. The shell would then hit the ground in front of the Frankenstein.” “We were hoping to get the shell to bounce in-camera,” Prescott continues, “but it never did exactly what we wanted it to do. You start trying to figure out the physics and dynamics of these crashes. There’s just so much going on in milliseconds it’s hard to recreate them a lot of times. In that sequence, we took the best bits of the actual live-action crash, and then Olivier took over, really dialed that thing in and made it land where we wanted it to be.” Critical to completing the visual effects shots during the post-production period was having a quick editorial turnover. “James and our post-production supervisor, Aaron Downing (The Greatest Showman), were helpful with the turnovers,” notes Siegel. 
“I had expected a director of James’ caliber and narrative quality to hold onto the shots longer, but he understood that getting shots out would make them better. He was receptive

to the idea of sending out look development shots during the shoot and pulling shots with Olivier that would help the vendors to start to build their environments and allowed them to feel comfortable when the shots came in.” A good percentage of look development shots made it into Ford v Ferrari. “We talked with editor Michael McCusker (Walk the Line), Visual Effects Editor John Berri (Deadpool 2) and James about the shots that they loved,” Siegel recounts. “We were able to pull 25 to 30 shots early, which allowed us to show James tests and models while shooting was still taking place. John is skilled at After Effects, so he was helpful doing tests for previews, showing us lineups, giving us temp shot ideas, and translating Michael’s and James’ ideas to us.” Method Studios was the main vendor and looked after Le Mans. Rising Sun Pictures dealt with Daytona, and additional support was provided by The Yard VFX. “The show got a lot bigger,” states Siegel. “We initially bid 621 shots and were at 1,048 at delivery.” Even the gauges on the dashboards became full CG shots in order to have more control of the imagery. “We added a lot of shake and pull out to get the right frequency,” remarks Dumont. “It was mostly a continuity thing between shots in the cut.” A safety issue needed to be addressed with a digital solution. “The stunt drivers were wearing helmets way bigger than the ones that everybody was wearing in the 1960s. It wasn’t just face replacements,” adds Dumont. “We also had to change the helmets in a lot of shots.” Skid marks were added to show how fast the race cars were leaving the pit, “but we also had to remove them from the previous day.” The heaviest simulation was the rain. “You have to analyze the rain. How does that work visually? At a certain distance you see the drops, but after that it becomes misty. You have to look at that to know when you can switch to 2D elements.” Ford v Ferrari covers the full range of visual effects. 
“The story is so good and is so well directed that you have to look at it as a whole experience,” believes Dumont. “I always felt the movie was a reverse Saving Private Ryan,” states Mangold. “It ends with a 40-minute race sequence, but the movie builds towards that as the audience follows this group of characters who are brought together when Ford decides to get into racing. It’s what’s missing in a lot of action films these days. We have all of this sound and fury, and these incredible images that visual effects and modern technology can produce, but very often I find the only thing that is holding me to the film is the sound and fury.” Mangold concludes, “What I can’t wait for is that nexus between the emotional investment the audience has made in the characters and how much more the action means to you. I didn’t want my effects and sound teams showing off so much that it started to cross the boundary into the comic book aesthetic. I wanted this to feel much more influenced and of a [period] piece, like the original Le Mans film with Steve McQueen, or Grand Prix. We put all of our effort into making it feel that everything is 100% real.”

TOP TO BOTTOM: In order to portray the 24-hour time period of Le Mans, Mangold made sure to feature shots of day, night, sunrise, sunset and rain. Natural elements such as rain were added digitally. To complete one lap of Le Mans meant going through four or five locations, and yet the physical relationship of all of the cars had to remain constant for it to be a continuous race. Auto Club Speedway in Fontana, California, doubled as Daytona International Speedway. Elements that denote speed needed to be sped up, with sound having a huge influence.

FALL 2019 VFXVOICE.COM • 71


VR/AR/MR TRENDS

“A house contains history – little stickers on the mirrors, heights measured on doorways, door knobs, a heavily trafficked hallway, and other traces of who lived there. You can see the life that was there. That’s the space. You feel that in VR versus on a screen… That home is as much of a character as the characters. I think that’s unique to VR.” —Jeff Gipson, Director, Cycles

CYCLES: EXPERIENCING A LIFE LIVED IN A BELOVED HOUSE By CHRIS McGOWAN

Images copyright © 2019 Walt Disney Animation Studios TOP LEFT: In the Cycles VR film, Bert and Rae (named for director Jeff Gipson’s grandparents) share a romantic moment early in their marriage. TOP RIGHT: In the Cycles VR film, Bert carries Rae over the threshold into their new home. BOTTOM: ‘The Gomez Effect,’ developed by Senior Software Engineer Jose Gomez, desaturates and darkens the image subtly away from the center of attention.

72 • VFXVOICE.COM FALL 2019

Jeff Gipson, who directed Walt Disney Animation Studios’ first-ever VR film, Cycles, was a lighting artist on animated works such as Frozen, Zootopia and Moana, and was about to work on Ralph Breaks the Internet when he decided to present his virtual-reality concept to the higher-ups at Disney. “We had a program where artists could pitch ideas, and so I pitched this idea as VR, kind of thinking, ‘Well, we’ve never done one before, so I’m just going to suggest it,’ and then when they greenlighted it, it was, ‘Oh crap, now we’ve got to figure out how to make it.’ We hadn’t done something like that in the studio.” The first-time director, who grew up in Colorado and now lives in Los Angeles, worked with a small team for four months on the three-minute movie and overcame technological and narrative challenges in the new medium. Cycles debuted in 2018 at SIGGRAPH and went on to garner three nominations at the 17th Annual Visual Effects Society Awards this year. The film is viewable on Oculus, VIVE or flat screen (for theater screenings), and has received a warm reception at various film festivals (it is not currently available for public download). “People have been saying, ‘It’s so cool to see Disney characters right here with me,’” says Gipson. Cycles is about memory, nostalgia and living spaces. It was inspired by Gipson’s grandparents, who lent their names to the two main characters, and is a poignant look at a life lived in a beloved house. A young couple moves into a new home, raises a family, happy and sad moments pass, and finally the house is empty once again. “A house contains history – little stickers on the mirrors, heights measured on doorways, doorknobs, a heavily trafficked hallway, and other traces of who lived there. You can see the life that was there,” says Gipson. The house in Cycles has a powerful presence when experienced from within, in virtual reality. “That’s the space. You feel that in VR versus on a screen,” says Gipson. 
“It’s so much about space. That home is as much of a character as the characters. I think that’s unique to VR. You’re rooted in a place. You feel things differently. The way you feel the light is different from how you’d feel the light on a screen.” The clever use of light was key to one of the main challenges in narrative VR – how to guide the viewer’s attention. “If you notice, if you look right at where the action is and then look away, the image

is a little bit desaturated and darkened. That’s what we call the Gomez Effect, after Jose Gomez, our Senior Software Engineer, who wrote [the code].” And great use is made of this, as well as of time lapses involving the characters. “Those little staggered poses help to guide the motion,” notes Gipson. “It’s using color and light and motion to help guide you around the space to the story,” he continues. “There’s no set way of how to tell a story in VR. On the screen we know good composition, we know how your eye follows the lead character, and how to express different moods based on composition and lighting, but doing that in VR is relatively new still. It’s pretty exciting.” In addition, “You can really take advantage of spatial sound. You hear it on your right and you look over to your right, or above you, wherever it might be.” Gipson was also careful to avoid causing any VR nausea in the viewer. “We tried to be gentle about how we moved the camera. I chose not to move the camera a lot because I get kind of seasick from it. I didn’t want to have action that’s so fast you’re looking all the way over to your left and have to do a whole 180 [degree turn] to see the action on your right. It’s more a gentle orchestration, a ballet of movement, rather than a fast whipping over in a lot of directions. There’s a couple of times where we move the camera, but you don’t really know it.” Cycles also had to pay attention to another comfort factor. “Another challenge is how to make it run at 90 frames a second, because if it lags, the headset doesn’t track and then you get seasick,” Gipson adds. “Making it run at real time was a big challenge for the crew. They were so passionate. The reason Cycles happened was everyone wanted to push it to the next level.” Some 50 people worked on Cycles to varying degrees. “They could have worked an hour, a day, a week, it was tight,” recalls Gipson. 
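The attention falloff Gipson describes – desaturating and darkening the image away from the point of focus – can be approximated outside a game engine. The following is a minimal, hypothetical NumPy sketch of the idea, not Disney's actual shader; the function name, falloff radii and strength values are assumptions for illustration:

```python
import numpy as np

def gomez_style_falloff(img, focus, inner=0.2, outer=0.7, strength=0.6):
    """Desaturate and darken an RGB image away from a focus point.

    img:    float array (H, W, 3), values in [0, 1]
    focus:  (x, y) focus point in normalized [0, 1] coordinates
    inner/outer: normalized radii where the falloff begins and saturates
    """
    h, w, _ = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Normalized distance of every pixel from the focus point
    dist = np.hypot(xs / w - focus[0], ys / h - focus[1])
    # 0 inside `inner`, 1 beyond `outer`, linear ramp in between
    t = np.clip((dist - inner) / (outer - inner), 0.0, 1.0)[..., None]
    # Luminance for the gray version (Rec. 709 weights)
    luma = img @ np.array([0.2126, 0.7152, 0.0722])
    gray = np.repeat(luma[..., None], 3, axis=-1)
    # Blend toward gray, then darken, both increasing with distance
    out = img * (1 - t * strength) + gray * (t * strength)
    return out * (1 - 0.4 * t)
```

Pixels inside the inner radius pass through untouched; pixels beyond the outer radius are fully blended toward gray and dimmed, which mirrors the "look right at the action and then look away" behavior described above.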
The core group was made up of Gipson; Gomez; Edward Robbins, Character Lead; Lauren Brown, Production Lead; Jorge E. Ruiz Cano, Animation Lead; Michael Anderson, Environment Lead; Jose Velasquez, Look Development; Dan Cooper, Visual Development; and Nicholas Russell, Production Lead. Unity was the real-time game engine that runs the film. Quill was employed for storyboarding. “We asked [ourselves],” continues Gipson, “‘How can we storyboard and get that into previs early?’ One of our animators,

TOP: A Quill storyboard used in the development of Cycles. MIDDLE: The protagonists Bert and Rae in Cycles. BOTTOM: Director Jeff Gipson. Cycles is his first VR film. He was a lighting artist on Frozen, Zootopia, Moana and Ralph Breaks the Internet.

FALL 2019 VFXVOICE.COM • 73




VR/AR/MR TRENDS

“It’s using color and light and motion to help guide you around the space to the story. There’s no set way of how to tell a story in VR. On the screen we know good composition, we know how your eye follows the lead character, and how to express different moods based on composition and lighting, but doing that in VR is relatively new still. It’s pretty exciting.” —Jeff Gipson, Director, Cycles “Another challenge is how to make it run at 90 frames a second, because if it lags, the headset doesn’t track and then you get seasick. Making it run at real time was a big challenge for the crew. They were so passionate. The reason Cycles happened was everyone wanted to push it to the next level.” —Jeff Gipson, Director, Cycles

TOP LEFT: The Cycles team, left to right: Jose Luis Gomez Diaz, Senior Software Engineer; Ed Robbins, Character Lead; Lauren Brown, Production Lead; Jeff Gipson, Director; Jorge E. Ruiz Cano, Animation Lead; Mike Anderson, Environment Lead. TOP RIGHT: Time passes, and Rachel has the difficult conversation with her mother, Rae (now older), that it is time to move into assisted living. BOTTOM LEFT: VisDev image of the house interior. BOTTOM RIGHT: Cycles director Jeff Gipson.

74 • VFXVOICE.COM FALL 2019

Daniel Peixe, is an amazing Quill storyboard artist. What’s great is they’re 3D models. So we were able to bring that in early on into Unity, and start timing those out and have our audio on top of it, so even if it’s a rough pose of holding the baby or a rough pose of them dancing, we got a feeling early on of how to do that. “We used pose tools [PoseVR], written in-house [by Gomez and others], and used our internal scout tool [VR Scout, written by Gomez] as well. So, we could be in the house and start thinking about how to proceed. A big part of VR is how to compose the frame when you can look everywhere. So that was a big part of the house design – trying to design the house in a way that composes each of the story moments, using the architecture and the straight lines and the horizontals. And that’s what’s cool about mid-century modern architecture, it kind of lends itself to that. And so we could get on early placing those Quill things in there and seeing how we could design the house to suit the action.” Prior to Cycles, Gomez had VR development experience, but most of the crew didn’t. Gipson elaborates, “There were a lot of firsts for the studio, which made it really challenging, which helped fuel the crew, pushing it into many new areas.” Another challenge was that the short VR film was “one big long shot mainly” as opposed to how regular films are, notes Gipson. “We’re so used to how we animate or edit things together. But with this it’s just running in real time in the engine. This was before Unity had a timeline. Our software engineer, Jose, had to write that capability into it.” The Cycles team also contributed an original graphic color

shader in Unity, which enhanced the living quality of the characters and environment. The Cycles team pushed for maximum animation quality. “We really wanted to push that appeal that Disney has in all its animated features and shorts,” comments Gipson. The end result was emotionally moving for the director. “I think initially when I first made it, I was emotional, because it was inspired by my grandmother, although the action doesn’t follow her story, but that feeling of seeing her house the last time I was there was a moment I’ll never forget. “I think the first time I saw the characters in VR was a special moment. Everybody else had seen it, all the crew, and then I waited until everybody left the room. And that was emotional, where I was in the room with them, and they come by you in that moment where they move into the house. I was in tears. It’s so weird because it’s like they’re here but they’re not. It gives me chills thinking about it now.” Concludes Gipson, “It’s been a fun challenge and really cool to see our executives excited about VR – a lot of them had never done VR before and they said, ‘Wow, there’s a potential for storytelling.’” In terms of Disney as a whole, he says it’s great to think of “bringing our characters into this new medium for audiences.” Gipson is working on another Disney VR film, which will debut this year.

TOP: Jeff Gipson reviewing the work. BOTTOM, LEFT TO RIGHT: Character Lead Ed Robbins with a VR headset. Senior Software Engineer Jose Gomez (creator of the ‘Gomez Effect’) with a VR headset. Animation Lead Jorge Ruiz with a VR headset. Director Jeff Gipson uses virtual production camera equipment to scout camera locations in the virtual set. This technique was used early on both to scout locations for action and to create the flat version of the film.

FALL 2019 VFXVOICE.COM • 75




FILM AND TV

CRISP FALL VFX FILMS AND TV By CHRIS McGOWAN

TOP LEFT: Angelina Jolie returns in Maleficent: Mistress of Evil (Image copyright © 2019 Walt Disney Studios) TOP RIGHT: Roman Griffin Davis, Taika Waititi and Scarlett Johansson in Jojo Rabbit (Photo credit: Kimberley French. Copyright © 2018 Twentieth Century Fox Film Corporation) BOTTOM: Gabriel Luna in Terminator: Dark Fate (Image copyright © 2018 Skydance Productions and Paramount Pictures)

This fall, VFX enliven a broad range of films and series, from much-anticipated sequels and spinoffs to original fantasy, sci-fi, suspense and historical fare. In terms of familiar territory, visual effects will heighten Gotham City’s reality in Joker, conjure up the magical realm of Maleficent: Mistress of Evil and power fantastic adventures in the latest Jumanji installment. Meanwhile, The Mandalorian, the first-ever live-action Star Wars series, and Star Wars: The Rise of Skywalker expand their joint universe. Terminator: Dark Fate continues that franchise’s history of groundbreaking VFX. And digital effects amp up the action in a Charlie’s Angels reboot and enhance the supernatural in two sequels, Zombieland: Double Tap and Doctor Sleep. The Addams Family, Frozen II, and the stop-motion A Shaun the Sheep Movie: Farmageddon build on previous titles and expand the animation palette, as does Spies in Disguise. VFX build a musical fantasy world in Cats, create a younger Will Smith in Gemini Man, heighten suspense in The Hunt and rev up Ford v Ferrari. Visual effects also take us back in time in the period films Jojo Rabbit, 1917, Harriet, The Current War, The Aeronauts and Midway. VFX producers, VFX and SFX supervisors and VFX houses are included when possible. This is not intended to be a complete listing. Release dates are subject to change.

Joker (Warner Bros.) Release date: U.K., October 4; U.S., October 4 Failure and humiliation drive stand-up comedian Arthur Fleck (Joaquin Phoenix) to re-invent himself as the psychopathic, homicidal Joker in an origin story for Batman’s future nemesis. Todd Phillips directs. Robert De Niro and Zazie Beetz are in the cast. Edwin Rivera is Visual Effects Supervisor and worked with Scanline VFX (Visual Effects Supervisor Mathew Giampa) and Shade VFX (Visual Effects Producer Molly Pabian).

The Current War (101 Studios U.S., Lantern Entertainment, U.K.) Release date: U.K., July 26; U.S., October 4 Alfonso Gomez-Rejon directed this film about the epic battle between Thomas Edison (Benedict Cumberbatch) and George Westinghouse (Michael Shannon) to power the U.S. with electricity. Caimin Bourne is Special Effects Supervisor, while Justin Ball is Visual Effects Supervisor, lighting up the era with BlueBolt (Visual Effects Supervisor Stuart Bullen), Nvizible (Visual Effects Supervisors Jason Evans and Jamie Wood), One of Us (Visual Effects Supervisor Oliver Cubbage) and Boundary Visual Effects.

76 • VFXVOICE.COM FALL 2019

Gemini Man (Paramount Pictures) Release date: U.K., October 11; U.S., October 4 David Benioff (Game of Thrones) co-scripted this Ang Lee-directed tale of an elite assassin (Will Smith) confronted by a younger clone of himself. Clive Owen, Mary Elizabeth Winstead and Benedict Wong are in the cast. Mark Hawker is Special Effects Supervisor. Weta Digital (Visual Effects Supervisor Sheldon Stopsack) is adding visual effects, aided by Scanline VFX (Visual Effects Supervisor Bryan Hirota) and Clear Angle Studios.

The Addams Family (United Artists Releasing/Universal Pictures) Release date: U.K., October 25; U.S., October 11 Ghoulish yet adorable, the Addams clan returns to the big screen in animated form, with voices supplied by Oscar Isaac (Gomez), Charlize Theron (Morticia), Bette Midler (Grandmama) and Nick Kroll (Uncle Fester), among others. Cinesite teamed with MGM to create the Addams universe.

Zombieland: Double Tap (Columbia Pictures) Release date: U.K., October 18; U.S., October 11 Four veterans of an undead invasion face off against evolved zombies and fellow survivors. Ruben Fleischer directs the horror-comedy, starring Woody Harrelson, Jesse Eisenberg, Zoey Deutch, Rosario Dawson, Emma Stone, Bill Murray, and Dan Aykroyd. J.D. Schwalm is Special Effects Supervisor and Christine Choi is Visual Effects Coordinator.

Maleficent: Mistress of Evil (Walt Disney Studios) Release date: U.K., October 18; U.S., October 18 Maleficent (Angelina Jolie) and Princess Aurora (Elle Fanning) face new threats to their magic realm. Hayley J. Williams is Special Effects Supervisor and Gary Brozenich is Visual Effects Supervisor, backed by MPC (Visual Effects Supervisors Ferran Domenech, Bryan Litson, Jessica Norman and Damien Stumpf), Mill Film (Visual Effects Supervisor Laurent Gillet), Clear Angle Studios, Trace VFX and Gentle Giant Studios.

A Shaun the Sheep Movie: Farmageddon (Lionsgate U.S., StudioCanal, U.K.) Release date: U.K., October 18; U.S., December 13 This Aardman Animations stop-motion tale concerns a sheep who must bring a lost alien home before a sinister organization catches up with her. Axis VFX added otherworldly visual effects.

TOP: Frozen II (Image copyright © 2019 Walt Disney Pictures) MIDDLE: Oscar Issac and John Boyega in Star Wars: The Rise of Skywalker (Image copyright © 2019 Lucasfilm and Walt Disney Studios) BOTTOM: Will Smith voices in Spies in Disguise (Image copyright © 2018 Twentieth Century Film Corporation)

FALL 2019 VFXVOICE.COM • 77


FILM AND TV

CRISP FALL VFX FILMS AND TV By CHRIS McGOWAN

TOP LEFT: Angelina Jolie returns in Maleficent: Mistress of Evil (Image copyright © 2019 Walt Disney Studios) TOP RIGHT: Roman Griffin Davis, Taika Waititi and Scarlett Johansson in Jojo Rabbit (Photo credit: Kimberley French. Copyright © 2018 Twentieth Century Fox Film Corporation) BOTTOM: Gabriel Luna in Terminator: Dark Fate (Image copyright © 2018 Skydance Productions and Paramount Pictures)

This fall, VFX enliven a broad range of films and series, from much-anticipated sequels and spinoffs to original fantasy, sci-fi, suspense and historical fare. In terms of familiar territory, visual effects will heighten Gotham City’s reality in Joker, conjure up the magical realm of Maleficent: Mistress of Evil and power fantastic adventures in the latest Jumanji installment. Meanwhile, The Mandalorian, the first-ever live-action Star Wars series, and Star Wars: The Rise of Skywalker expand their joint universe. Terminator: Dark Fate continues that franchise’s history of groundbreaking VFX. And digital effects amp up the action in a Charlie’s Angels reboot and enhance the supernatural in two sequels, Zombieland: Double Tap and Doctor Sleep. The Addams Family, Frozen II and the stop-motion A Shaun the Sheep Movie: Farmageddon build on previous titles and expand the animation palette, as does Spies in Disguise. VFX build a musical fantasy world in Cats, create a younger Will Smith in Gemini Man, heighten suspense in The Hunt and rev up Ford v Ferrari. Visual effects also take us back in time in the period films Jojo Rabbit, 1917, Harriet, The Current War, The Aeronauts and Midway.

VFX producers, VFX and SFX supervisors and VFX houses are included when possible. This is not intended to be a complete listing. Release dates are subject to change.

Joker (Warner Bros.) Release date: U.K., October 4; U.S., October 4 Failure and humiliation drive stand-up comedian Arthur Fleck (Joaquin Phoenix) to re-invent himself as the psychopathic, homicidal Joker in an origin story for Batman’s future nemesis. Todd Phillips directs. Robert De Niro and Zazie Beetz are in the cast. Edwin Rivera is Visual Effects Supervisor and worked with Scanline VFX (Visual Effects Supervisor Mathew Giampa) and Shade VFX (Visual Effects Producer Molly Pabian).

The Current War (101 Studios U.S., Lantern Entertainment, U.K.) Release date: U.K., July 26; U.S., October 4 Alfonso Gomez-Rejon directed this film about the epic battle between Thomas Edison (Benedict Cumberbatch) and George Westinghouse (Michael Shannon) to power the U.S. with electricity. Caimin Bourne is Special Effects Supervisor, while Justin Ball is Visual Effects Supervisor, lighting up the era with BlueBolt (Visual Effects Supervisor Stuart Bullen), Nvizible (Visual Effects Supervisors Jason Evans and Jamie Wood), One of Us (Visual Effects Supervisor Oliver Cubbage) and Boundary Visual Effects.

76 • VFXVOICE.COM FALL 2019

Gemini Man (Paramount Pictures) Release date: U.K., October 11; U.S., October 4 David Benioff (Game of Thrones) co-scripted this Ang Lee-directed tale of an elite assassin (Will Smith) confronted by a younger clone of himself. Clive Owen, Mary Elizabeth Winstead and Benedict Wong are in the cast. Mark Hawker is Special Effects Supervisor. Weta Digital (Visual Effects Supervisor Sheldon Stopsack) is adding visual effects, aided by Scanline VFX (Visual Effects Supervisor Bryan Hirota) and Clear Angle Studios.

The Addams Family (United Artists Releasing/Universal Pictures) Release date: U.K., October 25; U.S., October 11 Ghoulish yet adorable, the Addams clan returns to the big screen in animated form, with voices supplied by Oscar Isaac (Gomez), Charlize Theron (Morticia), Bette Midler (Grandmama) and Nick Kroll (Uncle Fester), among others. Cinesite teamed with MGM to create the Addams universe.

Zombieland: Double Tap (Columbia Pictures) Release date: U.K., October 18; U.S., October 11 Four veterans of an undead invasion face off against evolved zombies and fellow survivors. Ruben Fleischer directs the horror-comedy, starring Woody Harrelson, Jesse Eisenberg, Zoey Deutch, Rosario Dawson, Emma Stone, Bill Murray and Dan Aykroyd. J.D. Schwalm is Special Effects Supervisor and Christine Choi is Visual Effects Coordinator.

Maleficent: Mistress of Evil (Walt Disney Studios) Release date: U.K., October 18; U.S., October 18 Maleficent (Angelina Jolie) and Princess Aurora (Elle Fanning) face new threats to their magic realm. Hayley J. Williams is Special Effects Supervisor and Gary Brozenich is Visual Effects Supervisor, backed by MPC (Visual Effects Supervisors Ferran Domenech, Bryan Litson, Jessica Norman and Damien Stumpf), Mill Film (Visual Effects Supervisor Laurent Gillet), Clear Angle Studios, Trace VFX and Gentle Giant Studios.

A Shaun the Sheep Movie: Farmageddon (Lionsgate U.S., StudioCanal, U.K.)
Release date: U.K., October 18; U.S., December 13 This Aardman Animations stop-motion tale concerns a sheep who must bring a lost alien home before a sinister organization catches up with her. Axis VFX added otherworldly visual effects.

TOP: Frozen II (Image copyright © 2019 Walt Disney Pictures) MIDDLE: Oscar Isaac and John Boyega in Star Wars: The Rise of Skywalker (Image copyright © 2019 Lucasfilm and Walt Disney Studios) BOTTOM: Will Smith voices the lead spy in Spies in Disguise (Image copyright © 2018 Twentieth Century Fox Film Corporation)

FALL 2019 VFXVOICE.COM • 77



The Hunt (Universal Pictures) Release date: U.K., September 27; U.S., September 27 Director Craig Zobel and writers Damon Lindelof and Nick Cuse all worked on The Leftovers (Lindelof was co-creator), and have now turned their talents to this action thriller, with Matt Kutcher as Special Effects Supervisor and John Gibson as Visual Effects Supervisor.

Jojo Rabbit (Fox Searchlight Pictures) Release date: U.K., January 3; U.S., October 18 Taika Waititi (Thor: Ragnarok) directs a dark satire set in Nazi Germany about a young boy who discovers his single mother (Scarlett Johansson) is hiding a young girl in the attic; his imaginary friend, Adolf Hitler, helps him cope with it. Sam Rockwell plays a Nazi captain/Hitler Youth instructor. Jason Chen is Visual Effects Supervisor and Luma Pictures crafts the period VFX.

Terminator: Dark Fate (Paramount Pictures) Release date: U.K., October 23; U.S., November 1 Sarah Connor (Linda Hamilton) and the T-800 (Arnold Schwarzenegger) are back to deal with Skynet. Tim Miller (Deadpool) directs and Neil Corbould is Special Effects Supervisor. ILM (Visual Effects Supervisors Eric Barba, Alex Wang, Jeff White and Georg Kaltenbrunner), Scanline VFX (Visual Effects Supervisor Arek Komorowski), Method Studios (Visual Effects Supervisor Glenn Melenhorst), Digital Domain (Digital Effects Supervisor Greg Teegarden) and Rebellion Visual Effects (Visual Effects Supervisor Jake Maymudes) supply the VFX.

Doctor Sleep (Warner Bros.) Release date: U.K., October 31; U.S., November 8 A sequel to The Shining, Doctor Sleep is based on a Stephen King novel about a now-grownup Danny Torrance (Ewan McGregor), whose struggle with alcoholism reawakens his psychic powers. Ken Gorrell is Special Effects Coordinator. Marc Kolbe is Visual Effects Supervisor, linking with Peerless (Visual Effects Supervisor Marc Hutchings), Secret Lab (Visual Effects Supervisor Olcun Tan), Method Studios and Shade VFX.

TOP LEFT: Joaquin Phoenix in Joker (Image copyright © 2019 Warner Bros. Pictures) TOP RIGHT: The Mandalorian (Image copyright © 2019 Lucasfilm and Walt Disney Studios) BOTTOM: Eddie Redmayne and Felicity Jones in The Aeronauts (Image copyright © 2019 Amazon Studios)

78 • VFXVOICE.COM FALL 2019

Harriet (Focus Features) Release date: U.K., TBD; U.S., November 1 Kasi Lemmons helms the biographical film about Harriet Tubman (Cynthia Erivo), who escaped from slavery and helped others do the same through the Underground Railroad. Gary Pilkinton is Special Effects Coordinator. Brainstorm Digital (Visual Effects Supervisor Eran Dinur) and Powerhouse VFX add historical authenticity.

The Aeronauts (Amazon Studios) Release date: U.K., November 8; U.S., November 1 A wealthy young widow (Felicity Jones) and an ambitious scientist (Eddie Redmayne) attempt to fly a hot-air balloon higher than anyone in history. Michael Dawson is Special Effects Supervisor. Framestore (Visual Effects Supervisor Romain Arnoux) and Alchemy 24 (Visual Effects Supervisor Jean-Francois Ferland) provide VFX that take us to the edge of the atmosphere.

Midway (Lionsgate) Release date: U.K., November 8; U.S., November 8 Roland Emmerich directs the story of U.S. sailors and aviators in the historic Battle of Midway. Woody Harrelson (as Admiral Nimitz), Luke Evans, Mandy Moore, Nick Jonas, Aaron Eckhart and Dennis Quaid lead the cast. Eric Rylander is the Special Effects Coordinator. Peter G. Travers is Visual Effects Supervisor, bringing World War II to life with Pixomondo (Visual Effects Supervisor Derek Spears) and Scanline VFX (Visual Effects Supervisor Laurent Taillefer).

The Mandalorian (Disney+ series) Release date: U.K., TBD; U.S., November 12 Set in the Star Wars universe, the Jon Favreau-created series concerns the adventures of a lone masked gunfighter (Pedro Pascal) in the far reaches of the galaxy. Giancarlo Esposito, Gina Carano, Carl Weathers, Werner Herzog and Nick Nolte are part of the eclectic cast. Lindsay MacGowan, Shane Mahan, John Rosengrant and Alan Scott are Supervisors for Legacy Effects. Visual Effects Supervisors Richard Bluff and Jason Porter craft the VFX with ILM, Gentle Giant Studios (Visual Effects Supervisor Yoshi DeHerrera) and Image Engine.

Charlie’s Angels (Sony Pictures) Release date: U.K., November 29; U.S., November 15 Kristen Stewart, Naomi Scott and Ella Balinska are the three leading angels in a global spy operation. Uli Nefzer is Special Effects Supervisor. Karen Heston (Sony Visual Effects Supervisor) teams with Gentle Giant Studios (Visual Effects Supervisor Yoshi DeHerrera) and Scanline VFX.

TOP: Benedict Cumberbatch in The Current War (Image copyright © 2018 101 Studios and Lantern Entertainment) MIDDLE: A Shaun the Sheep Movie: Farmageddon (Image copyright © 2019 Lionsgate and StudioCanal) BOTTOM: Christian Bale in Ford v Ferrari (Image copyright © 2019 Twentieth Century Fox Film Corporation)

Ford v Ferrari (Twentieth Century Fox Film Corporation) Release date: U.K., November 15; U.S., November 15 A team of American engineers seeks to build a new Ford car (the GT40) to dethrone perennial champion Ferrari at Le Mans in 1966. Christian Bale portrays British driver Ken Miles and Matt Damon is automotive designer Carroll Shelby. Mark R. Byers and Charles-Axel Vollard are Special Effects Supervisors. Olivier Dumont is the Visual Effects Supervisor, guiding Method Studios (Visual Effects Supervisor Dave Morley), The Yard VFX (Visual Effects Supervisor Laurens Ehrmann) and Rising Sun Pictures (Visual Effects Supervisor Malte Sarnes).

Frozen II (Walt Disney Pictures) Release date: U.K., November 22; U.S., November 22 Walt Disney Animation Studios has created this sequel to the hit 2013 computer-animated musical fantasy, continuing the adventures of sisters Anna (Kristen Bell’s voice) and Elsa (Idina Menzel).

FALL 2019 VFXVOICE.COM • 79





Jumanji: The Next Level (Columbia/Sony Pictures) Release date: U.K., TBD; U.S., December 13 Jake Kasdan directs this sequel, which stars Dwayne Johnson, Jack Black, Kevin Hart, Karen Gillan, Nick Jonas, Danny DeVito and Danny Glover. J.D. Schwalm is Special Effects Supervisor. Mark Breakspear is the Visual Effects Supervisor, leading Sony Imageworks, and Weta Digital also contributes VFX.

Star Wars: The Rise of Skywalker (Walt Disney Studios) Release date: U.K., December 19; U.S., December 20 Rey’s journey continues as the Resistance faces the First Order in the final Skywalker chapter. J.J. Abrams directs Daisy Ridley, Adam Driver, John Boyega, Oscar Isaac, Richard E. Grant, Mark Hamill and Billy Dee Williams. Dominic Tuohy is Special Effects Supervisor and Creature FX helps with the practical effects. Roger Guyett is Visual Effects Supervisor, and ILM (Visual Effects Supervisor Daniele Bigi) taps into the VFX Force, with assistance from Gentle Giant Studios.

Cats (Universal Pictures) Release date: U.K., December 20; U.S., December 20 Tom Hooper directs a film adaptation of the Andrew Lloyd Webber musical. The cast includes Jennifer Hudson, Taylor Swift, James Corden, Idris Elba, Ian McKellen and Judi Dench. Paul Dimmer is Special Effects Supervisor. Harpreet Bahia is Senior Visual Effects Supervisor and MPC, Mill Film and Clear Angle Studios turn fantasy into reality.

Spies in Disguise (20th Century Fox) Release date: U.K., December 26; U.S., December 25 In this computer-animated film from Fox Animation and Blue Sky Studios, the world’s coolest spy (Will Smith) gets turned into a pigeon and must rely on a nerdy scientist (Tom Holland) to help him save the world.

1917 (Universal Pictures) Release date: U.K., January 10; U.S., December 25 The Sam Mendes film follows two young soldiers on a single day in World War I. George MacKay, Dean-Charles Chapman, Benedict Cumberbatch, Colin Firth and Mark Strong are in the cast. Dominic Tuohy is Special Effects Supervisor, and Clear Angle Studios works on the period VFX.

TOP: Woody Harrelson in Zombieland: Double Tap (Image copyright © 2019 Columbia/Sony Pictures) MIDDLE: Jumanji: The Next Level (Image copyright © 2019 Columbia/Sony Pictures) BOTTOM: Goliath (Image copyright © 2019 Amazon Studios)

80 • VFXVOICE.COM FALL 2019

Goliath (Amazon Studios) Release date: U.K., October TBA; U.S., October TBA In the season 3 opener, Billy Bob Thornton stars as a disgraced lawyer who battles ranchers in the Central Valley over water control in drought-stricken California. Dennis Quaid and Amy Brenneman co-star. Visual Effects Producers are Celine Zoleta (Pixomondo) and Aaron Greenberg (Amazon Studios). Overall VFX Supervisor is Michael Shelton, working with VFX Supervisor Bojan Zoric (Pixomondo). Pixomondo is the primary effects house.



TV

THE PSYCHEDELIC VFX JOURNEY OF AMERICAN GODS By TREVOR HOGG

Images courtesy of Starz, Fremantle, Amazon Studios and MR. X

TOP: Obscuring vegetation was painted out of the drone plate photography to provide a clear view of the House on the Rock. OPPOSITE TOP: Trees were modeled, textured and instanced to match the surrounding real-world vegetation. OPPOSITE MIDDLE: A greyscale model is produced utilizing LiDAR scans and photogrammetry taken from the actual carousel at the House on the Rock. OPPOSITE BOTTOM: The final shot of the exploding carousel which concludes with Mr. Wednesday (Ian McShane) flying towards the camera.

82 • VFXVOICE.COM FALL 2019

Originally published as a novel in 2001 by Neil Gaiman, American Gods has been adapted for the small screen by Starz and Amazon Prime Video. The fantasy tale, which has been renewed for a third season, chronicles the relationship between ex-convict Shadow Moon (Ricky Whittle) and his employer Mr. Wednesday (Ian McShane) as they travel through America visiting various business associates who may or may not be deities disguised in human form. A signature sequence for American Gods occurs during the opening episode of Season 2, when Shadow Moon and Bilquis (Yetide Badaki) embark on a psychedelic journey into the mind of Mr. Wednesday. MR. X handled the visual effects for the House on the Rock, the exploding carousel, the Asgardian beach and the exterior of the Hall of the Gods, while Tendril was responsible for the interior of the Viking longhouse and the deity effects.

A pristine exterior for the House on the Rock, the iconic Wisconsin tourist attraction, needed to be created because, at the time of the drone photography, it was covered by trees. “Using the existing structures and crazy architecture as reference, modelers and lookdev artists took to creating the new layout for the asset,” states MR. X Visual Effects Supervisor Chris MacLean. “Trees were modeled, textured and instanced to match the surrounding real-world vegetation. The assets were then lit and rendered in Redshift. To maintain realism, the comps used portions of the drone footage so our 3D assets matched the real-world photography.”

Every inch of the Carousel Room was LiDAR scanned and photographed. “The asset teams went to work building the pieces of the Carousel Room that would eventually make it into the explosion,” explains MacLean. “The assets were then handed off to the animators who hero-animated Carousel animals, Valkyries and

debris. The VFX artists were given a different set of assets, which were then fractured and RBD simulated. The giant fireball was generated in Houdini using Pyro. The compositor for the shot then started the daunting task of putting everything together.”

Plates of the riders were cleaned up, keyed and given a time echo or tracer effect to add an extra element of psychedelia. “With the CG explosion elements as the backdrop, we then added elements and a nebula background for our heroes to fly through,” explains MacLean. “This is just the beginning, though, as we fly into Wednesday’s eye, which carries us on a slit-scan defocused prism ride on the Bifrost to the Valhalla beach. This was all done with elements generated in Nuke by the compositor. With the nebula idea, we took extra care to make sure that every frame of this transition was interesting and fun.

“When we reach the beaches of Valhalla we are greeted by a confused Shadow Moon and smartly-dressed Bilquis,” continues MacLean. “The interactive portion of the beach was practically shot, but CG mountains based on the Giant’s Causeway, whale bones, Viking ship and the Aurora Borealis were all built in 3D and added to the sequence. For the epic and super-wide establishing shot before we enter the Hall of the Gods, we added a fully CG building exterior and our CG Ravens.”

Entering the Hall of the Gods, the viewer is taken backstage, where the gods are shown in their true form. “We witness a conclave of ancient celestial beings, including Mama-Ji (Sakina Jaffrey), Anansi (Orlando Jones), Bilquis, Czernobog (Peter Stormare) and others, each fully manifested in god form,” states Tendril founder and director Chris Bahry. “It is here that Mr. Wednesday/Odin urges the old gods to take action [against the new gods] before it’s too late.” Executive producer and episode director Christopher Byrne

FALL 2019 VFXVOICE.COM • 83
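MacLean mentions rider plates being “given a time echo or tracer effect.” A minimal sketch of that idea — assuming nothing about MR. X’s actual Nuke setup beyond decayed frame blending — is that each output frame mixes in fading copies of the frames before it:

```python
import numpy as np

def time_echo(frames, decay=0.6, taps=4):
    """Blend each frame with up to `taps` earlier frames, weighted
    decay**1, decay**2, ..., then renormalize so brightness is kept.
    `frames` is a list of float image arrays; returns a new list."""
    out = []
    for i, frame in enumerate(frames):
        acc = frame.astype(np.float64).copy()
        total = 1.0
        for k in range(1, taps + 1):
            if i - k < 0:
                break  # not enough history yet
            w = decay ** k
            acc += w * frames[i - k]
            total += w
        out.append(acc / total)
    return out
```

With decay near 1 the trails linger for a stronger psychedelic smear; near 0 the effect disappears.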


TV

THE PSYCHEDELIC VFX JOURNEY OF AMERICAN GODS By TREVOR HOGG

Images courtesy of Starz, Fremantle, Amazon Studios and MR. X

TOP: Obscuring vegetation was painted out of the drone plate photography to provide a clear view of the House on the Rock. OPPOSITE TOP: Trees were modeled, textured and instanced to match the surrounding real-world vegetation. OPPOSITE MIDDLE: A greyscale model is produced utilizing LiDAR scans and photogrammetry taken from the actual carousel at the House on the Rock. OPPOSITE BOTTOM: The final shot of the exploding carousel which concludes with Mr. Wednesday (Ian McShane) flying towards the camera.

82 • VFXVOICE.COM FALL 2019

Originally published as a novel in 2001 by Neil Gaiman, American Gods has been adapted for the small screen by Starz and Amazon Prime Video. The fantasy tale, which has been renewed for a third season, chronicles the relationship between ex-convict Shadow Moon (Ricky Whittle) and his employer Mr. Wednesday (Ian McShane) as they travel through America visiting various business associates who may or may not be deities disguised in a human form. A signature sequence for American Gods occurs during the opening episode of Season 2 when Shadow Moon and Bilquis (Yetide Badaki) embark on a psychedelic journey into the mind of Mr. Wednesday. MR. X handled the visual effects for House on the Rock, the exploding carousel, Asgardian beach and the exterior of the Hall of the Gods, while Tendril was responsible for the interior of the Viking longhouse and deity effects. A pristine exterior for the House on the Rock, the iconic Wisconsin tourist attraction, needed to be created because, at the time of the drone photography, it was covered by trees. “Using the existing structures and crazy architecture as reference, modelers and lookdev artists took to creating the new layout for the asset,” states MR. X Visual Effects Supervisor Chris MacLean. “Trees were modeled, textured and instanced to match the surrounding real-world vegetation. The assets were then lit and rendered in Redshift. To maintain realism, the comps used portions of the drone footage so our 3D assets matched the real-world photography.” Every inch of the Carousel Room was LiDAR scanned and photographed. “The asset teams went to work building the pieces of the Carousel room that would eventually make it into the explosion,” explains MacLean. “The assets were then handed off to the animators who hero-animated Carousel animals, Valkyries and

debris. The VFX artists were given a different set of assets which were then fractured and RBD simulated. The giant fireball was generated in Houdini using Pyro. The compositor for the shot then started the daunting task of putting everything together.” Plates of the riders were cleaned up, keyed and given a time echo or tracer effect to add an extra element of psychedelia. “With the CG explosion elements as the backdrop, we then added elements and a nebula background for our heroes to fly through,” explains MacLean. “This is just the beginning, though, as we fly into Wednesday’s eye which carries us on a slit scan defocused prism ride on the Bifrost to the Valhalla beach. This was all done with elements generated in Nuke by the compositor. With the nebula idea, we took extra care to make sure that every frame of this transition was interesting and fun. “When we reach the beaches of Valhalla we are greeted by a confused Shadow Moon and smartly-dressed Bilquis,” continues MacLean. “The interactive portion of the beach was practically shot, but CG mountains based on the Giant’s Causeway, whale bones, Viking ship and the Aurora Borealis were all built in 3D and added to the sequence. For the epic and super-wide establishing shot before we enter the Hall of the Gods, we added a fully CG building exterior and our CG Ravens.” Entering the Hall of the Gods, the viewer is taken backstage where the gods are shown in their true form. “We witness a conclave of ancient celestial beings, including Mama-Ji (Sakina Jaffrey), Anansi (Orlando Jones), Bilquis, Czernobog (Peter Stormare) and others, each fully manifested in god form,” states Tendril founder and director Chris Bahry. “It is here that Mr. Wednesday/Odin urges the old gods to take action [against the new gods] before it’s too late.” Executive producer and episode director Christopher Byrne

FALL 2019 VFXVOICE.COM • 83



God Mode – The Technique Tendril Founder and Director Chris Bahry goes through a step-by-step guide for what became known as the “God mode.” In essence the effect is a form of Rotomation, but with a few added tricks. The effect will be one that we can dial up and down in intensity over the underlying footage. 1. First, we track and roto the plates to isolate our characters from the background. 2. Then we project and warp the characters to exaggerate their scale and make them feel massive.

TOP LEFT: A wide plate shot of Cherry Beach in Toronto that features Shadow Moon (Ricky Whittle) and Bilquis (Yetide Badaki). MIDDLE: A greyscale model showcasing the digital augmentation of the mountains and Viking ship. BOTTOM: The CG mountains are based on the Giant’s Causeway and overshadow the massive whale bones. TOP RIGHT: The live-action plate was shot at Cherry Beach in Toronto of Shadow Moon and Bilquis. The sky was digitally replaced with the Aurora Borealis.

wanted the sequence to have an artful quality but be grounded in optical effects, like an acid trip filmed through the lens of a real camera. “Collectively, we all wanted the sequence to evoke the unsettling, hallucinogenic feel of the original material,” notes Bahry. “We needed to create a thick, almost liquid atmosphere that the various god manifestations could phase in and out of. Liveaction plates of the actors for this sequence were filmed in a partial interior set that would be extended later with a CG interior.” Concept art was developed for each of the gods by utilizing a selection of shots taken from the rough cut of the sequence. “In parallel with the concept art, we began R&D on what we eventually dubbed the ‘God Wind’ effect,” explains Bahry. “To achieve the effect, each shot went through a preparatory stage that included 2D roto, 3D matchmove, 3D object tracking and 3D rotomation using rigged proxy geometry of each of the characters. We used this geometry as our input for a custom-built tool in Houdini to advect particles along magnetic-field vectors generated on the surfaces. These particles were rendered to look like aurora borealis plasma, and the motion of the particles was used to distort and smear the pixels in the underlying footage. The eyes were created with a secondary system that included fluid simulation. “Several of the characters additionally required fully CG elements and props,” Bahry explains, “for example, Mamaji’s

“We wanted to pack each of the characters with Easter eggs and little touches that we hope fans will pick up on and love.” —Chris Bahry, Founder and Director, Tendril multiple arms and weapons, Zorya’s [Cloris Leachman] heads, Anansi’s spider legs, and Bilquis’ crown, to name a few. Other characters included both CG and extensive compositing enhancements. Czernobog, for example is surrounded by a cloud of cancerous smoke, and ethereal wisps of blood trail from his hammer. These were combined with blood stains that spread like ink-blotches on his suit and various tattoos tracked to his face. We wanted to pack each of the characters with Easter eggs and little touches that we hope fans will pick up on and love.” The final look was crafted inside of Nuke. “We separated each scene into layers, applying varying degrees of liquid distortion, embers and optical effects to each,” remarks Bahry. “The compositing team built each effect around a custom gizmo, which allowed for consistency and sharing of the ‘recipe’ for each character look among several compositors. The studio was also tasked with creating a CG miniature map that would help set a geographic context for the journey from Kentucky to Wisconsin. The idea here was to make the map look like something you might find at a roadside attraction or covered in dust in a museum basement. We started by matchmoving both the crane shot and an aerial drone element, aligned them to our CG miniature map, and created a third master camera that interpolated and stitched to the live-action elements.”

God Mode – The Technique

Tendril Founder and Director Chris Bahry gives a step-by-step guide to what became known as “God Mode.” In essence the effect is a form of rotomation, but with a few added tricks, and one that can be dialed up and down in intensity over the underlying footage.
1. First, we track and roto the plates to isolate our characters from the background.
2. Then we project and warp the characters to exaggerate their scale and make them feel massive.
3. We object track and rotomate select 3D elements to key characters in order to have added control over their god forms (i.e., Mama-ji’s many arms).
4. We apply our look and optical flow/vector blur to the frames. This gives us the stylized colours and the ‘phasing’ effect on the background.
5. Additional layers of practical fluids and particles blended with the above complete the effect and add power and movement – an LSD god aura.

TOP LEFT: Anansi (Orlando Jones) stands before the carousel situated at the House on the Rock. MIDDLE: Shadow Moon and Bilquis on the beaches of Valhalla. BOTTOM: The final composite of Ame-No Uzume (Uni Park), the Japanese goddess of dawn, mirth and revelry, with the fiery aura of the God Wind as well as the God Mane, a denser layer of particles.

84 • VFXVOICE.COM FALL 2019

FALL 2019 VFXVOICE.COM • 85


TV

TOP LEFT: A wide plate shot of Cherry Beach in Toronto featuring Shadow Moon (Ricky Whittle) and Bilquis (Yetide Badaki). MIDDLE: A greyscale model showcasing the digital augmentation of the mountains and Viking ship. BOTTOM: The CG mountains are based on the Giant’s Causeway and overshadow the massive whale bones. TOP RIGHT: The live-action plate of Shadow Moon and Bilquis was shot at Cherry Beach in Toronto; the sky was digitally replaced with the Aurora Borealis.
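Step 4 of the technique – smearing the underlying footage along a motion field to get the ‘phasing’ look – can be illustrated with a toy NumPy sketch. The names, the swirling field and the nearest-neighbor lookup are all simplifying assumptions, not the actual Nuke gizmo or Houdini output.

```python
import numpy as np

def vector_smear(image, motion, strength=4.0):
    """Displace each pixel along a per-pixel motion vector – a toy stand-in
    for the optical-flow/vector-blur distortion. `motion` is an (H, W, 2)
    field of (dy, dx) offsets in pixels."""
    h, w = image.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    sy = np.clip((yy + strength * motion[..., 0]).round().astype(int), 0, h - 1)
    sx = np.clip((xx + strength * motion[..., 1]).round().astype(int), 0, w - 1)
    return image[sy, sx]          # sample the source at the displaced positions

# Toy "footage" (a grayscale ramp) and a swirling motion field.
h, w = 64, 64
img = np.linspace(0.0, 1.0, h * w).reshape(h, w)
yy, xx = np.mgrid[0:h, 0:w]
cy, cx = (yy - h / 2), (xx - w / 2)
motion = np.dstack([-cx, cy]) / max(h, w)   # simple rotational field
smeared = vector_smear(img, motion)
```

A production version would accumulate many sub-pixel samples along each vector (a true vector blur) rather than a single displaced tap, but the idea – pixels dragged along the particle motion – is the same.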


DNEG TV RECREATES THE UNIMAGINABLE FOR HBO’S CHERNOBYL By KEVIN H. MARTIN

Occurring just three months after the Challenger disaster, the 1986 nuclear plant accident at Chernobyl in Ukraine generated both political and radiation fallout. This real-life near-China Syndrome event contributed to the destabilization of the Soviet Union while raising renewed concerns over atomic safety. HBO’s Chernobyl series puts very human faces on the disaster, and to aid in realistically recreating the horrific event, DNEG TV provided visual effects. VFX Supervisor Max Dennison’s experience on big-budget features dates back to Magic Camera prior to its acquisition by Mill Film. Later, while at Weta Digital, he was lead matte painter for The Lord of the Rings trilogy, then worked for ILM on Revenge of the Sith. In recent years, he has supervised visual effects for a dozen TV series, mostly while at DNEG TV. “When we got started in February 2018, the scope of the work wasn’t yet fully fleshed out,” Dennison recalls. “We began investigating some bigger shots, like the evacuation of Pripyat, and how we’d do a lot of helicopter shots [ultimately all were handled via CG], even prior to the start of shooting. The actual site wasn’t available, having changed drastically over the years, but we knew from the get-go that everything had to feel utterly authentic.” Dennison traveled to Lithuania, where he examined the set builds and met with director Johan Renck and production designer

All images copyright © 2019 HBO. TOP AND BOTTOM: Location plates were modified by DNEG to closely replicate the Chernobyl accident site. All the helicopters seen in the film were created and rendered digitally.

86 • VFXVOICE.COM FALL 2019

Luke Hull. “Johan wanted to be able to put our work up next to reference photographs and not tell the difference,” Dennison recalls. “Johan hates impossible camera moves, so all of our camera positions had to be what I called ‘legal’ – a place where you could put a set of legs down or a handheld camera could have been operated. Luke did an amazing job building interiors, and his biggest set was an exterior rubble pile outside the exploded reactor four, which was shot in Martinez.” Production also shot interiors and exteriors at Lithuania’s partly decommissioned Ignalina Nuclear Power Plant, an RBMK reactor and sister station to Chernobyl. “It was our job to reconstruct the rest of the reactor four environment,” says Dennison, “topping it up to show the whole plant as it was directly after the first explosion. That meant a huge 3D build, plus matte painting and effects work. And we had to marry up Ignalina with the Martinez rubble pile.” Both locations were lidared, and Ignalina was also surveyed via drone, helping DNEG create a very accurate blend. “We also got hold of a 3D model of Ignalina from a Russian postproduction facility, which had previously built a full-scale environmental model of Chernobyl,” states Dennison. “That gave us a fairly undetailed model of the other three reactors to build on while blending in our two locations. There were a lot of moving parts and a lot of heavy lifting to get all these elements working together. We had a huge amount of reference, but that still meant generating an enormous number of textures for these buildings, plus there was a fair amount of interpretation required on the black-and-white [photographic] reference from the accident.” Close collaboration with other departments was essential in maintaining visual continuity while respecting historical accuracy. “[Cinematographer] Jakob Ihre used a huge skylight casting an orange glow over everything with the firefighters at Martinez, along with the practical smoke. 
We were lucky in getting some fantastic SFX guys from Germany [Rauch Special Effects UG] who provided a lot of the initial fire placements and smoke cannons, which gave us a real sense of what it must have been like. We referenced all of that for our big wide shot as the firefighters arrive, with the camera [tilting] up about 60 degrees, taking things well beyond our live-action set component, so we had to create a huge amount of additional CG smoke.” DNEG employed Maya, their standard go-to, with Isotropix’s Clarisse for rendering. “We built everything with that toolset,” Dennison reveals, “using a lot of Houdini effects work to scatter debris, allowing it to fall naturally, creating the sense of mayhem and disarray after the explosion, with all the graphite and fuel rods and superstructure and concrete everywhere.” Dennison found Chernobyl’s script to be brilliant. “It conveyed a need to get under the skin of what it was like to be there,” he acknowledges. “We felt challenged to live up to all that the other departments were investing in this with respect to visual credibility and good storytelling. Johan has an incredibly good eye and was always after us to pull back and restrain our natural tendency to ‘over-indulge’ things, and though he never lingers on an effects shot, there’s always enough time to convey the story point.”
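The debris pass Dennison describes – scattering pieces and letting them fall naturally into an uneven spread – can be sketched with a toy simulation. This is purely illustrative of the idea, under assumed parameters; it is not DNEG's actual Houdini setup.

```python
import numpy as np

def scatter_and_settle(n, extent=10.0, drop_height=5.0, dt=0.05, seed=3):
    """Scatter debris pieces above a ground plane and let them fall under
    gravity with a damped bounce, so they come to rest in a naturally
    uneven distribution rather than a uniform carpet."""
    rng = np.random.default_rng(seed)
    pos = np.column_stack([
        rng.uniform(-extent, extent, n),          # x
        rng.uniform(1.0, drop_height, n),         # y (height above ground)
        rng.uniform(-extent, extent, n),          # z
    ])
    vel = rng.normal(0.0, 1.0, (n, 3))            # randomized blast velocities
    g = np.array([0.0, -9.8, 0.0])
    for _ in range(400):                          # ~20 seconds of sim time
        vel += g * dt
        pos += vel * dt
        below = pos[:, 1] < 0.0                   # pieces that hit the ground
        pos[below, 1] = 0.0
        vel[below] *= np.array([0.6, -0.3, 0.6])  # friction + damped bounce
    return pos

debris = scatter_and_settle(1000)
```

Each resting point would then drive an instanced piece of graphite, fuel rod or concrete; the randomized initial velocities are what give the "mayhem and disarray" rather than a neat grid.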

“[Cinematographer] Jakob Ihre used a huge skylight casting an orange glow over everything with the firefighters at Martinez, along with the practical smoke. We were lucky in getting some fantastic SFX guys from Germany [Rauch Special Effects UG] who provided a lot of the initial fire placements and smoke cannons, which gave us a real sense of what it must have been like.” —Max Dennison, VFX Supervisor

TOP AND BOTTOM: The rubble pile built by production designer Luke Hull was filmed at Martinez. Production shot other exteriors at a Chernobyl sister-station, the partly-decommissioned Ignalina Nuclear Power Plant. DNEG seamed these separate locales together while adding CG architecture to complete the structure, along with the requisite amount of fire and smoke.

FALL 2019 VFXVOICE.COM • 87




VFX VAULT

“The [Vietnam] war was still on. It was a hot topic. Nobody wanted to make Apocalypse Now.” —Walter Murch, Sound Designer and Sound Editor

UPRIVER WITH APOCALYPSE NOW – 40 YEARS LATER By TREVOR HOGG

Images courtesy of American Zoetrope, Lionsgate, Chas Gerretsen and John Frazier. TOP: Marlon Brando talks with Francis Ford Coppola during the shooting of Apocalypse Now.

88 • VFXVOICE.COM FALL 2019

Four decades after its original theatrical release, Apocalypse Now remains the gold standard for executing massive practical explosions. Originally set to be helmed by George Lucas before his success with Star Wars, the story of U.S. Army Captain Benjamin Willard (Martin Sheen), who embarks on a river trip to assassinate renegade Special Forces Colonel Walter Kurtz (Marlon Brando) during the Vietnam War, was supposed to have a small-scale newsreel quality rather than epic surreal imagery. “George Lucas wanted it to look like what you saw on television each evening – 16mm black and white,” recalls Academy Award-winning Film Editor and Sound Designer Walter Murch (The English Patient), who was a University of Southern California film school classmate of Lucas’. The project was shelved because of bad timing. “The [Vietnam] war was still on,” Murch explains. “It was a hot topic. Nobody wanted to make Apocalypse Now.” A year after the Vietnam War ended in 1975, Francis Ford Coppola (The Godfather) revived the project and headed to the Philippines for a planned 14-week shoot. “The damn thing took two years to make!” laughs Academy Award-winning Special Effects Supervisor John Frazier (Spider-Man 2). “I was doing small shows and had rented equipment from Joe Lombardi’s shop, which was the go-to place for advice or help. At that time, it wasn’t a big-budget picture, but Joe needed some help and I said, ‘I’ll go

over there for a while and give you a hand.’ We only had five U.S. guys over there and the rest were Filipino.” Contending with the natural elements in the Philippines was a major concern, explains Frazier. “Everything around there could kill you, and a lot of it ended up being comical. I remember shooting the scene when Lance B. Johnson (played by Sam Bottoms) is in some kind of trance on the bow of the patrol boat doing his little Ifugao moves. We would put everything that we were going to do that day in a dugout canoe and then go upriver. Francis wanted all of this in different colored smoke as the patrol boat is going upriver. You would get your duty station and always have a piece of plywood with you because the monkeys up on the top of the cliff would throw mangos at you and laugh.” “We were making stuff right in the jungle with what we had,” notes Frazier. “We made smoke every day. What color do you want? If Francis wanted white, we would burn palm tree fronds. If Francis wanted black, then we would burn rubber tires.” Destroyed helicopters needed to be built, as well as an authentic PBR (Patrol Boat, Riverine), which was used by the U.S. Navy from 1966 to 1971 during the Vietnam War. “No help was provided from the U.S. government because they viewed the movie as being antiwar, so we rented a PBR from Thailand in order to make all of the boats.” The water level in the river kept going down, which made shooting difficult. “We had to keep cutting the boat off, so by the time we got the scene done it was floating on air tubes!” Frazier recalls. He continues, “The other comical part was the mango and tiger scene. We all volunteered to go up to make smoke in the jungle. Francis blocked the scene with Martin Sheen and Frederic Forrest a couple of times without the tiger. Then the boys said, ‘Now I want to see where the tiger is actually going to go. Let’s make sure that the chain is right.’ The tiger comes out and the chain ends 10 feet before a piece of string. 
They say, ‘Okay, that’s cool.’ While Martin and Frederic go back down to the boat, Francis moves the tiger down 10 feet and makes the chain 10 feet longer. Now we’re rolling. When Martin and Frederic get near to the piece of string Francis let the tiger go, and it came within inches of them. That’s when the two of them took off running. When Frederic Forrest says, ‘Never get out of the boat,’ – that wasn’t scripted. He meant it. Oh my god! And we laughed!” The production design by Dean Tavoularis (Rising Sun) was sometimes too convincing. “We would party up at the Pagsanjan Hotel, and there was a group of French people who had gone up to the world-famous rapids,” remarks Frazier. “They thought the temple set [which was part of the Kurtz compound] was real. I said, ‘We’re going to blow that up in a couple of days.’ They

TOP: No stunt double was used for Robert Duvall, who actually flew in the helicopters that take part in the Valkyrie attack. MIDDLE: Francis Ford Coppola out on location in the Philippines shooting Apocalypse Now. BOTTOM: Francis Ford Coppola directs Ifugao cast members who portray the tribe of natives that fall under the spell of Colonel Walter Kurtz (Marlon Brando).

FALL 2019 VFXVOICE.COM • 89





“We did everything in-camera. At that time there were only a few special effects people, like Joe Lombardi, who could have pulled that [napalm explosion] off and had the stamina to keep up with Francis. These guys set the bar. I was a young guy and they were like mentors to me. I learned a lot about life.” —John Frazier, Special Effects Supervisor

TOP LEFT: A stand-in dummy is used for a cut sequence in which Lieutenant Richard Colby (Scott Glenn) is about to execute the American photojournalist portrayed by Dennis Hopper. TOP RIGHT: Francis Ford Coppola captured 236 hours of footage while shooting Apocalypse Now. MIDDLE: Brando used a tape recorder to assist him with his acting cues. BOTTOM: Explosions had to be meticulously timed, as was the case when Lieutenant Colonel Bill Kilgore (Robert Duvall) gives his famous ‘I love the smell of napalm in the morning’ speech.

90 • VFXVOICE.COM FALL 2019

replied, ‘What are you talking about?’ I said, ‘It’s a movie set.’ They said, ‘You can’t blow that up. It’s like thousands of years old!’ The production shut down for a week while people came in to make sure it was a movie set.” Then there was the matter of the military helicopters supplied by the Philippine Army. “The government was fighting the Muslims down in Mindanao, so at any time the army could say, ‘Come on down.’ When they would go in to shoot up the camps, most of the time blanks were in the guns. When the helicopters came back, we had to go through all of the ammo to make sure there were no live rounds in the guns. It was like a scene out of Tropic Thunder where the guys are firing blanks and the bad guys are going, ‘They’re not hitting anything!’” “We did an explosion every night, seven days a week,” remarks Frazier. “We got to go in early and prep, and then blow it that night.” Nowadays, the famous napalm explosion would seem ordinary. “We did bigger shots than that in Transformers. But back then it was unique. We used a six-inch PVC pipe that was about 200 feet long, filled it up with gasoline, tied explosives underneath it, and set it off that way. That was the first time it had been done. I’ve since used it on several movies. It’s the way you do it if you want a napalm run. Nobody has come up with anything better.” The fire was kept under control for the most part. “You could never put that napalm fire out if you had to. It’s too much. You let it burn out.” “We never did two takes of anything,” states Frazier. “You were right on or it didn’t make the movie.” Sometimes

miscommunication took place. “When we did one explosion it blew me 25 feet. I was following another guy on his explosions and he went early, which meant I was in the wrong place at the wrong time.” Other practical effects included the decapitated head of Jay ‘Chef’ Hicks (Frederic Forrest) and the thrown spear that impales George ‘Chief’ Phillips (Albert Hall). “All you saw was the thud when it went into his chest,” explains Frazier. “Albert put on a chest plate of some balsa wood, put his shirt back on, and a guy stabbed him off camera with a spear.” Thousands of arrows were made with rubber tips. “We would put them into these mortars and blast them off [at the patrol boat]. All of that stuff was in-camera. For the firefight, we were shooting flares at that boat. We were timing it so we didn’t hit anybody. The river rose 19 feet in a couple of hours, and we were stuck in the temple for days. We found a couple cases of beer floating by, and we drank beer for two days until the water went down!” A not-so-funny event occurred when the biggest typhoon to hit the Philippines since 1932 wrecked the sets situated at Iba, causing the production to shut down for six weeks. “Everybody who worked on Apocalypse Now got there at least a week early,” remarks Scott Glenn (The Silence of the Lambs), who at the time was a former U.S. Marine and struggling actor. “I was going to do a scene as a soldier with a handheld grenade launcher at the Do Lung Bridge. The production office was in a small tourist hotel on the beach that had originally been a Japanese bunker during World War II. Francis, [cinematographer] Vittorio Storaro [The Last Emperor], and most of the cast and crew went back to Manila for the weekend. I decided to stay there along with Martin Sheen, his wife Janet and Doug Claybourne, who was a PA. Late Friday night we got hit by Typhoon Didang. It turned the isthmus that we were living on into an island. 
Doug is also a Marine, so even though we were low men on the totem pole the two of us knew how to deal with those kinds of situations. We took over the place and set up

TOP: An iconic moment when Captain Benjamin Willard (Martin Sheen) leaves the PBR and makes his way through the compound to kill Colonel Walter Kurtz (Marlon Brando). MIDDLE: Captain Benjamin Willard smokes a cigarette while travelling up river in the PBR (patrol boat) on his way to assassinate Colonel Walter Kurtz. BOTTOM: From left to right: Marlon Brando stand-in Peter Cooper, John Frazier, Francis Ford Coppola, and helicopter pilot and military advisor Dick White.

FALL 2019 VFXVOICE.COM • 91


VFX VAULT

“We did everything in-camera. At that time there were only a few special effects people, like Joe Lombardi, who could have pulled that [napalm explosion] off and had the stamina to keep up with Francis. These guys set the bar. I was a young guy and they were like mentors to me. I learned a lot about life.” —John Frazier, Special Effects Supervisor

TOP LEFT: A stand-in dummy is used for a cut sequence when Lieutenant Richard Colby (Scott Glenn) is about to execute the American photojournalist portrayed by Dennis Hopper. TOP RIGHT: Francis Ford Coppola captured 236 hours of footage while shooting Apocalypse Now. MIDDLE: Brando used a tape recorder to assist him with his acting cues. BOTTOM: Explosions had to be meticulously timed as was the case when Lieutenant Colonel Bill Kilgore (Robert Duvall) gives his famous ‘I love the smell of Napalm in the morning’ speech.

90 • VFXVOICE.COM FALL 2019


“There was a group of French people who had gone up to the world-famous rapids. They thought the temple set [which was part of the Kurtz compound] was real. I said, ‘We’re going to blow that up in a couple of days.’ They replied, ‘What are you talking about?’ I said, ‘It’s a movie set.’ They said, ‘You can’t blow that up. It’s like thousands of years old!’” —John Frazier, Special Effects Supervisor

TOP: Scott Glenn is seated in the foreground while Dennis Hopper stands at the top of the stairs at the Kurtz compound, which was mistaken by tourists as being an ancient temple. Glenn gave up the role of Roach in order to have the opportunity to be around and learn from the great American actor Marlon Brando.

92 • VFXVOICE.COM FALL 2019

the bathrooms. The people in Manila weren’t even sure we were alive at that point.” Another dangerous situation arose after the bulk of the storm was over. “The helicopter pilot didn’t know if he could fly Francis back to Manila,” recounts Glenn. “The rain was coming down so hard and it was so windy that they were afraid that water would get mixed in with the jet fuel when refueling the helicopter. I’ve been around that more times than I would like to remember. I said, ‘Don’t worry, I’ll fill it up.’ Later, Francis took me aside and said, ‘I want to reward you for your behavior. I can write you a better part anywhere in this movie.’ I thought about it and said, ‘I want to be in the end of the film.’ He said, ‘Scott, that’s the one part of the film where I can’t bring in a new character. You could play the part of Lieutenant Richard Colby [who was sent upriver ahead of Willard to kill Kurtz], but you’d be like a glorified extra.’ I said, ‘That’s what I want to do.’ I got to be around Dennis Hopper and the greatest American actor who ever lived, Marlon Brando.” “When the typhoon disaster hit Apocalypse Now in the summer of 1976, Francis came back to San Francisco for six weeks and showed me some of the dailies that had been shot up to that point,” remarks Murch. “I wondered, ‘How is this all going to come together?’ Then we talked about the story and the script. He asked, ‘Do you have any suggestions?’ I said, ‘First of all, it’s a little weird that Willard is taken upriver in a boat to do this thing. They could have airdropped him 10 miles below Kurtz [Marlon Brando] and he would have gotten up there.’ The boat is a device to allow us to see the war, which you wouldn’t if it was a helicopter. They stop occasionally to look for mangoes or at Playboy Bunnies, but nothing really happens. They’re tourists. I told him, ‘What it needs is a scene where the patrol boat does something it’s supposed to be doing.
They’re supposed to stop other boats and inspect them.’ Francis said, ‘Okay, then write it.’ I sat down in his office for a week and wrote the puppy sampan scene. It’s like a mini Mỹ Lai Massacre where something bad happens accidentally.” “The beginning of Apocalypse Now was too flat-footed for Francis and he got excited by two accidental things – the slow-motion shots of the napalm exploding with the helicopters and Marty Sheen freaking out in his hotel room, which was shot as an acting exercise,” reveals Murch, who co-won an Oscar for Best Sound Mixing and received a co-nomination for Best Film Editing for Apocalypse Now. “Francis had shot some additional material, like the famous shot of the fan and some upside-down shots of Marty under more controlled circumstances. Francis also said to me, ‘Ransack the film for other images.’ I pulled the Cambodian head and the big close-up of Marty’s eye with flames flickering on it. Those all come from the end of the film.” It was predetermined that the opening sequence would feature “The End” by the Doors. “There was a period where we were going to have nothing but songs by Jim Morrison,” Murch says. “The idea collapsed quickly, because no matter where you were in the story and what Jim Morrison song you used, it was like Jim was looking at the movie and describing what he was looking at. It was too on the nose.” The napalm explosion remains one of the largest practical explosions ever ignited for a movie. “Today, we would not do that,” admits Murch. “We might shoot some seed explosions to begin the process and then digital would take over from that. That’s the origin of the material that you see at the beginning of the film. The slow-motion is a sixth camera that [Special Effects Coordinator] A.D. Flowers wanted in order to get a record of the explosion. Everything is real. Robert Duvall is really in a helicopter flying through the air.” Working on Apocalypse Now for Frazier was special effects 101. “We did everything in-camera.

“[The opening sequence featured “The End” by the Doors.] There was a period where we were going to have nothing but songs by Jim Morrison. The idea collapsed quickly, because no matter where you were in the story and what Jim Morrison song you used, it was like Jim was looking at the movie and describing what he was looking at. It was too on the nose.” —Walter Murch, Sound Designer and Sound Engineer
At that time there were only a few special effects people, like Joe Lombardi, who could have pulled that off and had the stamina to keep up with Francis. These guys set the bar. I was a young guy and they were like mentors to me. I learned a lot about life.” The special effects have not become dated. “For some of the stuff, like the napalm run, nobody has come up with anything better than putting the gasoline in a PVC pipe,” adds Frazier. “That’s what makes those movies so great – there’s no CGI in them. It’s real. I remember one time sitting down in the water when we were doing the firefight. I nodded off, woke up, and there’s a cobra looking at me! Apocalypse Now had a big impact on me. Not so much in techniques, but in how you make movies.”

TOP: Francis Ford Coppola and crew take a break shooting at the Kurtz compound. MIDDLE: Walter Murch developed the 5.1 surround sound mix for Apocalypse Now that went on to become an industry standard. BOTTOM: Brando shares a tender moment with an Ifugao child.

FALL 2019 VFXVOICE.COM • 93




[ VES SECTION SPOTLIGHT: WASHINGTON ]

A Broad Vision of VFX Rises in the Pacific Northwest By NAOMI GOLDMAN

ABOVE: Members and college students participate in VES Washington’s signature Creative Spotlight Series. BOTTOM: VES Washington members and guests celebrate at the 2018 Holiday Party.

94 • VFXVOICE.COM FALL 2019

Now more than two decades strong, the Visual Effects Society is flourishing thanks to its dynamic network of Sections, which reflect and contribute to their communities’ distinctive visual effects landscapes. Section members lend their expertise and entrepreneurship to benefit visual effects practitioners in their region while advancing the Society and industry worldwide. Founded in 2016, the vibrant Washington State Section boasts almost 100 members who reflect the unique makeup of the area’s visual effects community. Washington is one of the country’s gaming and tech epicenters, and its local visual effects work is fueled by a roster of leading video game design and visual computing companies including Microsoft Studios, NVIDIA, Valve, Sucker Punch, Blizzard, Bungie and PopCap Games. The locale of those businesses aligns with the membership, which is primarily drawn from Bellevue, Kirkland and Redmond, as well as the Seattle hub. In highlighting the membership, Washington Section Chair Neil Lim Sang proudly proclaims, “Things are different here. At least one quarter of our members are engineers – professionals who don’t immediately come to mind when describing the VES’s membership. But they absolutely should be part of our holistic vision.” Continues Lim Sang on the Section leadership’s impassioned vision, “Before the creation of our Section, so many professionals would not have been exposed to something like the VES or known about it, because our community was so separated among enclaves of artists, technologists, producers and creatives. But now there is a center point that bridges the separation and binds us together. “We want to expand the idea of the ‘typical’ VES member with a nontraditional point of view. Without the brilliant technical engineers designing the programs and software, we simply would not have visual effects. Their roles in the industry are essential and we stand for diversity in our membership.
Everyone deserves a shot and a collegial place to belong.” When it comes to serving the membership, the Section’s signature program is the Creative Spotlight Series – hybrid

educational, career development and networking events. Each event brings together industry practitioners and thought leaders to share their insights and career highlights with fellow VES members and area college students, offering a dynamic introduction to working professionals and the broad field of filmed entertainment. Held at venues such as Bellevue College and the Academy of Interactive Entertainment, the series has featured a diverse lineup of speakers to date, including an architectural video designer, CG supervisor/technical director, VFX supervisor, screenwriter, director, producer, art director, production manager and motion-capture studio executives. “Through our Creative Spotlight series, we are really working to forge a sense of community that links practitioners and VES members from all across our regional industry. Opening up these events to students in film, visual effects, animation and creative arts programs offers a great opportunity to connect with the next generation of professionals and hopefully inspire them to pursue their dreams,” said Lim Sang. The Section held its first Summer BBQ last year and hosts an ongoing series of film screenings, panels and pub nights as social and professional gatherings to meet prospective members. Moving forward, the Board of Managers is exploring new programmatic options such as family-friendly events and opportunities to highlight local jobs and support recruitment and career development. The Washington Section clearly embodies the Society’s commitment to engaging and representing the full breadth of artists and innovators working across the visual effects industry. In summary, Lim Sang reinforces the mantra of the Section. “Opening up to the diversity of deeply talented colleagues all around us can be a game changer. We all benefit from an expanded worldview and the chance to create really interesting and often unexpected relationships. Here at the VES Washington Section, everyone is welcome.”


[ VES NEWS ]

VES Awards Qualification Changes, VES Celebrates SIGGRAPH By CHRIS MCKITTRICK

18TH ANNUAL VES AWARDS Each year, the VES Awards Committee refines the Rules and Procedures in response to the remarkable creative achievements and technological advancements happening in the VFX industry. For the 18th Annual VES Awards, there have been significant changes to the qualifications in the “Outstanding Virtual Cinematography in a CG Project” and the “Outstanding Compositing in a Feature” categories. Animated projects will now be able to compete in these two categories. As in prior years, the fifth entry slot in six of the eight general categories is reserved for a project’s Special Effects Coordinator or Supervisor. VES will hold its worldwide Nomination Event on January 4, 2020. As they vote on the nominees, VES members are invited to view the most amazing visual imagery of 2019, including exclusive behind-the-scenes footage, while also participating in the single best networking opportunity of the year. Finally, the nominees will be celebrated and the winners announced at the 18th Annual VES Awards Show on Wednesday, January 29, 2020 at the Beverly Hilton Hotel in Beverly Hills, California. VES CELEBRATES SIGGRAPH 2019 The Visual Effects Society, in partnership with its Los Angeles Section, held the VES SIGGRAPH Party on July 30 in downtown Los Angeles. VES members from around the globe and their guests took some time away from the busy conference to meet fellow members who make up the best and brightest in the visual effects field. The hugely popular event, which was free to VES members, gave SIGGRAPH attendees the opportunity to enjoy one of the many benefits that VES offers its global membership. VES was also highlighted during SIGGRAPH week at Walt Disney Animation Studios’ Women in Tech networking breakfast, along with Women in Animation, Women in Tech/Hollywood and the Association for Computing Machinery’s Council on Women in Computing (ACM-W).
Camille Eden, VES member and Manager, Recruitment/Outreach at Disney Animation, organized the event, which drew a capacity crowd. VES Board of Directors Executive Committee member Rita Cahill addressed the packed house, providing an overview of our Society and the important role women play in our industry.

TOP AND MIDDLE: VES members and guests at the 2019 VES SIGGRAPH Party in Los Angeles. BOTTOM: Camille Eden, VES member and organizer of Disney Animation’s Women in Tech breakfast with VES Executive Committee member Rita Cahill.

FALL 2019 VFXVOICE.COM • 95


[ FINAL FRAME ]

A Day at the Beach with The Addams Family

Is Pugsley burying a cadaver? Is Lurch mixing up a poisonous cocktail? These are some of the questions a viewer might ask while gazing at this 14-foot-by-4-foot mural, which originally hung in a Westhampton, New York hotel called Dune Deck. The year was 1952. Today it hangs in a library at Penn State University after the hotel changed hands and a Penn State alumnus donated it. A little later the image appeared in The New Yorker magazine. The Addams Family can truly be called a multimedia family. The clan was the brainchild of American cartoonist Charles Addams, who would often say his weird creations were inspired by his hometown of Westfield, New Jersey, with its many old cemeteries and numerous Victorian mansions. Since he created the macabre family, the characters have been the subject of multiple cartoons, comics, live-action and animated movies, books, TV series (live-action and animated), TV specials, direct-to-video releases, musicals, video games and a pinball machine. The latest iteration of the clan is a major new animated film version in Fall 2019 (see cover story). The iconic TV series debuted in 1964, featuring John Astin as Gomez, Carolyn Jones as Morticia and Jackie Coogan as Uncle Fester; the special effects back then were provided by Larry Chapman, Joe Zomar, Robert Cole and Bob Overbeck, according to IMDb. As a matter of fact, the disembodied hand, Thing, was played in the TV show’s opening titles by Richard Edlund, VES, a member of the Society’s Board, VES Fellow and Lifetime Achievement Award honoree. This mural, entitled “An Addams Family Holiday,” shows the familiar characters (left to right) Pugsley, Wednesday, Gomez, Aristotle the Octopus, Fester, Morticia, Lurch and Grandmama Addams. The family is oblivious to the ‘normal’ beachgoers who are fleeing in panic.

Image copyright © 1952 The Addams Family™ Tee and Charles Addams Foundation. All rights reserved.

96 • VFXVOICE.COM FALL 2019



