VFX Voice Winter 2021


VFXVOICE.COM WINTER 2021

SOUL POWER

THE FUTURE OF FILM • GREYHOUND • OVER THE MOON • HIS DARK MATERIALS • PROJECT POWER REMOTELY • FILM SCHOOLS • PROFILES: SARA BENNETT & ZOE CRANLEY



[ EXECUTIVE NOTE ]

Welcome to the Winter issue of VFX Voice! As we enter 2021, we wish you a New Year filled with good health and happiness. VFX Voice appreciates being part of this strong, vibrant and resilient global community, and we continue to value the opportunity to keep us all tethered through stories that celebrate hard-working, visionary visual effects talent across the world.

In this issue, our cover story goes inside the making of Soul, Pixar’s animated take on the meaning of life. We sit down with Oscar-winning VFX Supervisor Sara Bennett, and DNEG’s Head of CG Vancouver, Zoe Cranley. We go deep on the industry pivot amidst these unprecedented times, looking at VFX trends in virtual production and indie virtual production, and the future of film itself. Catch our TV/streaming profiles on Season 2 of His Dark Materials, animated musical Over the Moon and superhero film Project Power, and our big-screen profile on the naval VFX in Greyhound. We bring you an inside look at how film schools have shifted to online education, new compositing packages in Tech & Tools, breakthrough advances in VR software and immersive experiences, and tips from the pros behind the acclaimed VES Handbook of Visual Effects. And we continue to shine a light on our regional sections, with a worldwide roundup of news and events keeping members engaged, educated and entertained.

It’s all here and more. VFX Voice is proud to be the definitive authority on all things VFX. We continue to bring you exclusive stories between issues that are only available online at VFXVoice.com. You can also get VFX Voice and VES updates by following us on Twitter at @VFXSociety.

Mike Chambers, Chair, VES Board of Directors

Eric Roth, VES Executive Director



[ CONTENTS ]

FEATURES

8 FILM: THE FUTURE OF FILM
Returning to the old norms, adjusting to the new.

14 TV/STREAMING: PROJECT POWER
How Framestore completed the superhero film remotely.

20 TV/STREAMING: HIS DARK MATERIALS
New characters, creatures, environments propel Season 2.

28 PROFILE: SARA BENNETT
Oscar-winning VFX supervisor aims to excel and inspire.

34 COVER: SOUL
Pixar animates uncharted territory in The Great Before.

42 VFX TRENDS: VIRTUAL PRODUCTION
Industry pros see immediate benefits in trying virtual.

50 VFX TRENDS: INDIE VIRTUAL PRODUCTION
Real-time offers new opportunities for indie filmmakers.

56 FILM: GREYHOUND
Experienced hands at deep ocean VFX guide the ship.

62 PROFILE: ZOE CRANLEY
The global journey of DNEG’s Head of CG Vancouver.

68 TV/STREAMING: OVER THE MOON
Thinking, feeling characters spark Netflix animated musical.

74 FILM SCHOOLS: LEARNING NEW WAYS
Zoom and remote tools help ease shift to online education.

80 TECH & TOOLS: COMPOSITING
Compositing the same shot in three different packages.

86 VR/AR/MR TRENDS: VR SOFTWARE
Breakthrough titles, inviting new experiences lure new users.

DEPARTMENTS

2 EXECUTIVE NOTE
90 THE VES HANDBOOK
92 VES SECTION: GLOBAL ROUNDUP
94 VES NEWS
96 FINAL FRAME: FILM SCHOOLS

ON THE COVER: Jazz musician Joe Gardner (Jamie Foxx) is in “the zone” in Soul. (Image courtesy of Pixar/Disney)



WINTER 2021 • VOL. 5, NO. 1

WINNER OF THREE FOLIO AWARDS FOR PUBLISHING EXCELLENCE.

VFXVOICE

Visit us online at vfxvoice.com

PUBLISHER
Jim McCullaugh
publisher@vfxvoice.com

EDITOR
Ed Ochs
editor@vfxvoice.com

CREATIVE
Alpanian Design Group
alan@alpanian.com

ADVERTISING
María Margarita López
VFXVoiceAds@gmail.com

SUPERVISOR
Nancy Ward

CONTRIBUTING WRITERS
Marun Eken, Ian Failes, Naomi Goldman, Trevor Hogg, Chris McGowan

ADVISORY COMMITTEE
David Bloom, Andrew Bly, Rob Bredow, Mike Chambers, Neil Corbould, VES, Irena Cronin, Paul Debevec, VES, Debbie Denise, Karen Dufilho, Paul Franklin, David Johnson, VES, Jim Morris, VES, Dennis Muren, ASC, VES, Sam Nicholson, ASC, Lori H. Schwartz, Eric Roth

VISUAL EFFECTS SOCIETY
Eric Roth, Executive Director

VES BOARD OF DIRECTORS

OFFICERS
Mike Chambers, Chair; Lisa Cooke, 1st Vice Chair; Emma Clifton Perry, 2nd Vice Chair; Laurie Blavin, Treasurer; Rita Cahill, Secretary

DIRECTORS
Neishaw Ali, Brooke Breton, Kathryn Brillhart, Colin Campbell, Bob Coleman, Kim Davidson, Rose Duignan, Camille Eden, Michael Fink, VES, Dennis Hoffman, Jeff Kleiser, Kim Lavery, VES, Tim McGovern, Karen Murphy, Susan O’Neal, Jeffrey A. Okun, VES, Jim Rygiel, Lisa Sepp-Wilson, Katie Stetson, David Tanaka, Richard Winn Taylor II, VES, Cat Thelia, Bill Villarreal, Joe Weidenbach, Susan Zwerman, VES

ALTERNATES
Gavin Graham, Bryan Grill, Arnon Manor, Andres Martinez, Dan Schrecker, David Valentin

VES STAFF
Nancy Ward, Program & Development Director
Chris McKittrick, Director of Operations
Ben Schneider, Director of Membership Services
Jeff Casper, Manager of Media & Graphics
Colleen Kelly, Office Manager
Debbie McBeth, Global Coordinator
Jennifer Cabrera, Administrative Assistant
P.J. Schumacher, Controller
Naomi Goldman, Public Relations

Visual Effects Society
5805 Sepulveda Blvd., Suite 620
Sherman Oaks, CA 91411
Phone: (818) 981-7861
visualeffectssociety.com

Follow us on social media

Tom Atkin, Founder
Allen Battino, VES Logo Design

VFX Voice is published quarterly by the Visual Effects Society.
Subscriptions: U.S. $50; Canada/Mexico $60; all other countries $70 a year. See vfxvoice.com
Advertising: Rate card upon request from publisher@vfxvoice.com or VFXVoiceAds@gmail.com
Comments: Write us at comments@vfxvoice.com
Postmaster: Send address change to VES, 5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411.
Copyright © 2021 The Visual Effects Society. Printed in the U.S.A.



FILM

THE FUTURE OF FILM IN THE POST-COVID ERA
By TREVOR HOGG

TOP: MPC was responsible for stitching separate takes into one continuous shot for the Oscar-winning visual effects of 1917. (Image courtesy of Universal Pictures)
OPPOSITE TOP: Engerraund Serac (Vincent Cassel) flies in a personal drone surrounded by LED walls in a scene from Westworld. (Image courtesy of HBO)
OPPOSITE MIDDLE: Imagery is projected onto LED walls in the pilot episode of Westworld, shot by Paul Cameron for the scene when Dolores Abernathy (Evan Rachel Wood) witnesses futuristic Los Angeles for the first time. (Image courtesy of HBO)
OPPOSITE BOTTOM: A signature effect for Watchmen was the reflective mask worn by Looking Glass, created by MARZ. (Image courtesy of HBO)


In March 2020, the escalating international situation brought on by the coronavirus pandemic resulted in governmental work-from-home mandates to prevent the spread of the disease. As the film industry slowly recommenced productions last August, protocols were implemented that will continue to evolve. Special Effects Supervisor Chris Corbould (No Time to Die) isn’t sure whether the ramifications of COVID-19 will have a lasting impact. “It is difficult to say, at the least; however, I am determined that this horrible virus will not stop us from providing spectacular films that will make us forget the events of 2020. I am currently fighting to make filming a viable process, as it used to be, but also providing film fans with a relief from some heartfelt tragedies.”

In the meantime, changes will be implemented to counter the impacts of the virus on the industry. “COVID-19 and social distancing are going to inform everything in our business until we get a handle on it, including set design and location selection,” notes Production Designer Scott Chambliss (Star Trek Into Darkness). “Real crowd scenes will likely become virtual ones, and a scripted cramped environment may turn into one with, if not windows, then perhaps lots of vent ducting present. Lasting changes are hard to predict, but habits do form with repetition. New choices required by COVID management may become accepted norms over time.”

Human nature and the requirements of movie-making could work against establishing a ‘new’ normal. “From what I’ve seen and understand about people, this is going to go back to the old norm whether it is safe or not,” remarks Cinematographer Paul Cameron (Man on Fire). “We can’t go about this job differently. You can’t put a camera on a remote head or rig without close proximity and multiple people holding stuff, passing batteries and cables, and trying to shoot things. All of a sudden, everybody’s face ends up inches apart.”

Post-production has also been impacted. “We finished a score for a Netflix series that featured a small musical ensemble, the kind you would typically see playing together in a room,” explains Composer Jeremy Turner (Five Came Back). “But this time around, we had to get unusually granular with who, where and how we recorded the music, then find clever ways of mixing the score so that it sounded seamless and all felt natural. When size and scale increase, the challenges obviously compound. I genuinely wonder how smaller projects will even get made during these times. My guess is that there will be a real shift where location might dominate the decision-making process. Not only for shooting or tax credits, but for an entire production, start to finish.”

On the sound side of the film industry, Mark DeSimone (Boardwalk Empire), ADR Mixer/Re-Recording Mixer at Soundtrack NY Recording, remarks, “With a sudden stoppage of production, and after regrouping and reinventing how to move forward, at least with the post-production side, there were a number of ways to continue in some capacity. More editors are working from home and connecting remotely with directors and production staff. ADR Mixers are sending out remote kits to the actors’ homes and are able to have some control, while Zooming in anyone who needs to be part of the session. Not ideal, but a way to keep moving. As New York City numbers came down drastically after months of struggle, we have been able to see more actors in the studio, while Zooming in everyone else. We have even had directors in the studio safely, with social distancing and masks for everyone.”

Frequent Robert Zemeckis collaborator Jeremiah O’Driscoll (The Walk) provides the perspective from the cutting room. “In anticipation of California’s statewide lockdown orders, our producers and studio post-production team secured our editorial crew with home workstations, improved our home Internet connections [as necessary], and provided access to essential communication systems via Slack, Zoom and BlueJeans. All work and sessions were attended, previewed and enjoyed from the privacy of our homes.

“I must say,” O’Driscoll adds, “freed from the time constraints and stress of my usual two-hours-plus daily commute to and from our offices and cutting rooms, the quality of my attention to my work also increased. The occasional need to check material on a large screen was handled by occasional trips to Mr. Zemeckis’ empty office screening room. I can envisage a continuation of many of these economies and protocols becoming standards in our industry. An editor’s facetime in the room with the director is invaluable in our convoluted process on the way to telling the most effective story from material we inherit. I wish I could say the path becomes easier with time and experience, but every film is its own beast and wonder.”

“If anything, this [past] COVID year has given visual effects studios an opportunity to prepare for the changing landscape that started shifting years ago,” states Lon Molnar, Co-President of Monsters Aliens Robots Zombies (MARZ). “The visual effects model as it currently stands is based on feature films – a long post-production period driven by the volume and complexity of visual effects. Condensed timelines are becoming the norm, but the industry hasn’t yet solved for that. The prevailing attitude is to continue throwing bodies at the problem, as has always been done. From the outset, MARZ recognized that the solution is technology. Technology has always helped find efficient ways to speed up the process and bring more spectacle to film and especially to TV. It doesn’t mean that incredible talent isn’t needed. If anything, the technology we’re developing is meant to free the labor to focus on the art. There is no reason to completely change the business – only to adapt and get better at it.”

With movie theaters operating at restricted capacity, drive-in theaters have made a comeback and streaming has become a more accepted form of distribution, with Universal Pictures releasing Trolls World Tour on various digital platforms and Mulan appearing on Disney+. “Studios have been forced to consider straight-to-streaming and PVOD releases in place of the box office, and it looks like this may become a permanent alteration to the distribution model, with shorter theatrical windows and more direct-to-consumer content,” remarks Philip Greenlow, Global Executive Producer at MPC Film.

“Visual effects have been quick to offer potential solutions to help the recovery of physical production,” Greenlow adds, “from remote and virtual prep to LED walls. All things which have been evolving steadily in recent times and may now become more prevalent.”

Tom Williams, Managing Director at MPC Episodic, agrees with his colleague. “COVID-19 has created such an intense focus on new ways of working and new toolsets; we are constantly talking to our clients about their needs, but now through this COVID lens. It’s made us scrutinize why we worked a certain way in the past and it’s opened up ways of working we never thought would be possible. On the flip side, it has also put focus on what we miss about centralized working – that the creative process is a personal experience to be shared and enjoyed. As things begin to ease, we plan to maintain that flexible working practice where possible and get together when necessary, but make the most of both states.”

Pixomondo CEO Jonny Slow turns to a famous business tycoon for an appropriate description of the events of 2020. “Warren Buffett once famously said: ‘Only when the tide goes out do you discover who’s been swimming naked.’ For the film industry, in 2020, the tide truly went out – a good part of a year with pretty much no cameras on. This will have been very painful for almost every company involved in production, from the smallest ones to the largest ones. It will take many years to recover the losses and the lost time.

“The delays caused by COVID, and the safety protocols being put in place right now,” Slow assesses, “will prove to be the clear tipping point for any technique that can save time, save costs and contribute to a safer production environment. When the tide comes back fully in 2021, the backlog will mean that production schedules are under more time pressure than ever before. We are going to see many productions embracing technology-driven techniques out of necessity – and once they have, they will never go back to how it used to be done. And the world will be a better place as a result – fewer miles traveled, less energy consumed, less waste, less time away from families, easier collaboration, more time to focus – and much more work possible for prolific producers than they could ever handle before. Time for a new swimsuit!”

Despite not relying on live-action footage, the animation industry was nevertheless impacted by the coronavirus lockdown. “I have two predictions,” notes Steve May, CTO of Pixar. “First, the euphoria of, ‘Hey, we can work from home’ will be replaced by the recognition that highly creative processes do best when we are in-person, in shared spaces, in environments that inspire us and encourage serendipitous interactions.

“Second,” continues May, “one of the advantages of Zoom is that it has allowed us to be more inclusive in our creative processes. It’s easier to quickly share work and organically invite stakeholders into short conversations where, in-person, this is typically hard to do with more than two people. Likewise, by ‘listening in,’ a larger portion of the crew gets a better understanding of the creative decisions happening and how their individual work contributes to the larger filmmaking process. We have traditionally been cautious about opening these kinds of meetings up to more of the crew. Working remotely forced us to, and we’re seeing benefits.”

OPPOSITE TOP: Charlotte Hale (Tessa Thompson) looks on at a personal drone in a shot created by Pixomondo for Westworld. (Image courtesy of HBO)
OPPOSITE MIDDLE: Robin McKenna produced Thanadoula, a six-minute short, for the National Film Board of Canada. (Image courtesy of the National Film Board of Canada)
OPPOSITE BOTTOM: A camera car supplied by William F. White International. (Image courtesy of William F. White International)
TOP: Joe Gardner (voice of Jamie Foxx) gets the chance of a lifetime to play the piano in a jazz quartet headed by the great Dorothea Williams (Angela Bassett) in Soul. (Image courtesy of Disney/Pixar)

“Prior to 2020, we made many strategic moves for a more nomadic production lifestyle,” observes Mandy Tankenson, Senior Vice President, Head of Production at Sony Pictures Imageworks. “With the rise of the pandemic, our industry and company were thrust into getting our artist and production teams home quickly, and safely, while still working out plans to deliver the work that was promised to our clients. Connections are now made through Google Meet, Zoom calls, online happy hours and online trivia contests. Daily check-ins with the artists and production teams ensure that the camaraderie that delivers the best imagery is maintained.

“As we continue working outside the office, where we can’t ‘bump into’ someone from another department,” Tankenson remarks, “we have to plan for those unscheduled conversations that lead to big-picture ideas. Our goals for innovating now encompass a broader scope. It’s not just about the tools to make amazing imagery, but also about the importance of keeping us connected to one another.”

“Interestingly, even before the pandemic happened, the National Film Board had begun using state-of-the-art technology to digitize and restore some 4,350 films and interactive works to make them available online,” states Julie Roy, Director General of Creation and Innovation at the NFB of Canada. “The NFB recently did its first-ever remote color-grading on an animated short being co-produced with France and Denmark. The two co-directors were supposed to travel to the NFB to supervise that part of the work, but instead took part in the color-grading session from Viborg [Denmark], taking cues from our color grader, who was at our head office in Montreal. This experiment and many others are quite valuable for an institution like ours, which has 10 studios all across Canada and co-produces with partners all over the world.

“The whole process of implementing new shooting protocols along with the reorganization of our production calendars has been quite demanding and required plenty of agility and hard work,” Roy reveals. “Where we thought everything had come to a standstill at one point, we now see that everything has accelerated and intensified. And then there’s the visibility of our content online, which has exploded, with nfb.ca logging four times as many monthly visitors as before.”

TOP: A shot from the short film Construct by Kevin Margo which utilizes real-time ray tracing. (Image courtesy of Chaos Group)
BOTTOM: Mark DeSimone does a sound mix for the documentary Art Bastard. (Image courtesy of Amazon Studios)
OPPOSITE TOP: ILM worked with Unreal Engine to produce the Emmy-winning visual effects for The Mandalorian. (Image courtesy of Disney)

Suppliers and film studios have not been immune to the repercussions caused by the coronavirus. “We’re currently supporting a production that’s testing the limits of virtual sets through the use of LED walls, digital programming and good old-fashioned movie magic,” states Garin Josey, Executive Vice President and COO of William F. White International Inc. “By combining pre-built, high-quality LED video walls with real-life actors and specific props, productions can limit the amount of people and locations needed to pull a scene together. Clients are already making room in their budgets to incorporate long-reaching camera cranes in order to keep everyone separated. These cranes can even be pre-programmed, which further limits the number of people required on set.

“Productions are also investing in PPE [Personal Protective Equipment] in the form of multiple hand-washing stations and medical-grade air scrubbers to keep people and the air they breathe routinely disinfected,” says Josey. “At William F. White, we’re ensuring all our rental equipment and studio properties are sanitized to the highest possible standard. We’ve worked hard behind the scenes to support clients with stringent protocols, so they feel safe when equipment passes from our hands to theirs. After all, we’re all in this together.”

The movie industry has been in flux for a long time. “In 2020, even before the spread of COVID-19, it felt as though festivals had become the last haven for independent films,” notes Casey Baron, Senior Film Program Director at the Austin Film Festival. “Netflix and Amazon, for example, provided some outlets for distribution, but those programs have tightened up as of late. COVID-19 is subverting the notion of normal lives and escalating poignant conversations surrounding inclusivity and racial injustice. For AFF, we’ve had to specifically consider how to stay true to the ethos of our organization. As the Writers Conference is known for fostering organic intimate moments between writers, we’ve put a lot of thought into any possible way to recreate some semblance of those moments for our audience. That’s the challenge facing all festivals at this point in time, really.

“How do we recreate that level of connectivity when we can’t get closer than six feet to each other?” asks Baron. “It’s interesting to watch a filmmaker like Christopher Nolan and Warner Bros. partner with Epic Games and Fortnite to host events around an Inception screening and the reveal of a new trailer for his film Tenet. Perhaps that holds some clue for the evolution of film industry exhibition and showcase. The capacity for independent films to have a space in this rapid evolution will be paramount to defining what the future of the industry entails, but I do think those spaces will come to the fore soon enough.”



TV/STREAMING

PROJECT POWER: HOW FRAMESTORE ‘FINALED’ THE FILM MID-PANDEMIC
By IAN FAILES

Framestore was only weeks away from delivering its visual effects for the Netflix film Project Power, directed by Henry Joost and Ariel Schulman, when the COVID-19 pandemic hit. Like just about all VFX studios, it moved to a work-from-home model, with artists in London, Montreal and Pune having to complete the final shots from home. To get a sense of how Framestore managed that transition process for the superhero film, several lead members of the team revealed what going into lockdown required from a technical point of view, how shots continued to be worked on, and how those last crucial weeks of delivery were managed remotely.

TOP: Newt’s power is the ability to self-immolate. Framestore took live-action photography with practical makeup effects and LED lights and added fire during the visual effects process. (Image copyright © 2020 Netflix)
OPPOSITE BOTTOM: Rodrigo Santoro as Biggie in practical effects makeup and prosthetics. Biggie possesses the power to rapidly increase in size and strength. (Image copyright © 2020 Netflix)

THE MOMENT THINGS WENT REMOTE

Coline Six, Visual Effects Executive Producer, Framestore: “We were six weeks away from delivery when the lockdown happened. Everything happened very quickly, as it did everywhere in the world. From one day to the next, we had to leave the office and adapt to the work-from-home reality. We re-worked the schedule to accelerate and absorb the delay created by the crisis. We talked to Netflix, who understood the situation, and we managed to deliver the show on time. Project Power was the first show that Framestore delivered during the pandemic.”

Ivan Moran, Overall Visual Effects Supervisor, Framestore: “We were incredibly fortunate that we had already completed shooting and were into our final VFX delivery. By then, we all had such an innate trust and working relationship with each other, and our communication had developed a kind of shorthand which was enormously helpful. Having said that, hundreds of people on our VFX crews had to pivot to a work-from-home scenario in a matter of weeks, which was a mammoth undertaking from a technological and psychological standpoint.”

João Sita, Visual Effects Supervisor, Framestore: “Framestore quickly adapted to the technical changes, providing artists with the necessary hardware to connect remotely, and the Systems team were heroes in staying up-to-date with artists’ requests regarding technical issues. As we moved to the work-from-home setup, maintaining our regular daily schedules as if we were back in the office was key to ensuring artists were engaged while upholding the established delivery rhythm.”

Kevin Sears, CG Supervisor, Framestore: “We had a consistent small crew that bonded for at least a year prior to the pandemic. This was certainly an advantage with the tight communications in the FX and Assets teams as we began developing the VFX-heavy finale toward the end of 2019. Work from home was implemented at the beginning of March 2020. The facility teams and senior managers at Framestore took it very seriously and somehow got the entire studio in Montreal, where I’m located, remotely working in less than two weeks.”

Matthew Twyford, VFX Supervisor and Head of 2D London, Framestore: “Unfortunately, a few members of our crews were ‘trapped’ overseas, so this required further management protocols to get them up-and-running wherever they were. When communicating with these artists it was amazing how the video calls had such varied backgrounds, from the misty Scandinavian rains to the glorious scenes of a sunny Mediterranean!”

TECH SOLUTIONS FOR A REMOTE DELIVERY

Coline Six: “Framestore’s Systems team did an incredible job, and in a couple of weeks Project Power’s team was fully set up with Teradici, a technology which allows us to see our remote computer screens in a secure and encrypted way, and ready to deliver.”

Matthew Twyford: “Project Power was a multi-site show for Framestore, so the disruption was staggered as the London, Montreal and Pune lockdowns came through at different times. The technical setup between sites meant we used different solutions for some of the artists, and this allowed us to work multiple options to get the best artist setups worldwide.”

João Sita: “Framestore deployed tools to allow for screen sharing in a secure environment so we could continue our reviews as in the office – running Framestore’s SRP review tool – and share that with the artists. SRP also proved to be really robust as a group reviewing tool, as we could have multiple artists connected to the same session, which allowed them to make annotations and load new versions on the fly.”

Kevin Sears: “We use screen-sharing software at Framestore called Trumpet to assist shared sessions, but also with a combination of RV syncing as well. Immediately we noticed some differences in the traditional dailies review process! Artists would have working files up and be giving real-time feedback on the state of a situation in a file, or in some cases we had notes for Lookdev and would see the new take at the end of a 30-minute review session. The dynamic of team members in a dark theater setting was the same, but the information accessible and the feedback loop therein was greatly improved.”

Matthew Twyford: “Most artists were doing everything from home as if they were still working in their respective studios. They had to overcome a few challenges with calibrated monitoring and blacking out rooms, but although the quality of the hardware and software PCoIP solutions was extremely good, it was not good enough for our final QC checks. This meant that some of the first team members returning to the offices in London and Montreal were the 2D supervisors, to utilize the calibrated 4K suites to approve final deliveries.”

A CHANGE IN COMMUNICATION AND WORKFLOW

João Sita: “Keeping our VFX crew informed was the main priority in combating potential confusion, doubts or worries. The first few sessions of dailies from home were mostly spent answering questions, sharing experiences, chatting and maintaining team spirit. We were continuing to use our existing structure of Google Suite, but there was a much heavier emphasis on video chat than previously used at Framestore. Once the communications were set up, the processes of tuning and tweaking all the work-from-home procedures, hardware and software were greatly accelerated to get back to our targeted 100% productivity.”

Jonathan Opgenhaffen, Art Director, Framestore: “I was lucky that I was able to take my work machine home with me about a week before the whole company went home. Once I had a secure connection to our art department server, it was pretty much business as usual. We conducted our regular reviews and meetings through Zoom, and in a way, being able to screen share really helped speed things along and kept discussions flowing. I could prepare a reference board and even keep my Photoshop 3D scenes open in the background, and if needed, I’d screen share those to discuss various aspects of what we were working on.”

Coline Six: “In terms of communication and tools, Shotgun has always been our go-to, but even more so since working from home, as we are not in the office and able to easily communicate with our colleagues. The show is clearly driven by the schedule, which is necessary, but sometimes you also need to brainstorm with people and reassess work, which is less straightforward than before. Not sharing the same office space, not being able to have meetings on the fly and not asking questions in person has really changed our way of working.”

João Sita: “We established a very direct communication flow, as we could squeeze a review between other meetings without having to check on room availabilities, and the artists were just a click away. Another interesting aspect of the work-from-home setup was that it allowed artists to listen to feedback regarding their sequences while waiting for their shot and keep working on their own shots. In many instances, they were able to answer questions regarding a missing element in their setup and make changes on the spot.”

Kevin Sears: “I directed my energy from physical meetings with multi-department input into having dedicated chat rooms about sequences or shots with all people involved, and saw clarifications and focus come into action. We implemented fun Friday night socials with the Production team to take the stress off.”

Matthew Twyford: “Viewing dailies ran close to our original schedule, but we soon realized that without the need to book a viewing suite we had the flexibility to expand, move or continue scrutinizing our dailies. This freedom gave me more valuable time to interact with the team, which is something I still enjoy about working from home. With everyone then joining in on video chat after a few days, it started to feel like we were back in familiar territory and, more importantly, back on track. Our presentations to the directors were largely unaffected, as Henry and Ariel were based mostly in New York throughout the post-production period and we just continued to use cineSync for our regular sessions.”

João Sita: “One thing that we really put a lot of attention towards was how we would complete the ‘final tech checks’ and latest reviews of the shots before sending them to the client. For the final tech checks, for example, we created a two-step process, with a first pass tech-checking the shots from home, which would get the majority of major issues still existent in a shot, and later – after the government allowed certain employees to access the building – going to the office to have the final review with the shots in our screening rooms. We also needed our 4K projectors in an accurately calibrated environment for the final nuances.”

OPPOSITE TOP: Art demonstrates his super power in the film’s climax, via visual effects from Framestore. (Image copyright © 2020 Netflix)
OPPOSITE MIDDLE: Art gets ready to consume a Power pill. (Photo: Skip Bolen. Image copyright © 2020 Netflix)
OPPOSITE BOTTOM: The aftermath of Frank and Art’s encounter with Biggie. (Photo: Skip Bolen. Image copyright © 2020 Netflix)
TOP: Colson Baker (aka Machine Gun Kelly) as Newt, a dealer of Power pills. (Photo: Skip Bolen. Image copyright © 2020 Netflix)


TV/STREAMING

that with the artists. SRP also proved to be really robust as a group reviewing tool, as we could have multiple artists connected to the same session, which allowed them to make annotations and load new versions on the fly.” Kevin Sears: “We use a screen-sharing software at Framestore called Trumpet to assist shared sessions, but also with a combination of RV syncing as well. Immediately we noticed some differences in the traditional dailies review process! Artists would have working files up and be giving real-time feedback on the state of a situation in a file, or in some cases we had notes for Lookdev and would see the new take at the end of a 30-minute review session. The dynamic of team members in a dark theater setting was the same, but the information accessible and the feedback loop therein was greatly improved.” Matthew Twyford: “Most artists were doing everything from home as if they were still working in their respective studios. They had to overcome a few challenges with calibrated monitoring and blacking out rooms, but although the quality of the hardware and software PCoIP solutions was extremely good, it was not good enough for our final QC checks. This meant that some of the firstteam members returning to the offices in London and Montreal were the 2D supervisors to utilize the calibrated 4K suites to approve final deliveries.” A CHANGE IN COMMUNICATION AND WORKFLOW

João Sita: “Keeping our VFX crew informed was the main priority in combating potential confusion, doubts or worries. The first few sessions of dailies from home were mostly spent answering questions, sharing experiences, chatting and maintaining team spirit. We were continuing to use our existing structure of Google Suite, but there was a much heavier emphasis on video chat than previously used at Framestore. Once the communications were set up, the processes of tuning and tweaking all the work-from-home procedures, hardware and software were greatly accelerated to get back to our targeted 100% productivity.”

Jonathan Opgenhaffen, Art Director, Framestore: “I was lucky that I was able to take my work machine home with me about a week before the whole company went home. Once I had a secure connection to our art department server, it was pretty much business as usual. We conducted our regular reviews and meetings through Zoom, and in a way, being able to screen share really helped speed things along and kept discussions flowing. I could prepare a reference board and even keep my Photoshop 3D scenes open in the background, and if needed, I’d screen share those to discuss various aspects of what we were working on.”

Coline Six: “In terms of communication and tools, Shotgun has always been our go-to, but even more so since working from home, as we are not in the office and able to easily communicate with our colleagues. The show is clearly driven by the schedule, which is necessary, but sometimes you also need to brainstorm with people and reassess work, which is less straightforward than before. Not sharing the same office space, not being able to have meetings on the fly and not asking questions in person have really changed our way of working.”

João Sita: “We established a very direct communication flow, as we could squeeze a review in between other meetings without having to check on room availability, and the artists were just a click away. Another interesting aspect of the work-from-home setup was that it allowed artists to listen to feedback regarding their sequences while waiting for their shot and keep working on their own shots. In many instances, they were able to answer questions regarding a missing element in their setup and make changes on the spot.”

Kevin Sears: “I directed my energy from physical meetings with multi-department input into having dedicated chat rooms about sequences or shots with all people involved, and saw clarifications and focus come into action. We implemented fun Friday night socials with the Production team to take the stress off.”

Matthew Twyford: “Viewing dailies ran close to our original schedule, but we soon realized that without the need to book a viewing suite we had the flexibility to expand, move or continue scrutinizing our dailies. This freedom gave me more valuable time to interact with the team, which is something I still enjoy about working from home. With everyone then joining in on chat video after a few days, it started to feel like we were back in familiar territory and, more importantly, back on track. Our presentations to the directors were largely unaffected, as Henry and Ariel were based mostly in New York throughout the post-production period and we just continued to use cineSync for our regular sessions.”

João Sita: “One thing that we really put a lot of attention towards was how we would complete the ‘final tech checks’ and latest reviews of the shots before sending them to the client. For the final tech checks, for example, we created a two-step process: a first pass tech-checking the shots from home, which would catch the majority of major issues still existing in a shot, and later – after the government allowed certain employees to access the building – going to the office to have the final review with the shots in our screening rooms. We also needed our 4K projectors in an accurately calibrated environment for the final nuances.”

OPPOSITE TOP: Art demonstrates his super power in the film’s climax, via visual effects from Framestore. (Image copyright © 2020 Netflix)
OPPOSITE MIDDLE: Art gets ready to consume a Power pill. (Photo: Skip Bolen. Image copyright © 2020 Netflix)
OPPOSITE BOTTOM: The aftermath of Frank and Art’s encounter with Biggie. (Photo: Skip Bolen. Image copyright © 2020 Netflix)
TOP: Colson Baker (aka Machine Gun Kelly) as Newt, a dealer of Power pills. (Photo: Skip Bolen. Image copyright © 2020 Netflix)


THE TOUGHER PARTS OF REMOTE VFX DELIVERY

Matthew Twyford: “In terms of actual tough shots to pull off from home, the ‘Man on Fire’ sequence was particularly challenging, as smooth playback and image quality for the fire and smoke detail tested the compression algorithms of the PCoIP tools well beyond their limits. Therefore, there was a great deal of work approved on ‘trust,’ where our artists used their collective experience to know the completed work was in line with our high standards of excellence. Having this talented and experienced crew was key in getting many of our shots approved, and this in turn allowed me to speed up our operations by offering our artists more responsibility for fine-detailed decisions.”

Ivan Moran: “I did have Chinook helicopters flying outside my window while on reviews during the New York City protests, though, which was a touch distracting. I think if we were still in VFX design mode early on in the project, it would have been far more challenging. That kind of brainstorming and ‘brain trust’ relies heavily on face-to-face human interaction, creative dialogue and debate. We are social animals – solitude is inherently difficult for us.”

Coline Six: “Obviously, we also had to adapt to this new way and new world of working remotely. This involved more catch-ups with the team to maintain connections, and more recaps from everyone in all departments to make sure we didn’t miss any information, resulting in more and more emails to read. One of the biggest challenges at the beginning of the pandemic was to prevent consequences that could occur from isolation. Young artists or production members living on their own without family nearby can easily feel very isolated and demotivated, so our Production and HR teams are constantly working on providing employees with resources for mental health and suggesting tips and tools to maintain contact with everyone.”

Jonathan Opgenhaffen: “Of course, I miss the social aspect of being able to sit around with the team and share ideas, techniques and stories, but in terms of work and productivity, it was as good if not better than normal. I was also nervous about what it would be like working from home, but it felt very natural and timekeeping wasn’t really an issue. If anything, the opposite – I have to remind myself to take a break and stop working at the end of the work day!”

TOP: Framestore augmented live-action footage of Joseph Gordon-Levitt to show him withstanding the bullet hit, in slow motion. (Image copyright © 2020 Netflix)
MIDDLE: Gordon-Levitt as New Orleans Police Department Officer Frank Shaver during filming of the aftermath of him being shot in the head, which he survives because of the ‘Power’ pill the character has taken. (Image copyright © 2020 Netflix)
BOTTOM: Filming an alley scene in New Orleans. (Photo: Skip Bolen. Image copyright © 2020 Netflix)

Coline Six: “Indeed, we have also noticed that artists in some departments are more efficient working from home. They can address notes in real-time during dailies, since they are in front of their screen while receiving notes from the supervisors. Other departments are slower, so it compensates. We will see what happens in the future, but like many other industries the VFX industry is undergoing a small revolution. And we will adapt, as we always do.”


TV/STREAMING

“The Northern Lights is more of a child’s odyssey, so every episode is a complete new world, plus we had a lot more background creatures with the last two episodes being Iorek Byrnison and the bears. The Subtle Knife is a more personal story. It’s exciting because we visit new worlds but bounce between them more, rather than introducing new ones every episode, so it is a different challenge.” —Russell Dodgson, Senior Visual Effects Supervisor

WIELDING THE SUBTLE KNIFE FOR HIS DARK MATERIALS By TREVOR HOGG

TOP: A mandate for the visual effects team was to make sure that the fantasy elements felt grounded, not magical. (Image courtesy of HBO and Framestore)
OPPOSITE TOP: Previs assisted in determining the appropriate camera angles for the balloon shots. (Image courtesy of HBO and Framestore)


Embarking on Season 2 of the HBO production of His Dark Materials has enabled showrunner Jane Tranter to expand the cast of characters and creatures, as well as introduce new environments, in the seven-episode adaptation of the second book in the trilogy, The Subtle Knife. Lyra Belacqua (Dafne Keen) pursues her father Lord Asriel (James McAvoy) after he kills her best friend in order to enter another dimension; she journeys to a mysterious abandoned city where Will Parry (Amir Wilson), a young boy with a troubled past, becomes her travel companion. Returning to the alternative fantasy adventure conceived by Philip Pullman is the visual effects team that includes Senior Visual Effects Supervisor Russell Dodgson, Senior Visual Effects Producer James Whitlam and Visual Effects Art Director Daniel May.

“It’s a reduction in visual effects shots from Season 1 to 2, which is rare,” notes Whitlam. “It will jump back up again on Season 3. We went from a bit over 2,000 shots in Season 1 to 1,400 shots on Season 2. That’s predominantly because of losing an episode due to COVID-19. We went from eight episodes down to seven and reduced the appearances of the polar bears.” Episode 204 was going to be devoted to Lord Asriel. “It was a standalone that assisted the main narrative, but once COVID-19 hit we had to stop the shoot,” states Dodgson. “We then had to go back into the edit and restructure things to have the other episodes finished at different places. We had about a week where we dropped to 75% or 80% productivity. Then we got back up to 90% to 95%, which means we will all have to work that bit harder to match the results that come from all being together.”

Whereas Season 1 was divided between Framestore facilities in London and Montreal, Season 2 saw the addition of New York in order to relieve the scheduling pressure caused by the global pandemic. “There is always the sharing of assets, and we try to minimize the amount of shot sharing as much as possible,” explains Whitlam. “Framestore New York was brought into the mix and took on a big set piece at the end of Episode 207. We brought in another team from London that worked on another set piece in Episode 203. In terms of the actual split, we flipped it this time and more work was done in London than in Montreal. It was 50/50 between North America and the U.K. We always try to make sure to cast individuals or teams for particular work and consistently keep that work with them. For instance, all of the Spectre work was done out of London.”

The budget had to be kept in mind when deciding upon the balance of visual effects work. “We had to figure out the best mix of effects we need to tell the story and how do we work around that,” states Whitlam. “Obviously, the daemons have to be there. The Golden Monkey and Pan [Kit Connor] are in a way as important as the actual actors who are in the show. There’s that baseline. Then you get into that place where you’re going between additional daemons, establishing shots and environment extensions. We want to create a world that is big enough to draw audiences in and dramatic enough to hold their attention.”

There is a significant difference between the first book in the His Dark Materials trilogy, The Northern Lights, and the second one, The Subtle Knife. “The Northern Lights is more of a child’s odyssey, so every episode is a complete new world, plus we had a lot more background creatures with the last two episodes being Iorek Byrnison [Joe Tandberg] and the bears,” observes Dodgson. “The Subtle Knife is a more personal story. It’s exciting because we visit new worlds but bounce between them more, rather than introducing new ones every episode, so it is a different challenge.”

“One of the things about the design stage is not to be afraid to sometimes pull the whole thing apart and re-do it, because we had a model with a lot of different ideas and out-of-date parts in it,” notes May. “It was all getting to be unwieldy, so we made the call to re-do it from scratch and build based on the set. We made loads of what you call decals in video gaming language. We did a bitmap for all of the doors and windows so that they had all of the right details and felt in line with the set. A lot of the time the art department will have a lot of specialist artists who have spent their lifework doing architecture, and sometimes the CG guys, mine included, don’t get those proportions right. To go back to use the real set is always useful.”

Fantasy concept art is to be viewed as a guideline. “It is the more visually rich version that would never fit within the TV show,” remarks Dodgson. “You make that and begin stripping back layers to bring it down to reality.”

Previs fits into different categories for His Dark Materials. “There is the speculative previs, which is when I work out the set design and environments ahead of directors, because they often start quite late, and in order to get the production value that you want, it takes a lot of time,” states May. “Substantial sets can take two to three months to build. In Season 1, I had quite good relationships with Otto Bathurst and Jamie Childs, and they took a lot of the ideas that we had, but naturally wanted to change things. You tweak the previs to suit their needs. Some of it can be changed and other parts can’t because the set is being made. The major set piece previs involves extensive planning and a lot of involvement with visual effects and stunts. Sometimes you’re not sure which directors are going to do the sequence. You have to be prepared to have some fails.”

Virtual production methodology with Unreal Engine was adopted from the beginning of the television series. “The modus operandi for His Dark Materials is to be collaborative with the different creatives at the concept design and previs stages, as that allows for ideas to be explored with a higher fidelity early in the process,” states May. “In Season 1, we were able to build the bear palace at quite a nice level of detail. Even though the set had yet to be built, we could look at it in VR or with a virtual camera to get the sense of the scale.

“The same with Season 2,” adds May. “The main project for us was the town of Cittàgazze. We had a repeat of the physical set and copied and pasted it around the virtual version, so when looking at a staircase you could understand what angles were possible. Russell had the previs as a guide when he was doing the plate photography in Kauai. We attached and tweaked our rugged landscape and city with the physically accurate geometry from Kauai to get the real-world scale.”

Introduced in Season 1 were daemons, animals that physically represent the souls of their human counterparts. “In Season 2, it is more of a focus around the main hero daemons, because we’re in a world where there are fewer of them around,” states Dodgson. “A few new daemons and characters for Pan were introduced, so we had to make some new assets along with new body movements to explore.” Pan is spiritually connected to Lyra and for the first time appears as a red panda and a wolverine. “We’re still 100% hand-animated for all of our creature work,” continues Dodgson, “but we did extensive research on the movements of red pandas and wolverines. Red pandas are great but sometimes lumber when walking, so we had to find a way of balancing that new movement style while maintaining the feel and energy of Pan in his other forms. This lumbering nature also worked quite well for the stage Lyra is at in life. It helped to physically show the slight emotional awkwardness she finds herself in as she is spending more and more time with this new boy.”

“The main project for us [in Season 2] was the town of Cittàgazze. We had a repeat of the physical set and copied and pasted it around the virtual version so when looking at a staircase you could understand what angles were possible. Russell [Dodgson] had the previs as a guide when he was doing the plate photography in Kauai. We attached and tweaked our rugged landscape and city with the physical, accurate geometry from Kauai to get the real-world scale.” —Daniel May, Visual Effects Art Director

Each time the Subtle Knife cuts a new window to a world, a malevolent spirit referred to as a Spectre escapes from the void between universes. “Anytime you have to do a monster or creature that is predominately effects driven, you open up this broad range of outcomes that happen through simulation,” remarks Dodgson. “You’re not only doing Spectres that appear regularly throughout the season, but when they attack people and other daemons you have effects monsters interacting with furry and feathered creatures, which is tricky. The thing that makes it hard is needing to create a system that becomes slightly more predictable.

“The Spectres,” Dodgson explains, “and a lot of effects-driven creatures are actually a conversation about things that you’re trying to avoid, rather than the things you want. If you avoid the things you don’t want, then you end up with something that you like. We didn’t want the Spectres to feel too gunky and oily. It’s definitely this iterative abstract process that is always hard to bid and predict how long it’s going to take. Sometimes you nail it quickly and in other cases it takes longer than you thought.”

OPPOSITE TOP TO BOTTOM: Various atmospherics, such as the fire, were simulated by Framestore. (Image courtesy of HBO and Framestore)
Some of the major set pieces in Season 2 take place in stormy environments. (Image courtesy of HBO and Framestore)
The effects simulation-driven monster known as a Spectre appears behind Will Parry (Amir Wilson). (Image courtesy of HBO)
Lyra Belacqua (Dafne Keen) stands alongside the wolverine form of Pan. (Image courtesy of HBO and Framestore)
TOP: Aerial plates were shot in Kauai and incorporated into the environment build for Cittàgazze. (Image courtesy of HBO and Framestore)

TOP: A partial backlot set of Cittàgazze was built with incredible detail, along with an internal stage with a lot of side streets that could double as other streets. (Image courtesy of HBO and Framestore)
BOTTOM: Aeronaut Lee Scoresby (Lin-Manuel Miranda) finds himself pursued by Magisterium airships. (Image courtesy of HBO)

Creating the windows between worlds was difficult for a different reason. “If you’ve got two people standing in front of a window to another world, where do you put the focus?” explains Dodgson. “Do you put the focus deep in the other world, but then suddenly everything in front of and behind the window in your real space is out of focus? It throws a lot of weird, unpredictable problems at you. Also, we’re trying to make the window grounded in that it is subtle; that’s way harder to do than a Doctor Strange-style portal, because with that you’re creating so much effects work to place it in the space. Trying to show the window in 3D space with minimal visual tricks is complex. Most of it is rule-based with the windows. You want to give a sense of dimensionality, which doesn’t work if both sides look the same. One side always has to have more definition and the other side has more of an ethereal softness; this gives you a sense of what is further away and what is closer. Our Montreal team built a system that allowed you to create these cuts with a repeatable aesthetic result.”
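Framestore’s Montreal window tool has not been published, but Dodgson’s rule of thumb – one side crisp, the other ethereally soft – is easy to make concrete. The sketch below is purely illustrative: the function name, the `softness` and `lift` parameters, and the Gaussian blur standing in for a bespoke defocus are assumptions for this example, not Framestore’s implementation.

```python
# A minimal sketch of the asymmetric "window between worlds" treatment
# described above: the near world stays sharp, while the far world is
# softened and lifted so the eye can tell which side of the cut is closer.
import numpy as np
from scipy.ndimage import gaussian_filter

def grade_window(near_side: np.ndarray, far_side: np.ndarray,
                 softness: float = 4.0, lift: float = 0.08):
    """near_side / far_side: float RGB images in [0, 1], shape (H, W, 3)."""
    # Blur only the far side; sigma is zero on the channel axis so the
    # R, G and B channels do not bleed into one another.
    soft = gaussian_filter(far_side, sigma=(softness, softness, 0))
    # Lift the blacks and flatten contrast for an ethereal, distant look.
    hazy = soft * (1.0 - lift) + lift
    return near_side, hazy
```

In a real composite the two graded images would then be merged along the knife-cut matte; the point of the sketch is only the asymmetry rule, not the production pipeline.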

TOP: Guiding the way for Lyra Belacqua (Dafne Keen) are her daemon Pan (Kit Connor) and the alethiometer, which can determine if someone is telling the truth. (Image courtesy of HBO and Framestore)
MIDDLE: Iorek Byrnison (Joe Tandberg) and the polar bears have a less significant role in Season 2. (Image courtesy of HBO and Framestore)
BOTTOM: Mrs. Coulter (Ruth Wilson) and her daemon the Golden Monkey are determined to find Lyra. (Image courtesy of HBO and Framestore)

Witches have a major role in the storyline for Season 2. “When you get people flying, it’s hard to make them look convincing,” reveals Dodgson. “Everything from Superman through to Wonder Woman, convincing flight is always a hard sell, especially when they don’t have anything to hold onto. We didn’t want our witches on brooms. Our witches are literally flying humans. We made great digital doubles of them in order to do things that we can’t do in the real world. When shooting them, we used a tuning fork rig, which is a common technique used for anything from Harry Potter to Wonder Woman. The tuning fork rig enabled us to get close-ups of their performances, but it also required a lot of paint-out and digital limbs.

“There was a concept in Season 1 that was utilized a lot in Season 2 called ‘bamfing,’” Dodgson adds, “which is the ability to break apart like a cloud of ash and smoke, and re-form somewhere else quickly. It’s the way the witches move around quickly, and it gives them an edge. It makes the witches more three-dimensional, as they can do more than just taking off and flying.”

Not everything takes place on the ground. “Because Unreal is a game engine, you can lean on its programming technology,” states May. “We were able to set up a simple system that allowed us to be in the balloon with a camera and get a sense of different types of scenes as it flew over the city, and what kind of angles were effective. It helped with choosing shots. In the end, they used a physical set, getting a crane up there and shooting some drone stuff. We could have done that in a more traditional sense, but once we built that virtual world, you could re-use it for quite a lot of things. The storm sequence with the airships was a mixture of Unreal and Photoshop, but ultimately it was down to what Russell used as a plate.” A minimal sketch of such a flythrough camera appears below. Flat-light storm plates were shot in Kauai. “That gave us a grounded landscape that we could then turn to night,” remarks Dodgson. “Then we did a combination of CG, compositing and DMP.”

The VFX team members each have their own Season 2 highlight. For Dodgson, one is when the Spectres attack someone and their daemon. “That’s a dark and cool scene,” he states. “There is another fight between Mrs. Coulter [Ruth Wilson] and Lyra that is cool and a nice surprise to people.” The world building and new environments are what continue to intrigue May, while Whitlam enjoys the aerial moments. “I like the storm sequence in Episode 201,” Whitlam says. “It’s a great beginning to the show that sucks you in and is dynamic. Likewise, there’s a fantastic sequence with the balloon being chased by Magisterium airships in another storm towards the end of the season.”

As for what to expect from the third season, which will be adapting The Amber Spyglass, Dodgson remarks, “Add angels to the list of the slightly intangible. We had a small amount of trying to visualize them in Season 2, but in Season 3 they become a much bigger thing. What we had to do,” he offers, “was to find something that we liked but gives us some wiggle room as we go deeper into Season 3, and develop that. We ended up looking at a lot of references of statues which were made of thin strips of metal that caused some interesting plays on the light, and those were then used to drive the physical way that we made them in CG. The angels have been tricky and fun to do. It’s that exercise of leveraging real life to try to inspire the abstract.”
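May’s balloon flythrough was built inside Unreal Engine, and Framestore has not published that rig. The sketch below only illustrates the underlying idea – a camera riding a parametric path over a city and reporting how it frames a landmark – with an invented path, an invented landmark position and plain Python standing in for the engine.

```python
# An illustrative "camera in the balloon" flythrough: sample a drifting
# path over the city and report how the camera frames a landmark, which
# is roughly the judgment the previs team was making shot by shot.
import numpy as np

def balloon_path(t: float, height: float = 120.0) -> np.ndarray:
    """Balloon position at parameter t in [0, 1]: a lazy arc with a bob."""
    x = 400.0 * t                                # drift across the city
    y = 60.0 * np.sin(2.0 * np.pi * t)           # gentle sideways sway
    z = height + 10.0 * np.sin(6.0 * np.pi * t)  # slow vertical bobbing
    return np.array([x, y, z])

def look_at(cam_pos: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Unit view vector from the camera toward a point of interest."""
    v = target - cam_pos
    return v / np.linalg.norm(v)

tower = np.array([300.0, 0.0, 40.0])  # hypothetical landmark in the city
for t in np.linspace(0.0, 1.0, 5):
    pos = balloon_path(t)
    view = look_at(pos, tower)
    pitch = np.degrees(np.arcsin(-view[2]))  # downward tilt onto the city
    print(f"t={t:.2f}  pos={np.round(pos, 1)}  pitch={pitch:.1f} deg")
```

In production the same judgment call – which angles over Cittàgazze read well – was made interactively in-engine rather than from printed numbers.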



PROFILE

SARA BENNETT: OSCAR-WINNER TURNS PASSION INTO INSPIRATION FOR WOMEN IN FILM By TREVOR HOGG

Images courtesy of Milk VFX and Sara Bennett except where noted.
TOP: Sara Bennett, Visual Effects Supervisor and Co-founder, Milk VFX.
OPPOSITE TOP: Official Oscar photo of Bennett, Mark Ardington, Paul Norris, Andrew Whitehurst and presenter Andy Serkis.
OPPOSITE BOTTOM: The multiple layers of water simulations made Adrift one of the most difficult projects that Bennett has ever worked on. (Image courtesy of STX Entertainment)


At the 2016 Academy Awards, Sara Bennett became the second woman ever to win the Oscar for Best Visual Effects, for her contributions to Ex Machina. The career path that led to her standing on the stage of the Dolby Theatre in Hollywood involved a change in profession and becoming a co-founder of a visual effects company.

Bennett is the daughter of a police photographer and a nurse, and she still embraces the family tradition of being an avid reader. “I was born in Worcestershire in the West Midlands and grew up in a small town called Droitwich. I have fond memories, but I couldn’t wait to move to London, and from the age of 14 I started planning my escape! My family were big readers, so I tended to read quite a lot from an early age.

“My brother and I used to stay with our grandparents on weekends,” Bennett recalls, “and on one occasion they rented The Dark Crystal for us. I must have been around 12 years old. I was blown away. That’s when I got my first real taste of inspiration to get into film. Legend was another film that inspired me, and after that I started to think that makeup and prosthetics were the way to go.” Another of Bennett’s personal favorite films is James Cameron’s Aliens. “Everyone always says that Alien is their favorite, but for me it’s the second film, Aliens. I love the final scene with Sigourney Weaver and the alien Mama fighting. The visual effects are fantastic.”

After graduating from what was then a new course on makeup and prosthetics at her local college, the aspiring artist headed to London and found work with a special effects company as a receptionist. It was here that a casual conversation sparked her awareness of, and intrigue about, the emergence of computer effects. “I applied to every VFX and production company in London for a runner’s job,” says Bennett. “A year later someone finally got back to me. From there, I fell in love with compositing. I gradually worked my way up through the ranks at various studios and learned different software, and that’s how I started my visual effects career.”

There were no courses or mentors. “It was hard because you had to learn from everyone around you, and very quickly. I remember being surrounded by all of these fantastic senior compositors and getting frustrated as a roto and prep junior who was desperate to become a compositor. It was a steep learning curve. But I love a challenge, and the stubbornness in me ensured I pushed ahead with it and kept persevering. I learned a great deal of patience and pushed to learn as much as I could.”

A change in corporate mandate resulted in Bennett co-founding Milk VFX in 2013. “Myself and the other Milk co-founders were part of The Mill’s film and TV team. The Mill decided to close that department to concentrate on commercials. There was a small group of us who had worked closely together for six or seven years at that point. We’d joked before about establishing our own company. Then this opportunity came along. We thought if we don’t do it now, we’ll never do it. We took a leap of faith and Milk was born. To have that level of control over your own venture was an amazing feeling. We hired key members of the old Mill TV and film team and we got a couple of key projects on board, including Doctor Who and Sherlock.

“I don’t think anything could have fully prepared us for Adrift. ... [T]his was a whole new level in terms of creating thousands of frames of stormy ocean. Just in terms of the rendering power required, we used Amazon and Google cloud services because of the sheer number of caches from the effects that we had to render. The dramatic opening shot alone was 7,000 frames of water simulations. Then the end shot was 3,000 frames.” —Sara Bennett, Visual Effects Supervisor and Co-founder, Milk VFX

“We did fly by the seat of our pants at the start,” Bennett acknowledges, “because it’s one thing to say that you’re going to set it all up, but doing it in reality is, of course, tougher than it looks. And, as we know, visual effects is a tough business. But what I love is that we started the company based on our ideas and passion for what we do. I headed up the 2D compositing department. We each managed different areas in the business so we never stepped on each other’s toes. Seven years on, I’m proud of what we’ve done with it so far.”

Bennett feels that being a woman in the visual effects industry did not impede her career. “I never found any major issues when I was coming through the ranks. There were always people around who were difficult, but most of the time I had very good people around teaching me. There were only a few women back then as I was coming up, but more so now. We need to keep encouraging women into our industry until we have a more balanced ratio. Other than ensuring women in mid to senior roles in VFX are visible in order to inspire the new talent coming through, one of the things that I believe is super important is mentoring. It was something I never had access to in my early career.”

“We need to keep encouraging women into our industry until we have a more balanced ratio. ... [O]ne of the things that I believe is super important is mentoring. It was something I never had access to in my early career. ... There are so many more powerful women running studios now and that inspires me too.” —Sara Bennett, Visual Effects Supervisor and Co-founder, Milk VFX

“Leading up to the Academy Awards, I didn’t know what to expect,” Bennett says. “After I won the Oscar, I wasn’t prepared for the barrage of people getting in touch. I had so many lovely emails from women saying how inspiring it was to them. They wouldn’t have known about me if I hadn’t had that visibility. It’s so important to have mentors for new talent coming through. There are so many more powerful women running studios now and that inspires me too.”

As to why Ex Machina, an independent film about an android breaking free of her volatile creator, was able to beat Mad Max: Fury Road, Star Wars: The Force Awakens, The Martian and The Revenant, Bennett believes, “This was an incredibly well-written, independent sci-fi film. The design of Ava and the simplicity of it was beautiful. It was quality work. No big setup pieces. It was brilliant storytelling and people responded to it accordingly.”

One of the most technically difficult projects Bennett has tackled was the film Adrift, which required massive water simulations for a stormy ocean with terrifyingly immense waves threatening to capsize a sailboat. “I don’t think anything could have fully prepared us for Adrift,” admits Bennett. “You plan these jobs ahead as much as you can – and I’ve been doing the job for a long time – but this was a whole new level in terms of creating thousands of frames of stormy ocean. Just in terms of the rendering power required, we used Amazon and Google cloud services because of the sheer number of caches from the effects that we had to render. The dramatic opening shot alone was 7,000 frames of water simulations. Then the end shot was 3,000 frames.

“At one point we had a power cut after 24 hours of simulating water and had to restart the sims all over again. I remember the poor effects team members were trying to mesh together two or three different simulations, and the jaw-dropping render times. The cloud enabled us to scale up to tackle that job. A studio of our size wouldn’t have had the capacity to tackle it without the cloud.”

For The Old Guard, a Netflix film about a covert team of immortal mercenaries fighting to keep their identity a secret, Bennett took on the role of Production Visual Effects Supervisor for Netflix, which meant working closely with filmmaker Gina Prince-Bythewood (Love & Basketball) and actress Charlize Theron (Monster). “Charlize was quite an inspiration on the film. She had her two little girls walk around and hand out water to the crew, which was so sweet. I love the John Wick films, so this kind of film is right up my alley. It has two kick-ass leading female actors. I love the fight choreography and the stunts. There is an element of fantasy with them being immortal, but it was never sold as a big visual effects film.”


“There are now about 180 of us [at Milk VFX] working remotely. It’s slightly slower creatively than being in the studio face-to-face, but in other areas it has sometimes proved more productive. We’ve had a lot of meetings over the past few months for jobs that don’t involve travel and huge crowd setups. Clients are exploring ways of shooting scenes in a virtual production setup and relying on visual effects to solve crowd issues within that.” —Sara Bennett, Visual Effects Supervisor and Co-founder, Milk VFX

“When I first met Gina Prince-Bythewood and had the initial meetings, she was keen to shoot as much for real as possible. The immortality [wounds regenerating] had to be visual effects. You just couldn’t do it any other way. It felt nicely contained. When the script was first broken down, we had 450 shots in mind and ended up doing 835 shots. Things change.”

The rise of streaming shows was something that Bennett and Milk VFX were prepared for due to their long-time involvement with high-end television. “From the outset, we focused on long-form TV shows and did smaller pieces of high-end film work. The amount of TV content that is being made now, thanks to the rise of Netflix, Amazon and the other streamers, is great for the industry.”

OPPOSITE TOP: Attending the 2016 Into Film Awards with her Oscar.
OPPOSITE MIDDLE: With Will Cohen, left, Milk VFX Co-founder and CEO, and Jean-Claude Deguara, Milk VFX Co-founder and VFX Supervisor – and her Oscar.
OPPOSITE BOTTOM: Bennett wins the Technicolor Award for Creative Technology at the 2016 Women in Film and TV Awards.
TOP: Clockwise from top: Matthias Schoenaerts (Booker), Charlize Theron (Andy) and Luca Marinelli (Nicky) portray immortal mercenaries in The Old Guard. (Image courtesy of Netflix)


TOP: Milk VFX produced Dalek saucers for the 50th anniversary episode of Doctor Who. (Image courtesy of BBC Studios)
MIDDLE: Ex Machina is a proud achievement for Bennett, who enjoys working with Visual Effects Supervisor Andrew Whitehurst and filmmaker Alex Garland. (Image courtesy of STX Entertainment)
BOTTOM: Immortal mercenary Quynh (Ngo Thanh Van) gets placed inside an iron coffin in a shot produced by Milk VFX for The Old Guard. (Image courtesy of Netflix)

The coronavirus pandemic presented a challenge that the team was able to overcome by adapting to work remotely. “We had a few of our crew working remotely already, but when the pandemic happened we had to quickly adapt and work from home in a safe setup,” Bennett states. “Milk got set up remotely within two weeks in a fairly smooth transition. We have delivered three shows during lockdown. All Netflix projects: The Old Guard, Ben Wheatley’s remake of Rebecca and Cursed, which is an episodic retelling of the Arthurian legend with a twist. There are now about 180 of us working remotely. It’s slightly slower creatively than being in the studio face-to-face, but in other areas it has sometimes proved more productive. We’ve had a lot of meetings over the past few months for jobs that don’t involve travel and huge crowd setups. Clients are exploring ways of shooting scenes in a virtual production setup and relying on visual effects to solve crowd issues within that.”

Looking back on her first film credit, Bennett remarks, “I loved Babe: Pig in the City. That was when I first joined Mill Film in 1998. I was compositing on Flame and working nightshifts roto-ing ducks, pigs and a variety of other farmyard animals. Other big formative projects for me were the Harry Potter films. I worked on them while at Framestore and Mill Film – this was where I made my first jump into compositing.”

Bennett remains an avid reader. “I read a lot. Escapism all the way. Fiction and fantasy. Some of my favorite books are The Famished Road by Ben Okri, Quarantine by Jim Crace and Philip Pullman’s His Dark Materials. It’s how I relax after a long day at work.”



COVER

SOUL’S JOURNEY FROM ‘THE GREAT BEFORE’ TO NEW YORK JAZZ CLUBS By BARBARA ROBERTSON

All images copyright © 2020 Disney/Pixar. TOP: Soul introduces Joe Gardner (voice of Jamie Foxx), a middle-school band teacher who gets the chance of a lifetime to play with Dorothea Williams (voice of Angela Bassett) at the best jazz club in town. BOTTOM: Pete Docter, Writer/Director/Executive Producer and Chief Creative Officer, Pixar Animation. OPPOSITE TOP: Set in fast-paced New York City and the abstract, illusionary world of The Great Before, Soul capitalizes on the contrast between the big city and the cosmic realm.


“How did they do that?” may not be your typical reaction to scenes in an animated feature. But consider this shot with two characters from Disney/Pixar’s latest film Soul: A tall stick figure, a “counselor,” is trying to catch a “soul,” a small, soft three-dimensional character with a big round head and almost no body. The counselor is also 3D, but looks like a partially shaded, partially transparent line drawing with a Picasso-esque face. His nose and mouth point in one direction, his eyes in another. One eye is outside the lines entirely. As the counselor frantically scrambles after the little soul, he grows extra arms for a moment, then stretches one arm w-a-a-ay out, grabs the soul, lifts it up... and drops it. As it falls, the soul loses shape entirely and stretches, eyes growing big, until it lands on another soul, the character Joe, and melts over his head.

Joe Gardner (Jamie Foxx) is a middle-school music teacher in New York, who, on the very day his dream to be a jazz musician is about to come true, steps into an open manhole. Joe’s soul escapes “The Great Beyond” and lands in “The Great Before” where new souls train before birth. There, he meets 22 (Tina Fey), a soul not yet born who looks with skepticism at the Earth. It befalls Joe, a soul who has already lived and wants to live again, to show 22 the promise of life. And thereby hangs the tale – a soul who doesn’t want to live meets a soul who doesn’t want to die.

But what do souls look like and where are they before they are born? “The characters started as design problems and then quickly became technical challenges,” says Soul writer/director/executive producer (and Chief Creative Officer at Pixar) Pete Docter. “In literature and traditionally, people think of souls as ethereal. We wanted to see through them like fog but not be distracted by what was behind. How do we film that? And, that bled into location. Most religions talk about the afterlife. Few talk about before life. It’s one of those things that if you think about it too hard you’ll break your brain.”

The creative team decided the story would put blank-slate souls in a kind of training camp designed with grand abstract pavilions of knowledge. Mentoring and sorting those souls would be counselors like Jerry (Richard Ayoade), the tall line drawing who grabbed 22 and dropped her on Joe’s head.

BODY AND SOUL

“The souls are not physical characters,” says Michael Fong, Visual Effects Supervisor. “They can change shape, elongate and stretch. They have no physical form.” Designers gave the souls faces, but the characters’ ambiguity led them to lean on the technology group to help create foggy volumes that would sustain interest. “We needed to be inspired by technology,” Docter says. “We had weekly reviews. What if we push this way? What if we put a line around the fingers? It was cyclical art that was possible because we could be iterative and adaptable and change things quickly.”

Docter didn’t want the souls to look like the emotions in Inside Out or otherworldly characters depicted in other Pixar films, so the team created souls that catch light and reflect it in an atypical way. “The characters are volumetric, like fog,” Fong says. “But Pete didn’t want the volumes to look like volumes that have used correct math. He wanted pink here, blue there, green in the middle. We didn’t know how to do that.”

Over the past 10 years or so, physically-based lighting and rendering techniques have become standard for look dev, lighting and rendering in most VFX and animation studios. They simplify lighting setups and bring synthetic worlds closer to reality. At Pixar, artists render with the studio’s homegrown RenderMan, light with a toolset built around Katana and composite in Nuke – a typical set of tools for many studios. But these characters weren’t typical.

“We wanted the characters to feel luminous, not emissive,” says Ian Megibben, Director of Photography. “Joy, a character in Inside Out, was a volume, but she was a light source. Soul’s Joe and 22 are also volumes, but they are affected by an external light source. We used volume rendering for a good portion of the soul world and characters. But we didn’t want them to take on local color. We didn’t want the shadows to look dark. And we didn’t want them to come across as a ghost. I remember asking the designers what their expectation was for light entering one end of a volume and coming out the other end. We narrowed down on the theme of prismatic light.”
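In code terms, “prismatic” shading amounts to remapping a lighting term to hue rather than to darkness. Here is a toy sketch of that idea in Python – the ramp endpoints are illustrative guesses at the colors described below, not Pixar’s production values, and this is plain HSV math rather than the studio’s Flow/Katana system:

```python
import colorsys

def prismatic_shade(n_dot_l):
    """Map a lighting term in [-1, 1] to a color: warm hue on the lit side,
    cool hue in 'shadow', with brightness held nearly constant so the
    character reads as glowing rather than darkened."""
    lit = max(0.0, min(1.0, 0.5 * (n_dot_l + 1.0)))  # 0 = fully shadowed, 1 = fully lit
    hue = 0.66 - 0.46 * lit      # sweeps deep blue -> teal -> yellow-green
    saturation = 0.6
    value = 0.9                  # luminance barely moves; only the hue shifts
    return colorsys.hsv_to_rgb(hue, saturation, value)
```

Feeding in the surface-to-light term a conventional shader would use for brightness, `prismatic_shade(1.0)` returns a yellowish green and `prismatic_shade(-1.0)` a deep blue – the warm-to-cool substitution for light and shadow that Megibben describes next.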

BLUE IN GREEN

For reference, the artists looked at opalescent glass, beetle shells, rainbows in the mist – anything that showed light splitting. Then, rather than shadows, they used variations of color to create something that would feel like shadows. “The rainbow was our surrogate for light and shadow,” Megibben says. “Warm colors represented light and cooler colors represented shadow. The characters shift from yellowish green to turquoise or teal to, on the unlit side, deep blue as shadows. On the edges, they have other rainbow colors – magenta, orange, yellow. We call that their ethereal helmet. To preserve the luminance so they felt glowy, we relied on the hue of the color.”

Shading technical directors created a system in Houdini, Katana, Mari and Pixar’s own shading tool called Flow. Working with the shading team, the lighting team devised a method for casting shadows that was independent of light falling off a rounded object. “I relied heavily on techniques from 20 years ago for this project,” Megibben says.




“How do you ensure a mouth that is a translucent volume occludes light? Well, it doesn’t actually do that. At the end of the day, the rules of composition and creating an image still applied, but the techniques and tools were totally different, home grown and home conceived. We had to create our own lighting system. We used old techniques that were new to some of the younger artists. I hope that people in the industry will scratch their heads a little bit and think, ‘I don’t know what they did.’”

As might be imagined, rigs for these volumetric characters were tricky. Facial features were more important than the bodies – their head-to-body ratio was almost one-to-one and sometimes their legs would disappear. 22 has fingers only when she needs them; otherwise she has “mittens.” Sometimes there are lines around her fingers. “It put a strain on our workflow,” Docter says. “In some cases, we could push the rig to the limit of what it could do and then go in and try to fix things. But we also had to build stunt versions of the characters.” Compositing techniques helped the final volumetric characters completely break from the physical world.

ALL OF YOU

Breaking from the physical world was only half of it when it came to the counselors, which can completely break apart.


“I usually don’t freak out when the art folks come up with ideas for characters, but I started to panic when they came up with the counselor characters,” Fong says. “They look different from different angles, maybe not even like a character. They’re doing their best to look humanoid, but don’t quite get it right. They can pop out arms and faces wherever they want. Their faces break, mouths break off. And we had to fit membranes between the lines. They’re not just planar.”

The design looks simple, but it’s deceptively difficult to create shape-changing, three-dimensional characters formed using lines and a membrane that appear to be two-dimensional. “We had to break down the situations they would be in,” Fong says. “Some turn into buildings or are buildings. The arms could go up and down, but the shoulder joint could be anywhere in the wire frame. We had to build and rebuild the rig five times. We cheat by flattening [the 3D models] down in screen space, but that didn’t give us a viable membrane. So we build the membrane on the fly per frame. We had rules establishing where the membrane needed to be more solid and where it needed to be transparent. We’d never seen characters like this before. There was a lot of handwork.”
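The flatten-and-rebuild cheat Fong describes can be sketched with a bare pinhole camera: project the wire points into screen space for the frame, build the 2D membrane there, then push it back out to its original depths so it can be lit and rendered in the scene. A minimal illustration, assuming a simple `world_to_cam` matrix and `focal` length rather than Pixar’s actual camera model:

```python
import numpy as np

def flatten_to_screen(points_3d, world_to_cam, focal):
    """Project the character's 3D wire points into 2D screen space
    for the current frame (bare pinhole projection)."""
    pts = np.asarray(points_3d, dtype=float)
    cam = np.c_[pts, np.ones(len(pts))] @ world_to_cam.T  # world -> camera space
    depth = cam[:, 2]
    screen = focal * cam[:, :2] / depth[:, None]
    return screen, depth

def lift_to_scene(screen_pts, depth, world_to_cam, focal):
    """Return flattened membrane points to 3D world space at their original
    depths, so the per-frame membrane exists in the scene as geometry."""
    sp = np.asarray(screen_pts, dtype=float)
    cam = np.c_[sp * depth[:, None] / focal, depth, np.ones(len(sp))]
    return (cam @ np.linalg.inv(world_to_cam).T)[:, :3]
```

The per-frame rebuild is the key point of the quote: since the membrane is reconstructed from scratch each frame, arms and faces can pop in and out without the 2D fill ever having to deform continuously.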

Concept artists, riggers and animators worked simultaneously for a year and a half to figure out what the counselors were and to develop systems that would work. Lighting artists created rules for what key light means on characters that appear to be flat, and shading artists created complex systems in Houdini that determined when to color inside the lines. Sometimes you can see through the characters; sometimes the lines are filled.

“This film is about me re-examining the worth of it all, about what are we doing with our time, the reason for doing it at all. Those are the things we’re exploring. What are we waking up for every day? Let’s examine what this is leading to.” —Pete Docter, Writer/Director/Executive Producer

AIN’T MISBEHAVIN’

“Creating a system that kept everything consistent was fascinating,” says Jonathan Hoffman, Character Shading Technical Director. “I would dream about it at night. There were all these constraints: If a counselor swept a left arm in front of another arm, anything within would disappear. It was like a Venn diagram.”

Animators had the freedom to move the counselors as they wished. A counselor could become a staircase that it walked up. It could suddenly have multiple arms. An arm could cross under an armpit. The shading system needed to know what to render.

OPPOSITE TOP: A single, unexpected step sends Joe to the cosmic realm where he finds the “You Seminar” – a fantastical place where Joe is forced to ponder the meaning of soul. OPPOSITE BOTTOM: One small misstep takes Joe from the streets of New York City to The Great Before. TOP: Joe finds himself in The Great Before, where new souls get their personalities, quirks and interests before they go to Earth. There he meets the ubiquitous Counselors who run the “You Seminar” and precocious soul 22 (Tina Fey), who has never understood the appeal of the human experience. BOTTOM: When Joe gets lost in his music, he goes into “the zone,” an immersive state that causes the rest of the world to melt away.


“The souls are not physical characters. They can change shape, elongate and stretch. They have no physical form.”
—Michael Fong, Visual Effects Supervisor

“Imagine taking a pen and drawing it across the character,” Hoffman says. “You count the number of times it touches a line and remove the even lines. If it hits one, that’s positive, keep that. Hit another line, that’s two so cut that. That’s how we’d march a ray across a two-dimensional shape. But the character has lines going down the middle that break the simple logic. The right side and left weren’t always right and left because sometimes the character would twist. Then the membrane had to exist within the scene as a 3D shape. We try to render everything in a single frame. So after we flatten the characters, we return them to 3D space so the membrane can exist there.”
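Hoffman’s pen-marching description is the classic even-odd (parity) fill rule. A minimal sketch of that rule on a flattened outline – the segment representation here is a generic illustration, not Pixar’s Houdini implementation:

```python
def ray_crossings(px, py, segments):
    """Count how many drawn line segments a horizontal ray from (px, py)
    toward +x crosses. Segments are ((x1, y1), (x2, y2)) pairs in the
    flattened screen space."""
    hits = 0
    for (x1, y1), (x2, y2) in segments:
        if (y1 > py) != (y2 > py):           # segment straddles the ray's height
            t = (py - y1) / (y2 - y1)        # where it crosses y = py
            if x1 + t * (x2 - x1) > px:      # crossing lies ahead of the point
                hits += 1
    return hits

def membrane_filled(px, py, segments):
    """Even-odd rule: an odd crossing count means the point lies inside
    the character's lines, so the membrane is filled there."""
    return ray_crossings(px, py, segments) % 2 == 1
```

The midline and twisting cases Hoffman mentions are exactly where this simple parity breaks down – a line through the middle of a shape flips the count without enclosing anything – which is why the production system needed extra rules and handwork.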

SPACE IS THE PLACE

The souls and counselors exist in The Great Before. Creating the non-physical environment they wander around in became another design adventure. “We wanted a place that felt soft, that had a sense there were no boundaries,” Docter says. “It extends into an undefined horizon. A lot of the shapes and choices we made were trying to reinforce the idea that this place inspires knowledge. The souls are blank slates, but when they come to Earth they have a personality. So we have pavilions like in ancient Greece and in world’s fairs.”

Motif, an in-house tool for set dressing, provided real-time feedback to artists trying to visualize the ideas in Docter’s and the art director’s minds. “We could build fields of grass and pavilions and get feedback in reviews, and then make changes in real-time,” Fong says. “The grass might not be wispy enough or too opaque, or need to be taller and longer. We could change that on the screen.”

NEW YORK

Contrasting with The Great Before is Earth – specifically, New York City, the place Joe wants to return to and 22 wants to avoid.


To Joe – and to the film – The Great Before is ethereal. New York is jazz. “I grew up in a musical family,” Docter says. “My parents were musicians and teachers and my two sisters are professional musicians. But instead of listening to classical music, I listened to ’30s and ’40s jazz. As we were looking for a main character who would be born with a passion, I thought about a lot of occupations, even animator. But it seems like a musician, especially a jazz musician – there’s a nobility without selfishness. It’s not like someone is going to get rich being a jazz musician. You do it because you have a passion. That jazz is being written as it’s played spoke directly to the theme of the film. You’re given a tune. What do you do with it?”

To create the New York that a Black jazz piano player like Joe Gardner would know, Docter relied on a “culture team” drawn from, among others, African-American artists at Pixar, jazz musicians in New York and co-writer Kemp Powers, who became a co-director. Powers is the same age as Joe, is a jazz musician, grew up in New York, and he’s Black. “I had a lot to learn,” Docter says. “The peek into the America that’s right next to us was eye-opening, not only the culture but the things African-Americans have brought to America and their experience of being marginalized. There was a lot I didn’t know that I didn’t know. We found musicians and teachers in New York and asked them, ‘What’s your lifestyle? What posters are hanging on your wall?’ Initially, I thought 90% of the film would be in The Great Before. As it turned out, most of the film focuses on New York.”

Just as jazz influenced the theme of the film, it also infused the style of New York, the detailed sets and environments such as the West Village, Queens, a barber shop, a tailor shop, the Half-Note jazz club, the middle school classroom. “When you look at a fence, you can see syncopation in the rhythm of the slats,” Hoffman says. “We wanted no straight lines. No repeating patterns. Long, elegant, exaggerated shapes, even in Joe’s design.” The set designers looked at 101 Dalmatians as they considered giving Joe’s New York an improvisational style.

“Our sets team loved working on New York and put a lot of time into making sure everything had enough detail,” Fong says. “We wanted to see if we could combine the real world with something almost impressionistic – rather than having cobbles or bricks everywhere, we can suggest it.”

TAKE FIVE

But when it came to the musicians, the crew wanted to make sure that whether in Joe’s classroom or the jazz club, the musicians were really playing music. “We set up about eight million cameras all around the musicians,” Docter says. “We have close-ups of their fingers, close-ups of details all around.” Then, with help from Disney, they converted the recorded music so it could be analyzed electronically.

“We ran that music into the digital instruments,” Fong says. “You can see the keys light up on the virtual piano. Animators can know which key is pressed at what time for how long. We would sync that with the video so they could see the way the musicians use their fingers. If you watch Joe’s hands you will see how he lifts his fingers off and presses with the sides of his fingers.”
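Pixar’s hookup into its animation system is proprietary, but the MIDI side of a pipeline like the one Fong describes is easy to sketch with the open-source mido library: walk the file and emit, for each key, when it goes down and for how long – the per-key timing an animator would sync to.

```python
import mido

def note_timeline(midi_path):
    """Return (note, start_seconds, duration_seconds) events so an animator
    can see which key is pressed, when, and for how long."""
    events, down, clock = [], {}, 0.0
    for msg in mido.MidiFile(midi_path):   # iterating yields delta times in seconds
        clock += msg.time
        if msg.type == 'note_on' and msg.velocity > 0:
            down[msg.note] = clock         # key pressed
        elif msg.type in ('note_off', 'note_on') and msg.note in down:
            # note_on with velocity 0 is also a release
            events.append((msg.note, down[msg.note], clock - down.pop(msg.note)))
    return sorted(events, key=lambda e: e[1])
```

From there, each event can be lined up against the reference video’s timecode, which is the sync Fong describes.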

“We needed to be inspired by technology. We had weekly reviews. What if we push this way? What if we put a line around the fingers? It was cyclical art that was possible because we could be iterative and adaptable and change things quickly.” —Pete Docter, Writer/Director/Executive Producer

OPPOSITE TOP: Concept art by Steve Pilcher explores the ethereal landscapes, cosmic dimensions and spectacular vistas of The Great Before. TOP AND BOTTOM: Concept art by Kyle MacNaughton and Camilo Castro conveys the unique urban ambiance of a city jazz club and the rich detail of New York City street life.


“At the end of the day, the rules of composition and creating an image still applied, but the techniques and tools were totally different, home grown and home conceived. We had to create our own lighting system. We used old techniques that were new to some of the younger artists. I hope that people in the industry will scratch their heads a little bit and think, ‘I don’t know what they did.’”
—Ian Megibben, Director of Photography

They also hooked a MIDI device into a keyboard so they could port the music into Pixar’s animation system. “The animators probably cheated some things, but they created fantastic authenticity,” Docter says.

MILESTONES

“Creating a system that kept everything consistent was fascinating. I would dream about it at night. There were all these constraints: If a counselor swept a left arm in front of another arm, anything within would disappear. It was like a Venn diagram.”
—Jonathan Hoffman, Character Shading Technical Director

TOP AND MIDDLE: Concept art by Celine You and Maria Yi captures the wide variety of souls and “counselors” populating The Great Before in Soul. BOTTOM: Concept art by Dave Strick of Joe and soul 22 walking through a gleaming futuristic cityscape of transparent geometric objects and symbols in The Great Before.

Soul is the third feature film Pete Docter has directed, and he has received major honors for his previous films: Oscar, Annie and BAFTA awards for Inside Out, plus nominations for Best Original Screenplay from all three organizations; Oscar, Annie and BAFTA awards for Up, along with nominations for Best Original Screenplay from the three organizations and a VES Award for outstanding animation; and an Oscar nomination and a BAFTA Children’s Award for Monsters, Inc., the first film he directed. He also has an Annie Award for writing Toy Story 2, an Oscar nomination for writing Toy Story, and an Annie for Toy Story animation.

So for Docter to consider whether what he has been doing is worthwhile comes as a surprise. But that was indeed the motivation for this film. “I bet a lot of people in visual effects are like that, too, in that you go into a line of work that appeals to you, you become passionate about what you’re doing and it becomes your world,” he says. “This film is about me re-examining the worth of it all, about what are we doing with our time, the reason for doing it at all. Those are the things we’re exploring. What are we waking up for every day? Let’s examine what this is leading to.”

Those questions could have been explored – and have been explored – in live-action films. But an animated film makes it possible to do so in unique, creative and visually stunning ways. A Great Before filled with blank-slate souls. Who else could have imagined such a place and had the technology and artistic experience to put it on “film”?

“One of the joys of working here at Pixar is to have this amazing bunch of artists bring this stuff to life,” Docter says. “They allow us to open deep doors.”



VFX TRENDS

VIRTUAL PRODUCTION TAKES A BIG STEP FORWARD By TREVOR HOGG

TOP: The Mandalorian is viewed as a major breakthrough for virtual production with many other projects trying to emulate the success of the series. (Image courtesy of Disney) OPPOSITE BOTTOM: LED walls were used for subtle environmental effects in Westworld. (Image courtesy of HBO)

Traditionally, movies and television shows have been divided into three stages: pre-production, production and post-production. However, the lines are blurring with the advancements in virtual production. A great deal of interest was generated by what Jon Favreau was able to achieve using the technology to produce The Mandalorian. Interest turned into necessity when the coronavirus pandemic restricted the ability to shoot global locations. If you cannot go out into the world, then the next best thing is to create a photorealistic, computer-generated environment that can be adjusted in real-time. Will virtual production be a game-changer with lasting impact? Industry leaders weigh in.

Scott Chambliss, Production Designer, Guardians of the Galaxy, Vol. 2

“One of the best qualities of our medium is its essential plasticity. By replacing traditional bluescreen/greenscreen tech with LED display walls, a stage working environment is dramatically enhanced by its chief gifts: interactive practical lighting and directly representative motion picture backgrounds that the screens display in-camera for shooting purposes. For sci-fi and fantasy projects these advances are major and practical additions. Now, for the first time and as a matter of course, designers will be directly participating in the final digital completion of their own work on a project. This alone is a significant and long overdue change.

“Virtual production tech isn’t appropriate or practical for every genre. If a project is set in the present day and requires direct interaction with a story’s environment as a Mission: Impossible or Bond movie would, the tech offers little benefit. Medium to small-budget projects will also have difficulty accessing the tech as it comes with a healthy price tag, both for the stage setup itself and the lead time required to create digital assets for playback. Neither is the tech a magic bullet that will remedy layers of problems we face going back into production in the age of COVID. Nevertheless, The Mandalorian-style virtual production is a visual quality-enhancing gift to top tier-budgeted projects that traditionally require a load of bluescreen/greenscreen stage environments. I don’t imagine many on a production team would be sad to lose those grating and problematic chromakey drops once and for all.”

Scott Meadows, Head of Visualization and Virtual Production, Digital Domain

“We recently had a client in the middle of reshoots when COVID hit, forcing us to rethink how best to proceed. We previously completed the previs for this sequence, so we already had a virtual environment that could easily be integrated into Unreal Engine. We had several props and CG characters, and our team put together some blocking animation that we added to Unreal Engine. Within a day, we had everything we needed for the filmmakers to do whatever they wanted within the scene. For the actual shoot there were only seven people present, with the director, editor, VFX Supervisor and Animation Supervisor all calling in remotely. We broadcast the camera operator feeds to them, so they saw the virtual camera shots in real-time. Afterwards, we scheduled follow-up shoots with someone in a mocap suit to provide a more nuanced performance.

“Filmmakers love to iterate, though, so we’ve modified our pipeline to allow them to do that throughout, and even after, virtual production. We are going to spend more time in the game engines long-term. Right now, Digital Domain is one of only a few studios that have both the pipeline and the staff to manage everything correctly [and efficiently], plus we also have the advantage of a full motion library created over time on our mocap stage. The goal with all of this is to design sequences that flow well and are visually compelling. Real-time is great for this, and it gives you the feeling of being on set and/or finding a shot on the day. As virtual production becomes more accessible and in-demand, it won’t be treated as a separate part of the filmmaking process, but another creative tool we use to produce great work.”

Sam Nicholson, CEO and Founder, Stargate Studios

“Virtual production is the new Wild West of the film business, where the worlds of game developers and film producers are merging. From photoreal avatars to flawless virtual sets and extensive Unreal worlds, the global production community has embraced the amazing potential of virtual production as a solution to many of the production challenges facing us during the current global pandemic.




“The convergence of volumetric imaging, spatial tracking, scalable LED displays and extensive real-time visualization tools, such as Unreal Engine 5, is increasingly enabling a new ‘Virtual Era’ of production on set. Simultaneously, traditional visual effects post-production workflows are accelerating towards real-time with advances in computational processing, cloud rendering and integrated remote workflows which enable complex hybrid combinations of new entertainment formats. Physical reality, sampled reality and synthesized reality can now be seamlessly blended by creative artists and skilled technicians into multiple, complex entertainment experiences – increasingly produced and experienced in real-time.

“Individual, highly-specialized production teams will become increasingly decentralized, yet remain hyper-connected on digital platforms. These new strategic partnerships and virtual production communities will in turn share knowledge, integrated assets and unlimited horsepower, enabling virtual productions to collaborate on a global scale unimaginable by today’s standards.”

“Throughout 2020, we also saw the emergence of a different type of democratization powered by real-time productions – the democratization of location.”
—Adam Myhill, Creative Director, Labs, Unity Technologies

Adam Myhill, Creative Director, Labs, Unity Technologies

“Now that the industry understands how real-time technologies can best benefit a production, the next step is making them accessible to every level of filmmaker. There’s no reason that a student making his first film shouldn’t be able to benefit from the same kinds of real-time production tools as Jon Favreau and Steven Spielberg. Throughout 2020, we also saw the emergence of a different type of democratization powered by real-time productions – the democratization of location.

“The pandemic we’re living through has had a massive impact on countless industries, and film was no exception. Not only were studios forced to pivot to new forms of distribution to reach their audiences, but the world of on-set production ground to an absolute halt. But for some studios, real-time production has helped members of their teams get back to work. In other cases, real-time production is helping to power the transition to animation to help keep content in production. Instead of the hours per frame that traditional methods of animation take, real-time rendering in an animation pipeline occurs at just that – real-time. While real-time production wasn’t designed to help keep Hollywood studios working during a pandemic, it has definitely helped to take some of the pressure off of productions that would otherwise be stranded.”

Paul Cameron, Cinematographer and Director, Westworld

“The volumes are great for The Mandalorian, landscapes and putting a massive cargo ship in space multiplied by 100 ships, but you can’t put sets on a 360-degree LED volume. We will be testing big LED volume-type applications for Westworld this year. Is it all based on COVID? No. It’s based on what we did last season and coming to the realization that we want to work in a combination of set and LED volume walls. Everybody is looking at people like us to say, ‘Shoot it all in a LED environment. You’re fine.’ But that’s not the case. The inherent problem is that we’re looking at an industry that relies on a lot of people going to different locations or stages or both. Now with COVID a big panic button got hit. How do you stay at home and do it? We’ve got to get back onto the plane with the same groups of people to produce the same imagery and content.


“I hope that virtual production and aspects of it remain similar to what they are now. What I don’t want to see are discussions of how to capture images with new LiDAR-type technology or lens-less cameras recording environments so that it could all be determined later on how the environments are going to be used; that has been a moving target in the virtual production world anyway. The important thing short-term is to simplify stories, cast the number of people you need and find the locations required to tell a story. Let’s just get the content going, and once the machine is moving then determine how virtual production can be improved for production across the board, instead of making it the driving force.”

Alex McDowell, Co-Founder, Creative Director, Experimental Design

“Someone like my son can play [a video game] for 72 hours in the same environment and is in complete control of his character’s costume, behavior and ability to move; he is standing at the center of stage at all times. The essence of filmmaking is that the audience is passive and immersing themselves in somebody else’s world. I don’t think there’s a possibility of them becoming similar, nor would you want to. On the other hand, as a designer I think absolutely. Finally, the film industry understands that game engines could be part of pre-production as well as post-production. The only design difference for video games is that you have to create infinitely large worlds for years of gameplay.

“When I worked at the beginning of Star Wars: Episode IX in London with the original director, Colin Trevorrow, ILM was part of the art department. Our virtual production process was rough-designing sets before illustration so you’re blocking environments relative to scale and geo location, pushing that wireframe into Maya and the game engine, putting VR goggles on the director, establishing keyframes from the director’s scout of the set, and feeding that information back to the artists in the art department who create high-resolution images that are constantly pushed back into VR for location scouting. The end result is that you have accelerated the project by eight to 12 months by having the director live inside the environment from day one.”

David Morin, Head, Epic Games Los Angeles Lab

“The big thing that happened recently is the ability to do photoreal rendering in real-time; that has changed the art department for those who have chosen to embrace the technology. For the glass house scene in John Wick: Chapter 3 – Parabellum, the reflections were influencing the set design. The normal CAD package was imported into Unreal Engine, which has the ability to do ray tracing and produce accurate reflections. The camera department could see where the reflection of the camera was going to show in that virtual twin of the set and have the design modified. You can also use it later for re-shoots if the set has been dismantled. It brings filmmakers back to a workflow that has the immediacy of physical filmmaking.

“The benefit of putting Unreal on the LED wall is that the parallax of the world changes according to the position of the camera. It’s not going to look like a flat wall. That’s a game-changer in the way you can shoot something. The biggest opportunity for the visual effects industry is for it to move back from post-production into production and pre-production so there is this collaborative feedback loop. Between 40% and 50% of shots captured on set for The Mandalorian were final, while the remainder were traditional visual effects. This won’t replace all of the other ways of doing things but adds a new way of working that brings a lot of benefits with it.”
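The camera-dependent parallax Morin describes comes from re-deriving an off-axis (generalized perspective) projection every frame from the tracked camera’s position relative to the wall plane. A minimal sketch of that frustum math – the wall-corner inputs and function name are illustrative, and engines such as Unreal handle this internally on LED stages:

```python
import numpy as np

def wall_frustum(cam_pos, wall_ll, wall_lr, wall_ul, near, far):
    """Off-axis frustum so pixels rendered to an LED wall line up with a
    tracked camera. wall_ll/lr/ul: the wall's lower-left, lower-right and
    upper-left corners in world space; cam_pos: tracked camera position."""
    right, up = wall_lr - wall_ll, wall_ul - wall_ll
    r = right / np.linalg.norm(right)
    u = up / np.linalg.norm(up)
    n = np.cross(r, u)                       # wall normal, toward the camera
    va = wall_ll - cam_pos
    d = -np.dot(va, n)                       # camera's distance to the wall plane
    # Frustum extents, scaled from the wall plane onto the near plane
    left = np.dot(r, va) * near / d
    bottom = np.dot(u, va) * near / d
    right_ext = left + np.linalg.norm(right) * near / d
    top = bottom + np.linalg.norm(up) * near / d
    # These six values feed a standard glFrustum-style projection matrix;
    # the view matrix is additionally aligned to the wall's (r, u, n) basis.
    return left, right_ext, bottom, top, near, far
```

Because `cam_pos` changes every frame, the imagery on the wall shifts exactly as a window onto the virtual world would – the parallax that makes the screen read as depth rather than as a flat backdrop.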

OPPOSITE TOP: The near-seamless on-set integration of LED walls is illustrated in a behind-the-scenes still of The Mandalorian. (Image courtesy of Disney) OPPOSITE MIDDLE: Unreal Engine was indispensable when visualizing and executing the glass house fight in John Wick: Chapter 3 – Parabellum. (Images courtesy of Lionsgate, Epic Games and Alex Nice) OPPOSITE BOTTOM: LED screens were placed around the glass house set in John Wick: Chapter 3 – Parabellum that projected surreal content made in Unreal Engine. (Image courtesy of Lionsgate, Epic Games and Alex Nice) TOP: Breakout star and practical puppet Baby Yoda is surrounded by an LED wall environment created with Unreal Engine in The Mandalorian. (Image courtesy of Disney)

WINTER 2021 VFXVOICE.COM • 45

12/16/20 1:52 PM


such as Unreal Engine 5, are increasingly enabling a new ‘Virtual Era’ of production on set. Simultaneously, traditional visual effects post-production workflows are accelerating towards real-time with advances in computational processing, cloud rendering and integrated remote workflows, which enable complex hybrid combinations of new entertainment formats. Physical reality, sampled reality and synthesized reality can now be seamlessly blended by creative artists and skilled technicians into multiple, complex entertainment experiences – increasingly produced and experienced in real-time.

“Individual, highly-specialized production teams will become increasingly decentralized, yet remain hyper-connected on digital platforms. These new strategic partnerships and virtual production communities will in turn share knowledge, integrated assets and unlimited horsepower, enabling virtual productions to collaborate on a global scale unimaginable by today’s standards.”

Adam Myhill, Creative Director, Labs, Unity Technologies

“Now that the industry understands how real-time technologies can best benefit a production, the next step is making them accessible to every level of filmmaker. There’s no reason that a student making his first film shouldn’t be able to benefit from the same kinds of real-time production tools as Jon Favreau and Steven Spielberg. Throughout 2020, we also saw the emergence of a different type of democratization powered by real-time productions – the democratization of location.

“The pandemic we’re living through has had a massive impact on countless industries, and film was no exception. Not only were studios forced to pivot to new forms of distribution to reach their audiences, but the world of on-set production ground to an absolute halt. But for some studios, real-time production has helped members of their teams get back to work. In other cases, real-time production is helping to power the transition to animation to help keep content in production. Instead of the hours per frame that traditional methods of animation take, real-time rendering in an animation pipeline occurs at just that – real-time. While real-time production wasn’t designed to help keep Hollywood studios working during a pandemic, it has definitely helped to take some of the pressure off of productions that would otherwise be stranded.”

Paul Cameron, Cinematographer and Director, Westworld

“The volumes are great for The Mandalorian, landscapes and putting a massive cargo ship in space multiplied by 100 ships, but you can’t put sets on a 360-degree LED volume. We will be testing big LED volume-type applications for Westworld this year. Is it all based on COVID? No. It’s based on what we did last season and coming to the realization that we want to work in a combination of set and LED volume walls. Everybody is looking at people like us to say, ‘Shoot it all in a LED environment. You’re fine.’ But that’s not the case. The inherent problem is that we’re looking at an industry that relies on a lot of people going to different locations or stages or both. Now with COVID a big panic button got hit. How do you stay at home and do it? We’ve got to get back onto the plane with the same groups of people to produce the same imagery and content.


“I hope that virtual production and aspects of it remain similar to what they are now. What I don’t want to see are discussions of how to capture images with new LiDAR-type technology of lens-less cameras recording environments so that it could all be determined later on how the environments are going to be used; that has been a moving target in the virtual production world anyway. The important thing short-term is to simplify stories, cast the number of people you need and find the locations required to tell a story. Let’s just get the content going, and once the machine is moving then determine how virtual production can be improved for production across the board, instead of making it the driving force.”

Alex McDowell, Co-Founder, Creative Director, Experimental Design

“Someone like my son can play [a video game] for 72 hours in the same environment and is in complete control of his character’s costume, behavior and ability to move; he is standing at the center of the stage at all times. The essence of filmmaking is that the audience is passive, immersing themselves in somebody else’s world. I don’t think it’s a possibility of them becoming similar, nor would you want to. On the other hand, as a designer, I think absolutely. Finally, the film industry understands that game engines could be part of pre-production as well as post-production. The only design difference for video games is that you have to create infinitely large worlds for years of gameplay.

“When I worked at the beginning of Star Wars: Episode IX in London with the original director, Colin Trevorrow, ILM was part of the art department. Our virtual production process was rough-designing sets before illustration so you’re blocking environments relative to scale and geo location, pushing that wireframe into Maya and the game engine, putting VR goggles on the director, establishing keyframes from the director’s scout of the set, and feeding that information back to the artists in the art department, who create high-resolution images that are constantly pushed back into VR for location scouting. The end result is that you have accelerated the project by eight to 12 months by having the director live inside the environment from day one.”

David Morin, Head, Epic Games Los Angeles Lab

“The big thing that happened recently is the ability to do photoreal rendering in real-time; that has changed the art department for those who have chosen to embrace the technology. For the glass house scene in John Wick: Chapter 3 – Parabellum, the reflections were influencing the set design. The normal CAD package was imported into Unreal Engine, which has the ability to do ray tracing and produce accurate reflections. The camera department could see where the reflection of the camera was going to show in that virtual twin of the set and have the design modified. You can also use it later for re-shoots if the set has been dismantled. It brings filmmakers back to a workflow that has the immediacy of physical filmmaking.

“The benefit of putting Unreal on the LED wall is that the parallax of the world changes according to the position of the camera. It’s not going to look like a flat wall. That’s a game-changer in the way you can shoot something. The biggest opportunity for the visual effects industry is for it to move back from post-production into production and pre-production so there is this collaborative feedback loop. Between 40% and 50% of shots captured on set for The Mandalorian were final, while the remainder were traditional visual effects. This won’t replace all of the other ways of doing things but adds a new way of working that brings a lot of benefits with it.”
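Morin’s parallax point has a concrete geometric core. The sketch below – an illustration of the standard ‘generalized perspective’ off-axis projection used for camera-tracked screens, not Epic’s or ILM’s production code – shows how the tracked camera position and the physical corners of an LED panel define a frustum that shifts as the camera moves, which is what keeps the wall from reading as flat.

```python
import numpy as np

def led_wall_frustum(pa, pb, pc, pe, near=0.1, far=1000.0):
    """Off-axis projection for one LED panel, given the tracked camera
    position pe and the panel's lower-left (pa), lower-right (pb) and
    upper-left (pc) corners in world space. (The accompanying view
    matrix, not shown, would orient the render camera to the wall and
    place it at pe.)"""
    vr = (pb - pa) / np.linalg.norm(pb - pa)   # screen right axis
    vu = (pc - pa) / np.linalg.norm(pc - pa)   # screen up axis
    vn = np.cross(vr, vu)                      # screen normal, toward viewer
    vn /= np.linalg.norm(vn)

    va, vb, vc = pa - pe, pb - pe, pc - pe     # eye-to-corner vectors
    d = -np.dot(va, vn)                        # eye-to-screen distance

    # Frustum extents on the near plane (standard glFrustum parameters).
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    return np.array([
        [2*near/(r-l), 0,            (r+l)/(r-l),            0],
        [0,            2*near/(t-b), (t+b)/(t-b),            0],
        [0,            0,            -(far+near)/(far-near), -2*far*near/(far-near)],
        [0,            0,            -1,                     0],
    ])

# A 6m x 3m wall; the camera is 4m back and 1m right of center, so the
# frustum is asymmetric and the rendered parallax shifts accordingly.
corners = [np.array([-3.0, 0.0, 0.0]), np.array([3.0, 0.0, 0.0]),
           np.array([-3.0, 3.0, 0.0])]
print(led_wall_frustum(*corners, pe=np.array([1.0, 1.5, 4.0])))
```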

OPPOSITE TOP: The near-seamless on-set integration of LED walls is illustrated in a behind-the-scenes still of The Mandalorian. (Image courtesy of Disney) OPPOSITE MIDDLE: Unreal Engine was indispensable when visualizing and executing the glass house fight in John Wick: Chapter 3 – Parabellum. (Images courtesy of Lionsgate, Epic Games and Alex Nice) OPPOSITE BOTTOM: LED screens were placed around the glass house set in John Wick: Chapter 3 – Parabellum that projected surreal content made in Unreal Engine. (Image courtesy of Lionsgate, Epic Games and Alex Nice) TOP: Breakout star and practical puppet Baby Yoda is surrounded by an LED wall environment created with Unreal Engine in The Mandalorian. (Image courtesy of Disney)




“As virtual production becomes more accessible and in-demand, it won’t be treated as a separate part of the filmmaking process, but another creative tool we use to produce great work.”
—Scott Meadows, Head of Visualization and Virtual Production, Digital Domain

Christopher Nichols, Director, Chaos Group Labs

“It will be great when virtual production becomes production. There are a couple of things that need to happen. If what comes out of shooting virtual production looks close to what it is going to look like in the final frame, then the director and DP will be able to see and change things, as opposed to the old days when they were looking at rudimentary grey-shaded geometry. Now they can say, ‘I want a key light here.’ Those decisions are being made on set as opposed to passing it down to some visual effects artists to make that decision later and guessing what the intent was – that is starting to occur now with some of the stuff happening with game engines. The next step beyond that is to have real physical lighting in virtual production. The only way that is actually going to happen is with ray tracing. Once you start doing full ray tracing in real-time, with real global illumination and lighting coming out of real things, then it is going to get closer to looking like the final frame. At that point you might be able to shoot things live, and that is going to happen. Maybe not this year, but very soon.”
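The ‘real physical lighting’ Nichols describes bottoms out in straightforward radiometry. As a minimal, hypothetical illustration (not Chaos Group code), the function below evaluates the direct irradiance from a point-source key light – the quantity a ray-traced preview estimates, plus bounce light, when someone asks for ‘a key light here’ on a virtual set.

```python
import numpy as np

def key_light_irradiance(point, normal, light_pos, intensity):
    """Direct irradiance at a surface point from a point-source key light:
    Lambert's cosine law with inverse-square falloff."""
    to_light = light_pos - point
    dist2 = np.dot(to_light, to_light)        # squared distance to the light
    l = to_light / np.sqrt(dist2)             # unit direction to the light
    cos_theta = max(0.0, np.dot(normal, l))   # facing ratio, clamped at zero
    return intensity * cos_theta / dist2

# Moving the key twice as far away cuts the irradiance to a quarter.
p, n = np.zeros(3), np.array([0.0, 0.0, 1.0])
print(key_light_irradiance(p, n, np.array([0.0, 0.0, 2.0]), 100.0))  # 25.0
print(key_light_irradiance(p, n, np.array([0.0, 0.0, 4.0]), 100.0))  # 6.25
```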

TOP: Intel Studios’ 10,000-square-foot geodesic dome in Los Angeles is the world’s largest immersive media hub. Each of the studio’s 96 high-resolution 5K cameras captures action in two dimensions, and algorithms convert those trillions of pixels into a 360-degree, 3D virtual environment. (Photo: Tim Herman. Courtesy of Intel Corporation) MIDDLE: A scene from a trilogy of short films called Baymax Dreams produced using the Unity game engine. (Photo courtesy of Disney and Unity Technologies) BOTTOM: A virtual production methodology was developed by Stargate Studios and implemented for the HBO series Run. (Image courtesy of Stargate Studios)


Nic Hatch, CEO, Ncam

“The journey that we’re all on now with virtual production is how do we make it easy, achievable, affordable, scalable and repeatable? Part of that is the rendering engine inside of it. You can see the effort that Epic is putting into this space, which is phenomenal, and we work closely with them. But there are other bits of technology required to make virtual production a reality for everyone. What is required for the future, and will open up a huge amount of possibilities, is depth per pixel in real-time. When the technology does not just understand the X and Y coordinates but the Z coordinate as well, then things become a lot more flexible. You can start to automate various processes that today are simply manual.

“As a filmmaker, we shouldn’t care whether it’s real or not when looking through the lens. It just looks real. The same [is true] with visual effects if that’s what you’re trying to do. I don’t think this means post-production goes away. It means that the heavy lifting and planning are done upfront. If you want to make changes afterwards you can certainly do that. There is no degradation in the data. Ultimately, what post-production will turn into will be tweaking, not design; this is a good thing because that is where it can go quite wrong, especially with re-shoots.”

Ben Grossmann, Co-Founder and CEO, Magnopus

“Everyone’s been salivating over Unreal Engine 5, and if that hits in 2021, it’s going to be a madhouse.


“The shortage of skilled real-time artists and engineers has been a major point of friction for filmmakers the last couple of years. There are just more shows that need real-time artists than there are in the market. Fortunately, there’s a huge investment being made to train skilled labor, like the Unreal Fellowship.

“LED walls will grow to be a critical part of filmmaking, but they can’t do everything; filmmakers still haven’t adjusted to the reality that they need to build great assets before shooting on an LED wall. Mobile devices will be shipping with LiDAR scanners. Filmmakers can start collecting the parts of the world they want to use, dropping them into game engines, and making the movies they want with less technical skill required. Plus, they can use those same sensors for performance capture, so I expect that is a pretty big leap forward in democratizing filmmaking.

“What I’d like to see is major advancements in applying deep learning and AI to solving a lot of the friction points at scale. However, everyone is so focused on making presentations and white papers that they aren’t producing tech people can use. I’d also like to see USD become ubiquitous finally, so we can work on materials and shaders next. COVID will force a lot of remote collaboration, and that’s going to force the major studios to rethink their security requirements. Expect a lot of focus on ‘moving the studio system into the cloud’ [not at the vendor level, but above that]. It only makes sense that you consolidate the entire film production pipeline so that you can optimize storage and rendering resources.”
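Hatch’s earlier point about real-time depth per pixel (and Grossmann’s LiDAR-equipped phones) can be made concrete with a toy example. Assuming the camera system delivers a Z value for every pixel, a live-action plate and a CG layer can be merged automatically by comparing depths – a sketch of the kind of manual process that becomes automatable, not any vendor’s actual implementation.

```python
import numpy as np

def depth_merge(fg_rgb, fg_depth, bg_rgb, bg_depth):
    """Per-pixel depth merge: whichever element is nearer the camera wins.
    With reliable Z per pixel, live-action and CG layers can be combined
    automatically instead of via hand-drawn mattes."""
    nearer = (fg_depth < bg_depth)[..., np.newaxis]  # boolean mask per pixel
    return np.where(nearer, fg_rgb, bg_rgb)

# Toy 2x2 frame: a live-action plate vs. a CG element with its own depth.
plate   = np.ones((2, 2, 3)) * [0.8, 0.7, 0.6]
plate_z = np.array([[2.0, 2.0], [5.0, 5.0]])     # metres from camera
cg      = np.ones((2, 2, 3)) * [0.1, 0.3, 0.9]
cg_z    = np.array([[3.0, 3.0], [3.0, 3.0]])
print(depth_merge(plate, plate_z, cg, cg_z))     # CG shows only where nearer
```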

TOP: A motion capture stage utilized by Animatrik. (Image courtesy of Animatrik) MIDDLE: A close-up shot of the virtual production that was utilized by Stargate Studios during the making of Run for HBO. (Image courtesy of Stargate Studios) BOTTOM: A virtual production stage deployed by Digital Domain. (Image courtesy of Digital Domain)

Rachel Rose, R&D Supervisor, ILM

“This is a really exciting time for virtual production in our industry. While virtual production has had a long history, with early uses on films such as The Lord of the Rings: The Fellowship of the Ring and A.I. Artificial Intelligence, technology in this space is currently advancing rapidly. These advancements have a wide range of filmmakers excited about embracing the latest new workflows.

“Throughout 2021, we should see films and TV shows fully adopting the latest techniques for in-camera virtual production using LED stages, like what we have done at ILM on The Mandalorian. Our immersive StageCraft LED volumes with perspective-correct, real-time rendered imagery allow for final pixels to be captured in-camera while providing realistic lighting and reflections on foreground elements. As more traditional filmmakers get involved in virtual production, techniques that combine real-time technology with physical camera equipment will spread rapidly, like what we did with Gore Verbinski on Rango and what Jon Favreau did in VR on The Lion King.

“All of these virtual production workflows will advance further throughout the year as technology continues to develop. Expect to see more use of real-time ray tracing capabilities in the latest graphics cards. In practice, this means that virtual production previsualization and on-set capture will allow for more dynamically-adjustable, realistic lighting, including soft shadows and quality reflections. And new innovative software features for multi-user workflows will connect key creatives in a wide range of locations, allowing some collaborative virtual production work to happen seamlessly from a distance.”



VFX TRENDS

ADVENTURES IN INDIE VIRTUAL PRODUCTION By IAN FAILES

Around the world, excitement continues to brew at the filmmaking possibilities offered up by virtual production and real-time rendering. This includes independent filmmakers, whose projects might be on a much smaller scale than high-profile virtual production shows such as The Mandalorian and The Lion King. Indeed, the availability and democratization of LED wall technology, game engines like Unreal Engine and Unity, and real-time tools have given indie filmmakers the chance to experiment widely in this new virtual production paradigm to produce high-quality content. At the same time, they’ve been able to offer new filmmaking techniques in an age of social distancing requirements. Several creators, including VFX artists and cinematographers, explain here what they’ve been experimenting with in the area of virtual production.

LED WALLS: NEW STORYTELLING TECHNIQUES

TOP: On the set of Last Pixel’s test-shoot with crew member Ian Rogers in front of the LED walls and a projected background. (Image courtesy of Last Pixel) OPPOSITE TOP: The Granary’s LED wall test-shoot included an ‘outdoor’ car scene. (Image courtesy of The Granary)


LED walls, coupled with real-time rendering tools, can offer alternative ways to provide virtual backgrounds, which might normally have required greenscreen or rear-projection techniques. Plus, they can provide final shots ‘in-camera’ and benefit from interactive light on the actors. Independent filmmakers, motivated by what they have seen on large-scale productions, have found that LED walls can be incredibly useful in their projects.

One outfit that has been experimenting with LED walls, for example, is Last Pixel, based in Perth, Australia, headed by Rick Grigsby and David McDonnell. Having produced visual effects, animation and visualization for several projects including Netflix’s I Am Mother, Last Pixel had already been dabbling in real-time rendering with Unreal Engine. As the coronavirus pandemic hit, Last Pixel partnered with local events company Mediatec, which owned LED screens, to produce a couple of test shoots.

According to McDonnell, the initial areas of LED wall shooting that Last Pixel needed to get a handle on included utilizing Unreal’s nDisplay for pushing imagery onto the screens, and configuring an HTC Vive as a tracking solution to enable the imagery on the walls to change depending upon the camera position. NVIDIA graphics cards were also part of the equation. “Apart from the back-end tech side, we tried to focus on the craft and strategies of how you would shoot a real project on an LED stage, so we set ourselves a few goals to shoot coverage on some made-up scenes,” says McDonnell. “I definitely think virtual production is the future of filmmaking. For VFX, the larger promise of virtual production for me is about returning to the discipline of planning and committing to a vision early, rather than deferring decisions to fix it in post. The more we can see live and share in a common artistic vision, either on set or in pre-production, the more effectively the post teams can execute on that.”

Another independent outfit that has jumped into the world of LED walls is Wellington, New Zealand production company The Granary. Founders Amber Marie Naveira and Victor Naveira have backgrounds in visual effects and post-production. They started the company looking to realize high-concept shorts and commercial work for lower budgets. Then the virus struck, forcing them to look at alternative filmmaking options.

“I had come across The Mandalorian, of course,” details Victor, “but prior to that, there’d been a Quixel video that’d been released, which was photorealism using Unreal Engine 4. It all just sort of went from there.” Avalon Studios, where The Granary had set up a work space, connected them with Streamliner Productions, which owned some LED screens. “We thought,” recalls Amber Marie, “if we could prove the concept [of virtual production with LED walls] that it could be a game-changer for our local industry. It would mean people in our local film industry could get some ideas out onto the screen and we could help them with it.”

Some of the principal issues The Granary faced in their testing included the size of the screens, dealing with screen tearing, which camera was best to use, which graphics card would best enable real-time rendering of the imagery they had generated, and how much dynamic lighting could be part of that imagery. However, with some to and fro, the tests proved successful. “What’s really liberating,” suggests Victor, “is that since Epic is making all of this available through Unreal, it is democratizing the process and making it possible for a massive audience. So, indie filmmakers like ourselves can try and make productions which are higher caliber or have bigger concepts.” Amber Marie agrees: “I’m super excited about where we can go with it, how much we’re learning and what opportunities are out there.”
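Under the hood, tracking setups like Last Pixel’s reduce to a per-frame transform composition: the pose reported by the tracking device, multiplied by a fixed calibration offset from the tracker to the camera’s optical center. The sketch below is hypothetical – the real nDisplay and Live Link plumbing involves far more – but it shows the core math.

```python
import numpy as np

def camera_world_matrix(tracker_world, tracker_to_camera):
    """Compose the tracked device pose with a fixed calibration offset
    (the rigid transform from the tracker puck to the camera's optical
    center) to get the virtual camera's world transform each frame."""
    return tracker_world @ tracker_to_camera

def translation(x, y, z):
    """Build a 4x4 homogeneous translation matrix."""
    m = np.eye(4)
    m[:3, 3] = [x, y, z]
    return m

# Tracker reports 2m up, 1m forward (would come from an OpenVR/SteamVR
# query in practice); the camera sits 10cm below the puck, measured once
# during lineup.
tracker_pose = translation(0.0, 2.0, 1.0)
offset = translation(0.0, -0.1, 0.0)
cam = camera_world_matrix(tracker_pose, offset)
print(cam[:3, 3])   # -> [0.  1.9 1. ]
```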


A Virtual Production Tool for Your Phone

Many of the tools that can be used for virtual production are now widely accessible, which means just about anyone can try virtual production techniques with equipment they may already own. For example, FXhome has an iOS app called CamTrackAR that captures video and 3D tracking data simultaneously using Apple’s ARKit. CamTrackAR uses data from ARKit to map out an environment in real-time, making it possible to capture tracking and anchors while you record your footage. “I’d been thinking about it for a few years now, what with the cameras on iPhones becoming pretty amazing and the accuracy of the AR having improved immensely,” says FXhome founder and CEO Joshua Davies.

“Compared to wrangling with other VR tech, it just seemed like all the tools in one package – on your iPhone,” Davies adds. Davies sees filmmakers and VFX artists working on previs as one major audience for CamTrackAR. He also believes the tool could be employed by the new generation of iPhone filmmakers. “iPhone cameras are starting to get to the point where you can shoot cinematic videos without an expensive DSLR or cinema camera. It’s bringing a lot of accessibility to filmmakers and content creators who are just starting out on a shoestring budget. We wanted to extend that to VFX, give creators the tools they need to not just shoot cinematic footage, but also add realistic effects without the need for expensive software.”
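To give a feel for what an app like CamTrackAR hands off to a VFX pipeline, here is a hypothetical sketch that reads per-frame camera transforms from a simple CSV – an assumed format for illustration, not CamTrackAR’s actual export – ready to be keyed onto a CG camera in a DCC application.

```python
import csv

def load_camera_track(path):
    """Read a per-frame camera track with columns
    frame, tx, ty, tz, rx, ry, rz (a hypothetical export format,
    standing in for whatever your AR capture app writes) and return
    it keyed by frame number."""
    track = {}
    with open(path, newline="") as fh:
        for row in csv.DictReader(fh):
            track[int(row["frame"])] = {
                "position": (float(row["tx"]), float(row["ty"]), float(row["tz"])),
                "rotation": (float(row["rx"]), float(row["ry"]), float(row["rz"])),
            }
    return track

# In a DCC app, each frame's transform would then be keyed onto a CG
# camera (set_camera_keyframe is a placeholder for the host app's API):
# for frame, xform in load_camera_track("shot010_track.csv").items():
#     set_camera_keyframe(frame, xform["position"], xform["rotation"])
```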

EMBRACING VIRTUAL CINEMATOGRAPHY

TOP: The Unreal Engine user interface for a scene from Nemosyne. (Image courtesy of Kevin Stewart and Luc Delamare) BOTTOM: In addition to running the Real-Time Shorts Challenge, won by Nemosyne, MacInnes Studios is also heavily involved in the creation of real-time virtual humans, such as this depiction of David Bowie. (Image courtesy of John MacInnes) OPPOSITE BOTTOM: A depiction of the raw video, 3D tracking and final CG object that can be added to a scene using CamTrackAR. (Image courtesy of FXhome and CamTrackAR)


Alongside LED wall production by independent filmmakers are a plethora of other virtual production techniques in use right now. These techniques often mimic what might normally be done on a traditional live-action production with actors, but instead use completely synthetic cameras, actors and locations, while still following the rules and lessons learned from live-action filmmaking.

Directors of Photography Kevin Stewart and Luc Delamare recently explored the world of virtual cinematography for their short film Nemosyne, which tells the story of a sentient machine. Using Unreal Engine, the filmmakers sought to translate their live-action knowledge into this new virtual approach. “Almost everything we know about live-action cinematography translates photo-accurately to what we wanted to do in Unreal,” outlines Stewart. “We could shape passive light with bounce boards and negative fill, we could soften the lighting or make it have hard shadows, and we could even create and simulate a sky ambience that would light up a dark room in a realistic way. Camera operating was also a very tangible experience – in this case, we were able to pair our virtual camera to an HTC Vive controller and operate it like a live handheld camera in engine. Walking around my office finding virtual shots and compositions was probably the most exhilarating part of the whole experience.”

What helped to make the experience exciting for Stewart and Delamare, too, was that they found ways to maintain a ‘grounded’ look and feel inside a virtual environment.

“In live-action cinematography I often aim for a naturalistic approach to camera and lighting,” says Delamare. “Opening up Unreal Engine can be daunting with that goal in mind because it is still technically a game engine, and you have to put in some work to stray away from that video game feel for both camera and lighting. But that’s where we found ourselves having the most fun – grounding ourselves in the world of cinematography we were familiar with, in a new environment that technically has no limits.”

The main tool the filmmakers used for making Nemosyne was Unreal’s Virtual Camera.

“Giving an authentic handheld feel to a virtual camera is part of what makes it feel organic and helps sell the cinematic effect we were seeking,” notes Stewart. The duo also established a specific look by relying on real-time ray tracing, a setting that initially requires a fair amount of tweaking before settling on an approach. “One of the main challenges we faced in using ray tracing was getting the sample count high enough for certain elements – like global illumination, which is obviously a crucial aspect of photoreal rendering – to look acceptable on-screen,” details Delamare.


“But what’s powerful about Unreal Engine is that even at our lowest settings, we were looking at near-final render imagery that enabled us to make calculated creative decisions.”

Nemosyne won the Real-Time Shorts Challenge, an event hosted by MacInnes Studios, which specializes in real-time digital human characters. In fact, the Unreal characters and scene files for the Challenge entries came from MacInnes. Founder John MacInnes set out with the Challenge to showcase the high-level content being made by independent filmmakers. “With Unreal and Unity, anyone can do it,” MacInnes says. “What was missing for most real-time filmmakers was high-level characters, assets and environments to play around with. Over the last four years I’ve created and built some amazing photorealistic scenes in VR. These experiences were basically pitches for larger VR projects that didn’t end up happening, largely because VR didn’t end up happening in the way I would’ve liked. I had already given one VR scene to [well-known virtual production cinematographer] Matt Workman, and I was impressed and inspired by what he was able to do with it. I then thought, ‘What if I gave the scene to anyone who wanted to make something?’

“Kevin and Luc, with their short Nemosyne, were both live-action DPs who had never made anything in Unreal before,” continues MacInnes. “There’s often a lot of awe and mystique around technology and VFX that sometimes prevents people from diving in, and I think they got a chance to just play around with nothing to prove. And the results were incredible!”

EXPERIMENTATION: SHARING AND SHOWING WHAT’S POSSIBLE

Perhaps what makes the advent of indie virtual production particularly attractive today is the significant level of sharing amongst the community. The freelance cinematographer known as Cinematic Captures is one of those community members who regularly posts tests and final results to YouTube. Cinematic Captures has been generating Star Wars-related content as a series of shorts in Unreal Engine and, along the way, experimenting with a number of virtual production tools. In Not Alone, Cinematic Captures combined a virtual-camera approach for shooting with a Rokoko motion capture suit for characters. “I had previously built up my virtual camera for another short, Order 66, using my Oculus Rift S VR headset and controller to track the real-world camera movements, which I would then relay back into Unreal Engine and parent my camera to the VR controller.

“Once I had that set up,” says Cinematic Captures, “I then began rigging up my controller using various bits and pieces I had sitting around from my real-camera setup. As time progressed, I slowly invested in more parts that were better suited to the rig to make it more streamlined and sturdier.”

The virtual camera can be a powerful filmmaking tool in that just about anything can be imagined, notes Cinematic Captures. “It’s what feels the most natural to me as a cinematographer. While you can keyframe similar motions and add digital shake in-engine, you just don’t get those minor inconsistencies like you see in physical cinematography. Little things like accidental bumps or jitters in the shots, and slightly over- or undershooting your character tracking, then repositioning the shot back on them. Additionally, it restricts you to only perform camera moves that would be achievable in the real world, which makes it feel grounded and less like a video game spectator camera.”
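The ‘digital shake’ Cinematic Captures mentions is typically layered procedural noise added on top of a keyframed camera path. A minimal sketch follows; as he notes, this kind of synthetic motion lacks the accidental bumps of a real operator, which is why he prefers tracking a physical rig.

```python
import math

def handheld_offset(t, amplitude=0.01, seed_freqs=(0.7, 1.3, 2.9)):
    """Procedural 'digital shake': layered, incommensurate sine octaves
    per axis (a cheap stand-in for Perlin-style noise), returned in
    metres and applied on top of the keyframed camera path."""
    x = sum(amplitude / (i + 1) * math.sin(f * t * (i + 1) + i)
            for i, f in enumerate(seed_freqs))
    y = sum(amplitude / (i + 1) * math.sin(f * t * (i + 1) * 1.17 + 2 * i)
            for i, f in enumerate(seed_freqs))
    return x, y

# Evaluate the offset for the first few frames at 24 fps.
for frame in range(5):
    print(handheld_offset(frame / 24.0))
```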

TOP: A scene from Cinematic Captures’ Not Alone, made with a variety of virtual production tools and techniques. (Image courtesy of Cinematic Captures)



FILM

DEEP DIVING INTO WWII NAVAL WARFARE ABOARD SUB HUNTER GREYHOUND By TREVOR HOGG

TOP: Not only does Tom Hanks portray Commander Ernest Krause in Greyhound, but he also wrote the screenplay based on The Good Shepherd by English novelist C.S. Forester. (Image courtesy of Apple)


Seafaring adventures inspired the imagination of English novelist C.S. Forester, who wrote the Napoleonic War-era Horatio Hornblower series and the World War II-era The Good Shepherd, which Tom Hanks turned into a screenplay and retitled Greyhound in reference to the code name of the destroyer USS Keeling. The Oscar-winning actor stars as Commander Ernest Krause, who is given his first wartime command escorting 37 Allied ships across the North Atlantic while being hunted by German U-boats.

In order to guide the production, director Aaron Schneider (Get Low) put together an extensive website with details and photographs of all of the ships that were supposed to be in the story. Trouble arose when the visual effects methodology proved to be problematic once post-production commenced. In order to fix the issue, executive producer Aaron Ryder (Arrival) turned to the visual effects team responsible for Transcendence. “Aaron called me and asked, ‘Can you help me out with this?’” recalls VFX Producer Mike Chambers (Tenet, Dunkirk). “We talked about supervisors and he said, ‘What do you think about Nathan McGuinness?’ I answered, ‘I like Nathan. We both know how we did on Transcendence. Let’s go.’ Over the course of this show, Nathan and I were absolutely partners joined at the hip to make the effects look great and fight whatever battles had to be fought.”

The practical effects had to be digitally replaced in post-production, which ran from October 2018 to February 2019. “The interiors and a lot of the exterior immediate deck [of the Greyhound] were shot on an actual Fletcher-class destroyer [USS Kidd in Baton Rouge], but it was landlocked,” states Chambers. “There was no motion or background to speak of. The bridge was built on a gimbal to try to help with the motion, but a lot of the practical stuff didn’t work as well as it needed to. That’s why we had to work over all of those shots.”

There were over 1,300 visual effects shots, with the water and exteriors being entirely CG. “Nothing was just an A/B composite,” Chambers remarks. “A lot of the shot count was treating shots where we had existing foreground material, like the dialogue scenes. There are not as many huge, full-on-CG, wide, epic shots, but a good number of them. Every day we kept dealing with whatever the issues were, and trying to get everybody on track and moving forward. It was certainly a scramble towards the end.

“Early on I was screaming to get more editorial turnovers than what I was getting, but eventually got what we needed,” adds Chambers. “Fortunately, we did have a semi-rough cut of the movie. We definitely knew which sequences were going to need more focus, resources and time. As things got fine-tuned, what we would think of as simple shots came late [in post-production], so that became more of a thing.”

Originally, there were going to be multiple visual effects vendors. “However,” explains Chambers, “due to circumstance, schedule availability, and knowing that we needed a certain level of quality and believability, [the decision was made to go with a single vendor]. Fortunately, Nathan and I have a long-term relationship with DNEG and knew what they are capable of. When we learned that their visual effects supervisor was going to be Pete Bebb, I said, ‘Sign me up.’ I had a great deal of confidence in DNEG, and they went above and beyond.” A detailed chart was produced that categorized the difficulty and state of development of each shot for DNEG facilities in London, Vancouver and Mumbai.

TOP TO BOTTOM: DNEG visited the HMS Belfast in London to get a sense of how the Fletcher-class destroyers were actually built and applied that knowledge to the CG versions. The sky was mapped out for the entire duration of the film’s journey across the Atlantic and determined the light source for each shot. (Images courtesy of DNEG. Final image courtesy of Apple)


“[We visited the HMS Belfast in London because] I wanted to look at an actual World War II destroyer to see what’s involved in the build of it, because we were going to get incredibly close to the Greyhound. The ship build took the duration of the show.” —Pete Bebb, Visual Effects Supervisor, DNEG

TOP TO BOTTOM: It was critical to create the impression that the bow of the Greyhound was crashing through the water with the help of atmospherics such as ocean spray. The weathering of crossing the Atlantic Ocean was something lacking in the practical shooting on the USS Kidd and was added to the CG version of the Greyhound. (Images courtesy of DNEG)


“I picked out shots within these categories at certain points of the film to fast-track, so we knew what we were aiming for,” explains DNEG VFX Supervisor Pete Bebb. “What you can’t do is change your mind with this huge machine behind you. Nathan, Mike, Aaron and Tom could see it and sign off from the look development point of view.”

Bebb realized that the layout pipeline needed to be rewritten. “You could scrub through the entire period of time with all of the ships,” he says. “The cameras would go instantly into layout so they would be in the correct place in the ocean, the right type of Beaufort scale and the proper time of day. Then you could render them out and review it the next day. In regards to the color pipeline, I pushed for using an internal one so we balanced our neutral grade all the way through the entire film. We suggested CDLs on top of that and always gave it a half stop more so there was more latitude that Nathan could play with.”

Even though the water was fully CG, it was important for Visual Effects Supervisor McGuinness (Master and Commander: The Far Side of the World) to emulate the actual shooting conditions of being out in the ocean. “One of the hardest parts is the motion of the ocean, as well as the camera placement,” McGuinness says. “Applying a real-world approach in the beginning was a better way to put everyone’s mindset where we were going. I had a team from a small previs company called Day For Nite. I mapped out the journey so you could come up on a bird’s-eye view with one camera and see where the convoy or the Greyhound was. Then I would drop the camera back down into where it was supposed to be in that part of the movie. I felt that was the only way to do it as accurately as possible, because otherwise I wouldn’t have known where West and East were or the time of day.

“Along with the zigzagging and maneuvers that the convoys take, we had to also know that there was a destination point,” adds McGuinness. “It was a difficult process to share with other people. I brought in the previs artists to work side by side with me and my visual effects team. I was building the story, and everybody who needed to be part of the filmmaking process was around us. The editors were dropping in the sequences.

[Producer] Gary Goetzman and Tom Hanks were looking at it, saying, ‘I don’t know if this makes sense. We have to do a bird’s-eye.’ [Cinematographer] Shelly Johnson (Hidalgo) came in consistently with me and sat with the EXR files. I had the Avid running on one screen, and DaVinci Resolve running on another screen. We’re looking at the cut and then the shots.”

Naval consultant Gordon Laco was brought onto the project to ensure authenticity. “I’d say, ‘This maneuver is written in the script. Tell me more about it. Why has this maneuver been so traditional in maritime warfare since the 1600s?’” McGuinness notes. “He would give me a good explanation. I would then start prevising the tactics. But then I would have to say to him, ‘This is a movie!’ Gordon also had a strong understanding of the sonar and the capabilities at the time.”

Balancing readability and believability of what the lighting conditions would have been like in the North Atlantic was hard. “It was a difficult part of the process, especially when you’re telling the studio that you’re putting as much effort into a shot that at the end of the day you’d only see 5% of it,” remarks McGuinness. “But we had to, because Tom was so passionate about what you don’t see. People smoking on the deck would be court-martialed because that would be an immediate telltale for the U-boats. The ships had to cross with no lights. It was dark. We had to portray that. I did numerous 24-hour HDRs on beaches. I picked the lighting for the lighting teams.”

DNEG had 10 Canon 5D cameras, each placed upon an individual tripod, which took time-lapse HDRs every 30 seconds to produce a 60K sky. “We put that on the top of the office in London, and a team was sent to Brighton Beach to do the same thing because we needed a clean horizon,” explains Bebb. “It gave us the exact time, clouds and light, which were then offered to Nathan. He picked specific ones, and that gave us an HDR map per scene and time of day.”

Journals were a critical part of the research. “I also researched the era,” states McGuinness. “My rooms were covered in every U-boat that potentially crossed. I almost felt that I had a stronger connection to the wolfpack [eight to 20 U-boats would join

“I spoke to some guys who explained to me that German torpedoes sounded like a 125cc engine on a motorbike, so they screamed. That helped me to visualize the turbulence underneath.” —Nathan McGuinness, Visual Effects Supervisor

TOP TO BOTTOM: Visual Effects Supervisor Nathan McGuinness did extensive research into what German U-boats were in operation during the time period of Greyhound, and he became fascinated as to what it would have been actually like to be onboard one. Shots featuring the U-boats were a late addition to the post-production schedule and had to be created from scratch. (Images courtesy of DNEG)
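Bebb’s color-pipeline comments map onto the standard ASC CDL transform, out = (in × slope + offset)^power. The sketch below is a generic illustration, not DNEG’s actual pipeline; the extra_stops parameter shows one reading of ‘a half stop more’ of latitude as a linear exposure gain of 2^0.5 (about 1.41x).

```python
import numpy as np

def apply_cdl(rgb, slope=1.0, offset=0.0, power=1.0, extra_stops=0.0):
    """Apply an ASC CDL slope/offset/power transform to linear RGB,
    with an optional exposure headroom term expressed in stops."""
    rgb = np.asarray(rgb, dtype=float) * (2.0 ** extra_stops)  # exposure gain
    graded = np.clip(rgb * slope + offset, 0.0, None) ** power  # clip negatives
    return graded

# 18% grey through a mild grade with half a stop of extra headroom.
print(apply_cdl([0.18, 0.18, 0.18], slope=1.1, offset=0.01, power=0.95,
                extra_stops=0.5))
```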

WINTER 2021 VFXVOICE.COM • 59

12/16/20 2:00 PM


FILM

“[We visited the HMS Belfast in London because] I wanted to look at an actual World War II destroyer to see what’s involved in the build of it, because we were going to get incredibly close to the Greyhound. The ship build took the duration of the show.” —Pete Bebb, Visual Effects Supervisor, DNEG

TOP TO BOTTOM: It was critical to create the impression that the bow of the Greyhound was crashing through the water with the help of atmospherics such as ocean spray. The weathering of crossing the Atlantic Ocean was something lacking in the practical shooting on the USS Kidd and was added to the CG version of the Greyhound. (Images courtesy of DNEG)


“The shots from Brighton Beach provided us with weather all of the way down to the horizon line. It gave us this narrow band where the sky finishes and you get this band of light. We used that quite a lot to silhouette ships against it.” —Pete Bebb, Visual Effects Supervisor, DNEG


Multiple versions of ships had to be built for the purpose of continuity. “It wasn’t like one asset got you through the whole movie, because they’re under attack the whole time,” McGuinness says. “The asset library was massive. We started with absolutely nothing. We built various designs of U-boats, all based on my research.”

Bebb and his visual effects team visited the HMS Belfast in London. “I wanted to look at an actual World War II destroyer to see what’s involved in the build of it, because we were going to get incredibly close to the Greyhound. The ship build took the duration of the show,” says Bebb. Sub-assets were produced to reflect the various states of damage. Adds Bebb, “We’d know that the gun turret was destroyed, so we would swap it out for a destroyed version, and that would happen automatically as part of the layout system.”

Mathematics was key to making sure that the ships interacted with the ocean properly. “You study the statistical drawing of the ships, find where the center of gravity is located, and take into account the dry and laden weights,” remarks Bebb. “You knew where the waterline was, so you could put them down correctly for what they are. You can then calculate their tonnage, and then calculate what that ship would do in a certain Beaufort scale of water.”

A couple of dramatic moments occur, such as when a U-boat emerges right alongside the Greyhound, resulting in a gun battle and a torpedo grazing off the side of the destroyer. “Weirdly enough, that sort of stuff used to happen. U-boats would surface in such close proximity to these destroyers,” says Bebb. “The destroyer’s guns are five inches and couldn’t get down low enough to take them out. There was an interview with Tom Hanks where he asked if torpedoes could actually graze the side of a ship. The arming head at the front of the torpedo has to hit dead-on perpendicular to detonate.”
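Bebb’s waterline arithmetic can be made concrete with a back-of-envelope calculation. The sketch below assumes a crude box hull and ballpark WWII destroyer figures – illustrative values only, not production data or DNEG’s actual system:

```python
# Back-of-envelope version of the ship/ocean arithmetic described above:
# Archimedes' principle fixes how deep a hull of a given weight sits, which
# in turn fixes the waterline used to place the ship in the CG ocean.
# All figures are ballpark, illustrative values, not production data.

SEAWATER_DENSITY = 1025.0  # kg per cubic metre

def draft_metres(displacement_kg, waterplane_area_m2):
    """Crude box-hull approximation: displaced volume / waterplane area."""
    displaced_volume_m3 = displacement_kg / SEAWATER_DENSITY
    return displaced_volume_m3 / waterplane_area_m2

# Approximate probable wave heights (metres) by Beaufort number, standing in
# here for picking a sea state for a sequence.
BEAUFORT_WAVE_HEIGHT_M = {3: 0.6, 4: 1.0, 5: 2.0, 6: 3.0, 7: 4.0, 8: 5.5}

laden = draft_metres(2.5e6, 1000.0)  # ~2,500 tonnes over ~1,000 m^2
dry = draft_metres(2.0e6, 1000.0)    # the lighter ship rides higher
print(f"laden draft ~{laden:.1f} m, dry draft ~{dry:.1f} m, "
      f"Beaufort 6 waves ~{BEAUFORT_WAVE_HEIGHT_M[6]} m")
```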

Sounds assisted with visualizing shots. “I spoke to some guys who explained to me that German torpedoes sounded like a 125cc engine on a motorbike, so they screamed,” notes McGuinness. “That helped me to visualize the turbulence underneath.” DNEG researched which torpedoes would have been used by the U-boats at the time. “We tended to go with one that had a brass head,” explains Bebb, “so you could see it more easily in the water, because most of them were grey, so you couldn’t see them.”

Natural elements also assisted with readability. “Most of the time, with the aerial shots of the convoy, it’s the actual wake and smoke stacks that you see, as the ships blend in slightly,” continues Bebb. “The shots from Brighton Beach provided us with weather all of the way down to the horizon line. It gave us this narrow band where the sky finishes and you get this band of light. We used that quite a lot to silhouette ships against it.”

Most of the radar screens were replaced in order to reflect the story correctly. “400 to 500 all told in bridge and radar shots,” reveals Bebb. “One of the visual effects supervisors in London, Ian Simpson, was an old analog engineer at the BBC. He designed and built the radar screens to how they would have done it back in the day. We had numerous radar shots, so a lot of effort went into making them look authentic.”

It was important to avoid having the tracer fire look like lasers. “They used to load one in five rounds for tracers at night, so we animated it like that,” says McGuinness, who was driven by a certain mandate. “I wanted every shot to feel like it was at sea. The hard stuff was setting up the tone of the ocean. There were a couple of shots we did where I felt like we were looking at the bow and were plowing through. All of the visual effects ingredients are in that shot. It is a huge array of work. We stare at it and go, ‘It’s a water job!’ But at the same time, it was the assets of U-boats, fleets and convoys, as well as destruction, attacks, flares, night, skies and bird’s eye views. I don’t think there’s anything we’re missing, but it all contributes to the show as a whole. We drew from all of our past experiences and applied them to Greyhound.”

“It wasn’t like one asset got you through the whole movie, because they’re under attack the whole time. The asset library was massive. We started with absolutely nothing. We built various designs of U-boats, all based on my research.” —Nathan McGuinness, Visual Effects Supervisor

TOP TO BOTTOM: Even in the darkest corners of the frame, great attention was paid to detail. DNEG served as the sole vendor for Greyhound and had five categories of shots, ranging from simple, motion graphics and medium to full CG and very hard. (Images courtesy of DNEG. Final image courtesy of Apple)

PROFILE

ZOE CRANLEY: LESSONS LEARNED, LEADERSHIP, AND A LOVE OF MATCH-MOVING IN VFX By IAN FAILES

TOP: Zoe Cranley, Head of CG at DNEG Vancouver. (Photo courtesy of DNEG) OPPOSITE TOP: Cranley holds a Macbeth chart during an on-set shoot. (Photo courtesy of DNEG) OPPOSITE BOTTOM: Cranley’s work-from-home setup in Vancouver. (Photo courtesy of Zoe Cranley)


Zoe Cranley loves match-moving. As Head of CG at DNEG Vancouver, it is a VFX task she no longer gets to enjoy, but it is one that, when she started at DNEG in London in 2005, immediately captivated her. “I really enjoyed it, to the point that later, when I was a CG Supervisor, I would still say, ‘Oh, can I just match-move one shot? Just one? Please?’ It’s such great creative problem solving.”

Problem solving is something Cranley does day-to-day in the Head of CG role, the pinnacle of a journey that has included stints at DNEG’s London and Singapore offices before the move to Vancouver, and experience on films such as John Carter, Godzilla (2014) and The Hunger Games: Catching Fire.

Before she embarked on a visual effects career in the U.K., Cranley was initially interested in going to art school. “When I applied, they actually told me I was too academic to go to art school,” she says. “I was so mad! I actually argued with them about it.” After that early setback, Cranley decided she could instead combine her artistic interests with a love of computers and mathematics, earning a computer animation degree at Bournemouth University. This was in the early 2000s, a time when the London VFX industry was growing rapidly. “As soon as I finished my degree,” she remarks, “I sent off my DVDs – actual DVDs with a front cover and everything! – and I luckily got offered a job pretty much straight away at DNEG.”

Cranley was hired as a render wrangler, doing shift work to monitor render jobs on DNEG’s ‘Alfred’ render farm. At first, it seemed a far cry from her initial goal of working in feature animation. “Actually,” Cranley notes, “a lot of people from my degree course turned it down. I think they wanted something more creative. At the time, I was just grateful to get a job in the industry, to be honest. I’ve always been like that; you take any opportunity given to you and you build on it, no matter what.” The render wrangler job proved to be a “great first job,” Cranley says, since it required her to meet a lot of different people at the studio and understand the technology early on. She also got to wield some impressive rendering-related power. “I had to be able to walk up to people or email them and say I’d turned their priority off because they didn’t have permissions, or discuss that their render was causing errors. It was a great introduction.”

Cranley was able to segue into a match-moving and lighting role after a few months at DNEG, working on the films Flyboys and World Trade Center. But then she decided to quit. “I went to America and did three months in a summer camp in a leadership role, which was really fun, but also tough going. While running the Arts and Crafts program, I got a chance to lead very creative and artistic sessions, but there was a lot of managing at the same time as well. It was at that point I realized I just really enjoyed leading people. I got a taste for it quite early on from this, and I learned some really valuable people skills.”

After the summer camp ended, Cranley returned to DNEG in London as a lighting TD and later a 3D sequence lead. In those roles, she contributed to films such as Harry Potter and the Order of the Phoenix, Sherlock Holmes and The Sorcerer’s Apprentice. Then along came John Carter. “That film is still my favorite show to date,” Cranley exclaims. “I was on it for almost two years. I worked mainly on the baby aliens. I had so many high hopes for toys and backpacks and everything from them, but that never happened. I was so disappointed – it was two years of my life and I didn’t even get a backpack! However, the experience and lessons learned while working on that show were still worth it.”

Cranley’s work on John Carter led to another significant change – being asked to move to Singapore, where the studio had set up a satellite office. “They were looking for people to go out there, to help advance the team, their pipeline, and start taking on more challenging work. I thought I might go for six months, but it ended up being for two-and-a-half years.”

It was in Singapore that Cranley took on some of the most challenging assignments she had ever faced in VFX. She oversaw a significant number of shots for the film Total Recall, many featuring complex CG ‘Synths’ – some of the most demanding work the Singapore team had tackled at that time. “The team wasn’t really ready. I wasn’t ready. It was really tough work, but we had to get it done. It was probably one of the hardest challenges of my life until then, actually. I didn’t know the team. I didn’t know their strengths, their weaknesses or their skillsets. They also didn’t know me. We did some long hours, and sometimes I’d be ringing London at three in the morning asking for help, but the team delivered some impressive shots. I’m still really proud of the work we produced in those short months.”

Cranley moved into a CG Supervisor role at DNEG after that, working on films including Alice Through the Looking Glass and Wonder Woman. Over the course of those projects, she moved back to London from Singapore before transferring to the studio’s Vancouver office. Becoming a CG Supervisor opened up a whole new set of experiences and challenges. “The difference now was, whereas I used to look to someone else up the chain to make a decision or propose a solution, there wasn’t that person anymore. Suddenly that was me.

“I also had to learn how to delegate to other people,” adds Cranley. “It wasn’t just about you being a good leader anymore. It was also about getting others to lead as well, and make the best decisions on artist assignments or how artists were going to work.” Cranley realized that key questions would come directly her way on a myriad of issues. “The VFX supervisor might say, ‘Hey Zoe, what do you think about this?’ or ‘Can you do this?’ and you know that deep down inside there’s absolutely no way that it’s possible, but you learn to say, ‘Sure, let me look into it.’”


“The difference now [as a CG Supervisor at DNEG] was, whereas I used to look to someone else up the chain to make a decision or propose a solution, there wasn’t that person anymore. Suddenly that was me.” —Zoe Cranley, Head of CG Vancouver, DNEG

TOP: The original plate for a scene from John Carter. (Image copyright © 2012 Walt Disney Pictures) BOTTOM: This is the final shot with CG creatures. Cranley’s role on John Carter was as a CG Sequence Lead at DNEG. (Image copyright © 2012 Walt Disney Pictures)


One particularly thorny issue for her to solve as CG Supervisor was the matter of disk space usage. “Suddenly you were very aware of how much data visual effects creates,” Cranley says. “I would be asked to do these disk estimates, and my first response was, ‘What’s a disk estimate?’ There haven’t been many things that have brought me to tears in visual effects, but doing my first ever disk estimate made me cry. I don’t know, it was just one of the hardest things.”

What also came with the CG Supervisor role was the opportunity to be on set for Alice Through the Looking Glass and Wonder Woman. Those times would prove to be extremely valuable in terms of both practical lessons and discovering even more about leadership. “I was so inexperienced about anything on set,” acknowledges Cranley. “I literally had no experience whatsoever, so I just followed people around and tried to absorb and learn as much as I could. Also, up until that point, I was always like, ‘Why didn’t they shoot this? Why didn’t they get an HDRI?’ And then when I went on set I was like, ‘Oh, now I understand. It’s a film set. It’s crazy. You get what you can get.’”

“They were looking for people to go [to Singapore] to help advance the team, their pipeline, and start taking on more challenging work. I thought I might go for six months, but it ended up being for two-and-a-half years.” —Zoe Cranley, Head of CG Vancouver, DNEG

On Wonder Woman, Cranley was on location in Italy for three weeks with other DNEG crew. She remembers the anxiety of being thrust into the demands of production one day when she was overseeing the VFX supervision for the shoot. “The supervisor said, ‘I’m just going to give you a bit of advice: Whatever you do, if you’re asked a question, just make a decision. Just say yes, confidently, or no, confidently. Don’t dither.’ It was absolutely terrifying. The whole day. The director, Patty Jenkins, asked me at one point about a shot where there was this large, wooden target in it, which had to be replaced with CG. She said, ‘Do you want it in shot or out of shot?’ Being unsure, I said confidently, ‘Just leave it in.’

“A few months later,” continues Cranley, “the plates came into DNEG and the shot came up, with the entire target filling the frame, and Comp asked, ‘Is there a clean plate for this shot? Why is this target in there?’ I owned up and said, ‘It’s my fault! Sorry!’ It was really interesting to realize that you make the decision and you have to work with it. The shot actually ended up getting omitted, but it was a really great experience.”

TOP: As a CG Sequence Supervisor at DNEG on Total Recall, Cranley extended plates like these with more digital ‘Synths.’ (Image copyright © 2012 Sony Pictures) BOTTOM: The final shot with the expanded crowd of CG Synths. (Image copyright © 2012 Sony Pictures)


TOP: Zoe Cranley introduces a session at Spark FX 2020, run by the Spark Computer Graphics (Spark CG) Society in Vancouver, of which she is the Chair. (Photo: Marisa Molinar) BOTTOM: An HDRI capture for an on-set shoot that Cranley was involved with. (Photo courtesy of DNEG)

Today, Cranley says that when artists at DNEG ask her if they might be able to go on set, she always tries to help make that possible. “It’s so valuable. I really changed after my experience. I learned so much and had a lot more respect for the way we capture HDRIs, LiDAR and reference.”

Still, the on-set experience also showed Cranley that she was not necessarily interested in being a VFX Supervisor or going to sets regularly. “I felt on edge the whole time. You’ve got to be so quiet, and I felt anxious for all of it. Once, actually, I was walking around the set of Wonder Woman and I heard this crunch, and I turned around, and I’d trod on and broken the end of a prop. Thinking my set days were numbered, I looked over and saw the prop guy, and he went, ‘Don’t worry. That’s why you’re in visual effects. You’ll have to paint that back in later.’”

From 2017, Cranley worked as Head of Build (Assets) at DNEG Vancouver before starting as Head of CG in the middle of 2019. The role involves strategic planning at the studio, show and pipeline levels and supporting career development for the DNEG crew she oversees, while also reacting to the typical daily challenges that tend to erupt in visual effects. “I tend to have an overarching eye on the beginning of a show – getting in the right bids, working out how we’re going to do it, who’s going to do it and where we’ll do it – through to wrapping it up and doing a post-mortem,” outlines Cranley. “I’m a big believer in learning from the past by running post-mortems and collecting actuals. This involves producing and showing graphs. I love a graph!”

Outside of DNEG, Cranley is the Chair of the Spark Computer Graphics (Spark CG) Society in Vancouver, a role that gives her the chance to participate in the local VFX community. Spark’s various conferences and industry events also provide an opportunity for Cranley to keep learning about technology and techniques in visual effects. Furthermore, she sees the Spark position and her leadership role at DNEG as a way to champion more women in the industry – the number of women working in VFX has historically been very low. “If me being in these positions can help show and represent, then I like that,” says Cranley. “I like to connect with amazing women who are in senior roles in visual effects, especially technical roles, and put them on the stage. I hope that by giving them the chance to be in the limelight to talk about their craft and talent, it highlights that there are great opportunities for women in the industry.”

Where, though, does all that fit in with her love for match-moving? It’s not something the Head of CG gets to do these days. When asked about it, Cranley actually admits to another secret VFX passion: UV mapping. “I love UV’ing. It’s a jigsaw puzzle. I used to be like, ‘Quick, give me that model! I’ll UV that!’”



ANIMATION

‘FEELING THE CHARACTERS’ ELEVATES HEART-FILLED OVER THE MOON By TREVOR HOGG

All images courtesy of Netflix. TOP: The backstory of Chang’e was animated in 2D by Glen Keane and then projected onto a moving scarf. OPPOSITE TOP: The Water Town in which Fei Fei lives was modeled on Wuzhen, China.


Being the son of famous cartoonist Bil Keane (The Family Circus) did not discourage Glen Keane from pursuing a career as an animator; he subsequently went on to become a Disney Legend known for designing characters in The Little Mermaid, Beauty and the Beast, Pocahontas and Tarzan. After producing the Oscar-winning animated short Dear Basketball with Kobe Bryant, Glen Keane was drawn to the Netflix animated feature Over the Moon, which entwines Chinese folklore with the present day. The Moon Goddess Chang’e (Phillipa Soo) is separated from her husband Houyi (Conrad Ricamora) upon drinking the elixir of immortality, while 12-year-old Fei Fei (Cathy Ang) is mourning the death of her Mother (Ruthie Ann Miles) and has to deal with the prospect of her Father (John Cho) remarrying. Their stories collide when the ‘tweenager’ decides to build a rocket to the Moon to prove that the deity actually exists.

Screenwriter Audrey Wells (The Hate U Give) saw the project as a departing love letter to her husband and daughter; she was waging a battle with cancer while writing the script, and subsequently died. “Audrey said that all of her stories are about healing in some way,” states Keane. “The breakup of families, whether through divorce or death, is something that so many people in the world are going through. The theme of our story is to learn to love somebody new. There has to be depth for me to want to do it.”

Production on the animated musical began in 2017, and with five months to go the COVID-19 lockdown forced artists to finish their work remotely. “120 animators were working at the same time and barely able to keep up,” remarks Keane, who partnered with Netflix and Pearl Studio. “Then on a Thursday at 11 a.m. word comes down, ‘We’ve got to shut down the studio at Netflix.’ Everybody left within an hour, leaving behind coffee cups and coats hanging on chairs. It was like The Day the Earth Stood Still. In some miraculous way, we all went to our homes, used Zoom and continued on.”

“CG mouths always have this weird pinch at the corners. But if you open your mouth, the top lip curves and rolls, and there is this beautiful rolling curve at the corners. We worked hard at putting that into all of our human characters.” —Glen Keane, Producer/Director

What proved to be indispensable was a review system devised by Sony Pictures Imageworks that allows individuals around the world to see the same image. “Part of that toolset includes our animation tool that allows us to draw on the image,” remarks Sony Pictures Imageworks Visual Effects Supervisor David Smith. “I showed Glen that tool, and the first thing he did was to sit down in the chair and start drawing with it. He said, ‘Wouldn’t it be great if this could feel like a pencil and we could have a better brush here?’ We took those notes right away, made an overlay like a traditional animation transparency overlay, and fixed up the brushes so that they responded more like a pencil.”

Animation review sessions got quite lively with Keane and co-director John Kahrs (Paperman). “I’m left-handed and Glen is right-handed, so we would be sitting on opposite sides of the Cintiq grabbing the stylus from each other,” laughs Kahrs. “I worked with the editor on some of these action sequences to calm things down and to tell the audience where to direct their eye. I removed a lot of Dutch angle shots when I got on the show because [the Moon city of] Lunaria is such a wild, psychedelic color fest that tilting the camera arbitrarily felt like too much craziness.” Kahrs also wanted to make sure that the characters looked like they were actually singing. “You have to animate the lungs inhaling and pushing that air out through the larynx, singing and projecting,” he says. “If you’re just moving the mouth and the character is blinking and looking around, that’s not enough for me.”

A particular aspect of CG facial animation was addressed. “CG mouths always have this weird pinch at the corners,” observes Keane. “But if you open your mouth, the top lip curves and rolls, and there is this beautiful rolling curve at the corners. We worked hard at putting that into all of our human characters.”

“More days were spent in rigging than we typically spend on a picture at Sony, because anatomy was important to Glen,” states Sony Pictures Imageworks Animation Supervisor Sacha Kapijimpanga. “Getting all of the responses across the face that you would expect to see – we pushed hard on that. Also, we made sure that all of the body parts moved correctly and looked good, so when the cloth simulation happened on top of the forms underneath the cloth, it felt like you could see an arm or leg under there.”

Research was conducted to ensure that the Chinese characters felt authentic. “We went to Shanghai and did a lot of drawing of people on the street,” explains Keane. “I did a lot of studying of Asian eyes. There is this double fold on an eyelid that I never noticed before. The higher cheekbones. How does that fit into a simplified design of that face? There is this triangle between the eyes and eyebrows. We worked hard at being able to create enough of the muscle tension in there so you can feel that.”

Character Designer Brittany Myers (Spider-Man: Into the Spider-Verse) kept in mind how the various characters might present themselves if they were real people. “Fei Fei has a quirk of pulling the sleeves of her sweater right up to her fingers. I used to do it when I was a little girl,” says Myers.


“It’s like a security blanket,” Myers continues. “She also has crazy, wild hair held up with this little headband, because that was a way of quickly getting it out of her face.”

Fei Fei wears casual modern clothing, while the diva Chang’e is adorned with costumes conceptualized by acclaimed couture designer Guo Pei. “At one point, Fei Fei wants a picture and Chang’e responds, ‘Does this look like a photo op to you?’” remarks Production Designer Celine Desrumaux (The Little Prince). “She instantly transforms the setting. For this scene, I took some Vogue pictures of how models sit on a box, as well as Chinese calendars from the 1970s that always depicted gods and goddesses in a floating environment.”

Liberties were taken with the design of Chang’e to emphasize that she is a supernatural being. “[Los Angeles-based artist] Eusong Lee had these wild designs of her being nine feet tall with smaller hands and feet,” recalls Keane. “Everything was going to be bigger than life for her. All of the same emotional components that we put into Fei Fei’s face became the foundation for Chang’e.”

“Chang’e was one of the more challenging characters I’ve worked on,” notes Kapijimpanga. “There were tons of different hairstyles and lots of different outfits. Every time you change those things, it is like dealing with an entirely new character. You can’t sway her arms back and forth because she has these huge cloth trails that come off of her arms. Chang’e would hold her arms more in front of her. We kept her walk fairly steady and tried to make her look as majestic as possible. Then we also have her performing onstage, where we’re using all kinds of choreography as reference and she is like a rock star. When Chang’e is in the Chamber of Sadness, she is a lot more of a motherly, relatable, emotional character. Chang’e is diverse.”

Three major environments feature in the film: the Water Town (based on Wuzhen, China) in which Fei Fei lives, a scarf used to tell the backstory of Chang’e, and Lunaria, which is situated on the Moon.


“All five senses are involved in the way we created the Water Town,” notes Keane. “When you go to the Moon, I wanted it to feel stark black and white. I gave Celine the image of Pink Floyd’s The Dark Side of the Moon prism. The white light hitting and breaking into a spectrum of colors – that’s what Lunaria had to be.

“Celine came up with this wonderful backstory of the tears of Chang’e creating all of the Lunarians and buildings,” Keane continues. “Everything radiates from her. Then there is this hand-drawn style, which had to be graphic and flat because it was on the scarf. I knew that was something I wanted to animate personally, and that it would probably be the last thing that we did to get the movie done. It was the final thing that we finished.”

“Everything is about having a strong concept, and for that I always look to Hayao Miyazaki (Spirited Away),” explains Desrumaux, who also drew inspiration from religious themes. “There is this relationship of the god who creates his own people, so it made sense that the tears of Chang’e would become the Lunarians. It was also about creating this sweet relationship in which the Lunarians are joyful and colorful, but at the same time keep reminding her that they are tears coming from the sadness of losing the love of her life.

“One thing that I avoided was having iPads and cellphones – everything that makes you think that this movie is from 2020,” Desrumaux notes. “There are these motorcycles driven by the bad guys. I didn’t want to fall into this steampunk or gaseous design. I needed to analyze why a motorcycle is a motorcycle. You’ve got wheels and power, but there is no gas, so it was just energy. Lunarians have their eyes and mouth inside of their bodies, and the motorcycle is the same. It has power inside of the wheels.”

“The Lunarians are a teardrop shape with a lightbulb effect,” observes Character Designer Jin Kim (Big Hero 6). “Those are the most difficult designs because they are so simple and could go a thousand different ways. A human is a human. There could be some variations, but you know a human or animal form.” The voice cast influenced the look of their animated personas. “I used the actor’s facial features and attitude as reference,” says Kim. “That makes a huge difference. For example, there is Gobi, who is one of the creatures on the Moon. Glen wanted all of those funny Ken Jeong facial expressions in that character. Before that we had a lot of trials. Try this. Try that. We couldn’t get the right character, but as soon as Ken Jeong was in the movie, it was easy.”

OPPOSITE TOP: Mrs. Zhong (Sandra Oh) and her son Chin (Robert G. Chiu), who has a frog named Croak as a sidekick. TOP: As a sidekick character, Bungee amplifies the inner emotions of Fei Fei (Cathy Ang). Fei Fei and Bungee share a fist-pump. BOTTOM: Experts at NASA’s Jet Propulsion Laboratory and the China National Space Administration were consulted to ensure a sense of authenticity.

WINTER 2021 VFXVOICE.COM • 71

12/16/20 2:08 PM


ANIMATION

the sleeves of her sweater right up to her fingers. I used to do it when I was a little girl,” says Myers. “It’s like a security blanket. She also has crazy, wild hair held up with this little headband because that was a way of quickly getting it out of her face.” Fei Fei wears casual modern clothing, while the diva Chang’e is adorned with costumes conceptualized by acclaimed couture designer Guo Pei. “At one point, Fei Fei wants a picture and Chang’e responds, ‘Does this look like a photo op to you?’” remarks Production Designer Celine Desrumaux (The Little Prince). “She instantly transforms the setting. For this scene, I took some Vogue pictures of how models are sitting on a box, as well as Chinese calendars from the 1970s that always depicted gods and goddesses in a floating environment.” Liberties were taken with the design of Chang’e to emphasize that she is a supernatural being. “[Los Angeles-based artist] Eusong Lee had these wild designs of her being nine feet tall with smaller hands and feet,” recalls Keane. “Everything was going to be bigger than life for her. All of the same emotional components that we put into Fei Fei’s face became the foundation for Chang’e.” “Chang’e was one of the more challenging characters I’ve worked on,” notes Kapijimpanga. “There were tons of different hairstyles and lots of different outfits. Every time you change those things it is like dealing with an entirely new character. You can’t sway her arms back and forth because she has these huge cloth trails that come off of her arms. Chang’e would hold her arms more in front of her. We kept her walk fairly steady and tried to make her look as majestic as possible. Then we also have her performing onstage where we’re using all kinds of choreography as reference and she is like a rock star. When Chang’e is in the Chamber of Sadness, she is a lot more of a motherly, relatable, emotional character. Chang’e is diverse.” Three major environments are the Water Town (based on


Wuzhen, China) in which Fei Fei lives, a scarf used to tell the backstory of Chang’e, and Lunaria, which is situated on the Moon. “All five senses are involved in the way we created the Water Town,” notes Keane. “When you go to the Moon, I wanted it to feel stark black and white. I gave Celine the image of Pink Floyd’s The Dark Side of the Moon prism. The white light hitting and breaking into a spectrum of colors – that’s what Lunaria had to be.

“Celine came up with this wonderful backstory of the tears of Chang’e creating all of the Lunarians and buildings,” Keane continues. “Everything radiates from her. Then there is this hand-drawn style which had to be graphic and flat because it was on the scarf. I knew that was something I wanted to animate personally and it would probably be the last thing that we did to get the movie done. It was the final thing that we finished.”

“Everything is about having a strong concept and for that I always look to Hayao Miyazaki (Spirited Away),” explains Desrumaux, who also drew inspiration from religious themes. “There is this relationship of the god who creates his own people, so it made sense that the tears of Chang’e would become the Lunarians. It was also creating this sweet relationship that the Lunarians are joyful and colorful, but in the same way keep reminding her that they are tears coming from the sadness of losing the love of her life.

“One thing that I avoided was having iPads and cellphones, everything that makes you think that this movie is from 2020,” Desrumaux notes. “There are these motorcycles driven by the bad guys. I didn’t want to fall into this steampunk or gaseous design. I needed to analyze why a motorcycle is a motorcycle. You’ve got wheels and power, but there is no gas so it was just energy. Lunarians have their eyes and mouth inside of their bodies and the motorcycle is the same. It has power inside of the wheels.”

“The Lunarians are a teardrop shape with a lightbulb effect,”

observes Character Designer Jin Kim (Big Hero 6). “Those are the most difficult designs because they are so simple and could go a thousand different ways. A human is a human. There could be some variations, but you know a human or animal form.” The voice cast influenced the look of their animated personas. “I used the actor’s facial features and attitude as reference,” says Kim. “That makes a huge difference. For example, there is Gobi who is one of the creatures on the Moon. Glen wanted all of those funny Ken Jeong facial expressions in that character. Before that we had a lot of trials. Try this. Try that. We couldn’t get the right character,

OPPOSITE TOP: Mrs. Zhong (Sandra Oh) and her son Chin (Robert G. Chiu), who has a frog named Croak as a sidekick. TOP: As a sidekick character, Bungee amplifies the inner emotions of Fei Fei (Cathy Ang). Fei Fei and Bungee share a fist-pump. BOTTOM: Experts at NASA’s Jet Propulsion Lab and the China National Space Administration were consulted to ensure a sense of authenticity.




but as soon as Ken Jeong was in the movie, it was easy.”

Lunaria was a blend of 2D and 3D looks. “We wanted to get even the building colors correct and composition right in rough layout,” states Sony Pictures Imageworks CG Supervisor Clara Chan. “That changed the structure and pipeline. Also, the buildings are not made of anything that is real, so we had to change our shaders.”

The Moon was not treated universally. “There are two parts of the Moon,” explains Chan. “One is more realistic and has the sun as a source. For that we used a lot of NASA imagery to be our reference. But when we get to Lunaria, it’s a fantasy world. Every building is a light source, but we don’t want everything to be glowing; otherwise, it would be overwhelming. We had to pick and choose what would make the image look best.”

A primary lighting source for the Water Town was the numerous lanterns, with a unique aspect being the numerous white walls. “There was a particular brush style that Celine liked all of her art to have that was rooted on this concept of a white wall,” reveals Smith. “She would say that a white wall is full of color. Each one of our white walls probably had 15 different shades of white or hints of yellows, pinks, blues and greys.

“We would build and simplify a traditional tree so it had the impression of a realistic one but with less detail,” adds Smith. “We only built the trunk of the willow tree. The effects team came up with a procedural system for the longer branches and leaves so we could always control the density, because there are times that Fei Fei is peering through or sitting behind the willow tree branches.”

A proper balance was needed to make the Moon feel authentic, but avoid floating characters and objects becoming too distracting. “In Lunaria, there is this palace at the center of this world, and as you get closer to it gravity becomes more Earth-like,” remarks Kapijimpanga. “As you’re further out on the outskirts of the Moon, it is more what you would expect to see in old space footage.”
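To picture the falloff Kapijimpanga describes, gravity can be thought of as a simple blend by distance from the palace. The sketch below is purely illustrative – the radius and gravity values are invented for the example, not taken from the production:

```python
# Illustrative only: blending gravity strength by distance from Lunaria's
# palace, as described above. All numbers here are invented examples.
def lunaria_gravity(dist_from_palace, falloff_radius=1000.0,
                    g_earth=9.81, g_moon=1.62):
    """Earth-like gravity at the palace, moon-like on the outskirts."""
    t = min(max(dist_from_palace / falloff_radius, 0.0), 1.0)  # clamp 0..1
    return g_earth * (1.0 - t) + g_moon * t  # linear blend of the two

print(lunaria_gravity(0.0))     # 9.81 at the palace
print(lunaria_gravity(1500.0))  # 1.62 out on the outskirts
```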


Lunarians had detailed rigging which enabled them to have a variety of facial expressions that could be controlled. “The Space Dog forms out of this nebula and stardust, but the dog we animated was completely realistic,” states Kapijimpanga. “We allowed effects and compositing to be able to dial in how much that you would see of the character.”

Not all of the buildings on Lunaria were created at the same time from the tears of Chang’e. “Further out of the city, you can see the buildings that are much more simple, blobby and lava-lamping around,” explains Smith. “The buildings as they get closer to the palace are more solidified, structured and detailed.”

It was not easy dealing with aliens that are luminescent and translucent. “We had this solid gelatinous form for the characters, but within there was a gas,” Smith explains. “Normally, if you put gas inside water or a gel you will get a bubble. We had to rebuild the shader so that it would allow a gas to live inside the same space as a solid or semi-solid gel. We also had to have it so that the shader would both cast and receive light. Also, many of the characters and buildings have these little tear-offs that work like lava lamps.”

An extra element requested by Keane significantly complicated the 2D backstory of Chang’e. “Originally, we thought it would be something simple, but Glen had the idea of mapping that 2D animation onto a moving scarf,” remarks Chan. “That simulation was tricky because we don’t want it to have too many wrinkles, otherwise it distorts the 2D animation. We had to think of a way to put it onto a scarf, make it still look like it was lit by the environment, and yet maintain the colors that we wanted to see in the 2D animation. That actual environment is the family sitting by the river and the main sources [of light] are the moon and a lantern. We couldn’t make it too flat, so we made sure that some parts on the scarf had some highlights from the lantern.”

“As my mentor, Ollie Johnston (Bambi), would tell me, ‘Don’t

animate what the character is doing. Animate what the character is feeling and thinking,’” states Keane. “I’ve always taken that as my mantra for what I’m trying to do in a film. There is one particular moment when Fei Fei sees her dad touch Mrs. Zhong (Sandra Oh), who is going to become her new mom. The camera is one big shot of her face. Fei Fei’s eyes widen and her eyebrows go in. Her world is beginning to crack and crumble at that instant. We showed this one shot at the 2019 CTN animation eXpo. There was no sound or context, and there was an audible gasp in the crowd. For five seconds, you feel everything that girl feels, and it’s pure animation.”

OPPOSITE TOP: The facial features of Fei Fei served as the foundation for Chang’e (Phillipa Soo). TOP: Chin (Robert Chiu), who is soon to be the stepbrother of Fei Fei, stows away on her rocket. BOTTOM: It was important to Glen Keane for the Asian cast to be ethnically and physiologically correct. From back left clockwise: Grandma (Irene Tsu), Grandpa (Clem Chung), Auntie Mei (Kimiko Glenn), Auntie Ling (Margaret Cho), Uncle (Artt Butler), Father (John Cho), Fei Fei (Cathy Ang), Mrs. Zhong (Sandra Oh) and her son, Chin (Robert G. Chiu).





FILM SCHOOLS

LEARNING NEW WAYS TO EDUCATE TOMORROW’S INDUSTRY PROS By CHRIS McGOWAN

TOP: An aerial shot of the Talbot campus of Bournemouth University in Poole, England. The newly-opened Poole Gateway Building houses labs and studios for courses in the creative industries, including the National Centre for Computer Animation. (Image courtesy of Bournemouth University)


Once the pandemic hit and turned classes into virtual events, VFX and animation schools scrambled to get their curricula online, make classes glitch-free and dynamic, and offer remote access to workstations. Zoom has been an essential platform for online classes and meetings, with Discord, Blackboard, SyncSketch, Slack, MS Teams and Shotgun, among others, also cited as key software – for collaboration, communication, learning and/or project management. Online classes have generally functioned well, according to schools contacted, and in some cases provided extra benefits like recordability for later viewing, available on demand. Plus, guest speakers can join a seminar even if they are in Timbuktu or Tasmania.

“As the world changed, so have we,” says Miguel Rodriguez about Vancouver’s InFocus Film School and its response to COVID-19. Rodriguez, Head of the 3D Animation and VFX program, comments, “It definitely was a rough process of adapting to the new normal. During the first week of the quarantine we worked hard to set up online learning tools and remote access to the class computers. It gave [students] 24/7 access to their workstations without leaving home.”

Regarding classes, Rodriguez notes that a webcam and microphone can’t convey as much as sharing a space, “but there are plenty of tools that help make the class more dynamic. Video, audio, drawing boards, screen and file sharing play an important part in this.

“Zoom is definitely important, allowing us to have classes and meetings without too much setup,” he continues. “We have also

used Discord with great success, [as its] ability to share multiple screens between several participants and share files while keeping a chat history makes it an effective tool when keeping track of several medium and large-sized projects. For production purposes, I’d say Shotgun is still the number one option when compiling all of the aspects surrounding a project.”

As of fall of last year, InFocus animation and VFX classes were online, while the Canadian school’s film production classes were hybrid – with distancing and masks.

When USC’s School of Cinematic Arts (SCA) went online, it was a “stressful and difficult time for everyone – both professionally and personally,” notes Teresa Cheng, Chair of the John C. Hench Division of Animation and Digital Arts at SCA. “The resilience of our [division’s] students has really been impressive.” Cheng continues, “This pandemic forced everyone to adapt and do so quickly.” The silver lining is that “remote collaboration has been the norm in our industry for some time now.”

SCA’s animation and digital arts classes are using Zoom, Blackboard, SyncSketch and Slack, according to Cheng, plus “our Creative Technology department has worked out virtual desktop access for our students via Teradici.” However, she emphasizes that “our value is in our faculty. Zoom is just a tool. Of course, there are limitations [in not] being physically in the same space, but good teachers always find inventive ways to reach their students and deliver good content.

“We already have access to great people in the industry, alumni and professional contacts who are all eager to help,” she adds, “now

TOP: Actor Imogen Ridley, with face mask, utilizes the NCCA optical motion capture system for a medical education project at the National Centre for Computer Animation at Bournemouth University. (Image courtesy of Bournemouth University) BOTTOM: A student keeps her social distance during a mocap session at the InFocus Film School in Vancouver, Canada. (Image courtesy of InFocus Film School)






TOP: Lost Boys Studios Co-owner/Director Ria Benard and Founder/Director Mark Benard work with compositing student Chris Thomassin in Vancouver to add makeup and effects before filming with greenscreen for an ‘Ethereal Effects Project.’ (Image courtesy of Lost Boys Studios) MIDDLE: Lost Boys Studios FXTD instructor Harrison Molling conducts a screening of students’ completed demo reels with industry guests. All are socially distanced with mandatory face masks after a temperature and symptoms check prior to the screening. (Image courtesy of Lost Boys Studios) BOTTOM: Aerial view of Stage 1 of the National Film and Television School in Beaconsfield, U.K. during a test-shoot of Our Love Is Here to Stay in June 2020, when the U.K. COVID lockdown was lifted. (Image courtesy of the National Film and Television School)

not limited to the same time zone.” Guest appearances by notable film and VFX professionals have been a plus for SCA and many other schools. SCA’s summer 2020 “webinars” featured the likes of Jeffrey Katzenberg (Founding Partner of WndrCo), Ted Sarandos (Co-CEO of Netflix), Kristine Belson (President of Sony Pictures Animation) and Karen Toliver (Executive Vice President of Creative at Sony Pictures Animation). The webinars were for students and for professionals to keep up to date, and typically had about 200-400 participants.

VFX courses at the Savannah College of Art and Design (SCAD) also converted to remote learning for the spring through fall of 2020. SCAD has campuses in Savannah and Atlanta, Georgia, and Lacoste, France. Kevin Mannens, Chair of SCAD’s Visual Effects degree program, comments, “Classes were conducted live over Zoom during the usual class times. Because lectures are recorded, students could revisit sections of a class they want to see again for better understanding. Students love this feature and we will keep recording classes even once we go back on ground.”

Mannens notes that even classes that required hardware and gear – like cameras, lights and greenscreens – were converted successfully to virtual, which in some ways “added benefits because the students were forced to flex their creative problem-solving muscles to come up with solutions to shoot, light and work on footage, despite not having access to the gear at SCAD,” he says. The university continued with virtual programming for its fall 2020 curriculum, but planned to open designated spaces for students who desired access to labs and studios with specialized equipment.

The College of Motion Picture Arts at Florida State University pursued a hybrid model for the fall of 2020, going remote when possible, according to Ron Honn, Filmmaker-in-Residence, Visual Arts. He notes that the school went the extra mile for its students when the pandemic began. “We were determined that our students would have the equipment necessary to continue work on their projects. So we shipped professional camera packages, lighting and grip gear, as needed, to students in their various locations.”

At New York City’s School of Visual Arts (SVA), “the value of experience, insight, advice and critique that our faculty bring to our students is fully present in our online classes, and ensures that a Zoom session can offer a valuable learning experience,” reports Jimmy Calhoun, Chair of BFA in Computer Art, Computer Animation and Visual Effects at SVA. In addition, “Our alumni and industry friends that work outside of New York City have been able to join us for guest lectures and workshops, and our industry thesis screening jury increased from 50 professionals to over 200 participants from around the world this past spring. As we look for ways to add value to what we are doing online, we find new things that we will retain when we return to campus.

“Zoom is certainly our go-to software for connecting students and teachers in real-time,” explains Calhoun, “and our


TOP: Professor Fred Spector gives a virtual class inside SCAD’s Gulfstream Design Center. Students have enjoyed the recordability of Zoom sessions as they could revisit sections of a class later. (Image courtesy of Savannah College of Art and Design) MIDDLE: Husband-and-wife School of Cinematic Arts professors Mike Patterson and Candace Reckinger teaching a remote Animation Design & Production class for grad students. The couple also co-directed a string of acclaimed music videos for Suzanne Vega, Sting and Paula Abdul, among other high-profile projects. (Image courtesy of the USC School of Cinematic Arts) BOTTOM: Vancouver Film School Makeup Design for Film & Television student Aerien Steadman works on a clay sculpture after limited groups of students resumed campus activities last August. (Image courtesy of Vancouver Film School)

instructors have found new and better ways to take advantage of our learning management software, Canvas. Our students and faculty have always used professional tools to both create and track their productions. Our reliance on Shotgun to monitor our students’ progress has been a huge benefit.”

Calhoun also expresses his continuing gratitude towards “companies like Autodesk and Epic Games, who make their software always free to students, and we are thankful [to] other software companies like SideFX, Avid and Foundry that have also been very supportive to our students in providing them access to the tools they need to create from home.”

The coronavirus situation has been different in every country and subject to rapid change. According to Richard Southern, Head of Department, National Centre for Computer Animation at Bournemouth University in England, the school will provide online classes through January, but will also open some facilities to students. “A small proportion of our teaching will necessarily need to take place in specialist facilities, such as greenscreen and camera training,” he elaborates, “although special measures are in place for the safe shared use of equipment.”

He explains that animation and VFX students have also had physical access to a proportion of the physical workstations. “The remaining workstations are available via a remote/virtual desktop solution” and all production software is “available to students via VPN should they have the personal equipment to use it.” Zoom, MS Teams, Shotgun and SyncSketch are platforms used by the school, among others.

Southern recalls that at first there were several specific problems to resolve. One example was that “the virtual/remote desktop access compatibility with OpenGL was a challenge.” Southern explains that it was also challenging in the shift online to maintain “cohort culture,” which he considers one of the most valuable components of the school’s programs. The promotion of the use of collaborative platforms for peer contributions became “increasingly important – for example, the use of MS Teams for peer-assessed project dailies or pitches, and Shotgun for project management and collaboration,” he says. “In my experience the ability to screen share and work collaboratively on documents via MS Teams and Shotgun has actually improved small group collaboration and supervision.”

When adapting studies to the pandemic, SCA’s Cheng states, “We have to lead by example and show [students] how to pivot in an uncertain world under impossible circumstances. This is what professionals in our industry do all the time, so these lessons help prepare our students for the working world when they leave USC.”

InFocus Film School’s Rodriguez observes, “These are difficult times for everyone, but it’s also a great opportunity to look into developing your career. People will keep watching shows, movies and playing video games, much more so during these crazy times. That means more work needs to be done, more hands and talent are needed.”



TECH & TOOLS

HOW TO COMP A SHOT IN THREE DIFFERENT TOOLS By IAN FAILES and MARIJN EKEN

TOP: Using the PlanarTracker node in Nuke. BOTTOM: Compositor Marijn Eken, who carried out the composite in Nuke, Fusion and After Effects.


Visual effects artists already know there are often many ways you can pull off a particular shot via different filming methods and with different tools and techniques. That’s certainly the case with compositing, where several pieces of software and compositing workflows can be used. Here, visual effects artist Marijn Eken – who has worked as a compositor at studios including DNEG, Scanline VFX and RISE | Visual Effects Studios, and currently teaches at the Netherlands Film Academy – explains how he might tackle compositing of the same shot in three separate packages. These are Foundry’s Nuke, Blackmagic Design’s Fusion (inside DaVinci Resolve) and Adobe After Effects, arguably the three most accessible compositing tools.

The elements supplied to Eken were stock footage of a woman pinch-zooming a greenscreen and marker-covered tablet, and a photograph of the space shuttle Endeavour taken at the California Science Center, with the idea being that she would be zooming into an area of the photograph on the device.

In coming up with a methodology for approaching this relatively simple composite in the three different packages, Eken generally followed these steps:

1. Track the four corners.
2. Use the corner pin method to composite the image over the screen.
3. Track the two fingers on the screen and use that to position and zoom the image.
4. Apply grading on the inserted image to match the black levels and white point.

5. Use the greenscreen and some roto to separate the hand and composite it on top.

These overall steps represent just one compositor’s view about how the elements could be composited together, along with some of the nuances involved in each of these common compositing tools.

Step 1. Tracking

In general, for accuracy, it’s best to use as large an area as possible for tracking. Here, the hand comes across the screen at some point and obscures part of the screen and the lower right corner, so we need to work around that. For Nuke and Fusion, we can use a Planar Tracker. It’ll use a large surface area to track, and even though the lower right corner is obscured at some point, we can get an accurate location for that corner at all times. See below for what works best for After Effects.

Nuke: We use the PlanarTracker node to create a roto shape that covers as much area as possible, without using any of the surface that is later covered by the hand. Just pressing the Track Forward button was enough to make this work in one go.

Fusion: We use the Planar Tracker which starts in Track mode and directly allows us to draw a roto shape for the area we want to track. After the tracking is done, you have to switch the Tool’s Operation Mode to Corner Pin. Then you’ll get a rectangle that you can modify to place the corners in the correct location on one frame, and then it’ll follow the screen in subsequent frames.

After Effects: You could do a planar tracker, but it involves going

TOP: The Planar Tracker in Fusion. BOTTOM: Perspective Corner Pin track in After Effects.






to the included Mocha application. That’s a bit involved, so for this demonstration I opted to stick with the built-in tools. That means we’re doing a perspective corner pin track using four trackers. Three of those we can place on the actual corners, but the bottom right one is obscured by the hand at some point, so we place it on the closest tracking marker on the screen. That’s not ideal, since we won’t be tracking the actual corner of the screen, and this will cause inaccuracies with the position of that corner. With a control-drag we can position the corner pin point on the actual corner.

Step 2. Corner pin

Nuke: We can select our PlanarTracker and choose Export > CornerPin2D (absolute). This will create a CornerPin2D node that does the work for us of warping the space shuttle image to match the screen. Our tracking data was created on an image of 1920 x 1080 (our source footage), but the image to be inserted has a resolution of 3024 x 4032. To make the corner pin work, we use a Reformat node, to ‘distort’ the space shuttle image into the 1920 x 1080 resolution, before applying the corner pin. Nuke is smart enough to not actually do two transforms back-to-back. It would be scaling the image down two times, which would result in a loss of image quality. Nuke uses what is called concatenation to gather all the transformations first, and only applies them once at the end. Once we have our properly distorted image, we use a Merge node to composite it on top of our footage.

Fusion: We connect the space shuttle image to the special ‘Corner Pin 1’ input of the Planar Tracker. Fusion takes care of the resolution differences and compositing of the images. In fact, it uses a unique system of always describing coordinates and positions in a range from (0,0) to (1,1). Since the corner pin is warping the (0,0) and (1,1) coordinates (the corners) to the correct screen locations, this always works, regardless of resolution.

After Effects: From the Tracker side panel, we select a Motion Target (our space shuttle image layer that we brought into the composition) and press Apply. This creates a Corner Pin effect on the space shuttle layer, with animated corner points, which squeezes the image to fit inside the four corners. Resolution differences are taken care of by After Effects.
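Under the hood, a corner pin is a planar homography fitted to four point pairs. The Python sketch below shows the standard direct linear solve – our own illustration of the math, not code from Nuke, Fusion or After Effects, and the tracked corner coordinates are invented for the example:

```python
import numpy as np

def corner_pin_homography(src, dst):
    """Solve the 3x3 homography that maps four src corners onto four dst
    corners -- the math behind a corner-pin warp."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)  # fix the last entry to 1

def warp_point(H, x, y):
    """Apply the homography to one pixel position (perspective divide)."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

# Map the 3024 x 4032 shuttle photo onto four tracked screen corners
# (the corner positions here are made up for illustration).
H = corner_pin_homography(
    src=[(0, 0), (3024, 0), (3024, 4032), (0, 4032)],
    dst=[(612, 204), (1310, 228), (1295, 861), (598, 842)])
print(warp_point(H, 1512, 2016))  # the photo's center lands on the screen
```

Concatenation, as Nuke does it, then amounts to multiplying such matrices together so the image is only resampled once.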


Step 3. Finger tracking

The hand moves over the screen and makes a pinch motion. To translate that action into actually zooming into the content was the most difficult part of this exercise. I took the easiest approach in Nuke, which worked straight away. The other packages, though, couldn’t do the same, so I had to come up with some tricks with expressions, which are similar between Fusion and After Effects, but not completely.

Nuke: I duplicated the CornerPin node that we already had and turned on the ‘invert’ checkbox. This inverts the corner pin operation, filling our screen with the tablet’s screen. The next step was to create a normal tracker node (not the PlanarTracker) and track the two fingertips moving apart. A reference frame had to be chosen (frame 40) to be the frame at which no zooming or translation would be in effect. By setting the tracker transform mode to ‘match-move,’ and ticking the Translate and Scale options (not the Rotation), the space shuttle image would follow the scale (distance between the fingers) and position (average of the two fingertips). By inserting the Tracker node before the CornerPin2D node, this ensured the scaling would be applied in the correct ‘domain.’ Because the image is enlarged by zooming in, it was necessary to add a Crop node to remove the edges of the image that should not be visible beyond the screen of the tablet.

Fusion: Using a Tracker Tool, we track the two fingertips. After the MediaIn Tool (which reads in the space shuttle image), we place a Transform Tool that we use to move the image around to position it where we need it. We also zoom in a little bit. Next is another Transform Tool we called Scale. This has an expression to do the scaling based on the movement of the fingers. We need to calculate the distance between the two points using good old Pythagoras. We divide this number by the smallest distance when the fingers first touch the screen. That way the scale will be 1 at that point, and scale up accordingly with the distance between the fingers.

That takes care of the zooming, but not the position yet. To move the space shuttle image with the average position between the fingers, we need to do the following math on another Transform Tool, which we’ll call TrackAverage. We basically add the coordinates of the two fingertips and divide by two to get the average, but because the tracked footage has a resolution of 1920 x 1080 and the space shuttle image is a different resolution (3024 x 4032), we need to scale by the fractions between those x and y resolutions.

After Effects: To apply the zooming of the image, we need to do this before the corner pin is applied, otherwise we’d already have lost resolution if zooming in after the corner pin effect. To do this effectively, we have to Pre-Compose the image layer. Inside this new composition we need to create an expression on the Scale parameter of the layer. This expression is quite simple. It takes the two tracked fingertip points and calculates the distance between them with the ‘length()’ function. This distance is a number in pixels, so we have to ‘convert’ that into a scale. We determine that the smallest distance is 63.6 pixels (when the fingers touch the screen). If we divide by that number, the scale will be 100% when the fingers touch the screen. When they move away from each other, the scale will increase exactly by the right amount.

However, this doesn’t take care of the position just yet. When the fingers move apart, the position in between the fingers is used to move the image around. To mimic that, we create an expression on the Position parameter of the layer in the main composition. It takes the original position and adds some motion on top. We take the average of the two tracked fingers by adding the two tracked positions and dividing by two. Adding this would create an offset, though, so we counteract that with a Track Offset parameter so that at the moment the fingers touch the screen, no extra motion is applied.
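The expression logic is the same in spirit across all three packages. As a neutral Python sketch – using the 63.6-pixel touch distance from the After Effects version, with invented fingertip positions – the scale and pan work out like this:

```python
import math

def pinch_zoom(f1, f2, base_dist=63.6):
    """Scale and pan from two tracked fingertips. base_dist is the
    fingertip distance at the touch frame (63.6 px in the AE example)."""
    dist = math.hypot(f2[0] - f1[0], f2[1] - f1[1])  # good old Pythagoras
    scale = dist / base_dist  # 1.0 at the moment the fingers touch
    center = ((f1[0] + f2[0]) / 2.0, (f1[1] + f2[1]) / 2.0)
    return scale, center

# Invented tracker values: the touch frame, then a later frame.
scale0, center0 = pinch_zoom((940.0, 540.0), (1003.6, 540.0))
scale1, center1 = pinch_zoom((900.0, 520.0), (1060.0, 560.0))

# Subtract the touch-frame center so no extra motion is applied at the
# moment the fingers meet (the 'Track Offset' idea described above).
pan = (center1[0] - center0[0], center1[1] - center0[1])
print(round(scale0, 2), round(scale1, 2), pan)  # 1.0, ~2.59, (8.2, 0.0)
```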

OPPOSITE TOP TO BOTTOM: Using a CornerPin2D node in Nuke. Changing the Planar Tracker mode to ‘Corner Pin’ in Fusion. Creating a Corner Pin effect on the space shuttle layer in After Effects. Dealing with the finger tracking in Nuke. THIS PAGE TOP TO BOTTOM: Fusion Transform Tool’s math formula calculates distance between two points. Finger tracking in Fusion. Finger tracking and expression creation in After Effects. Pushing the gain up in Nuke to check the grading.






Step 4. Grading

To better integrate the space shuttle image into the final composite, we need to match the color of the image to the main footage.

Nuke: We can use a Grade node directly after the Read node to control the color of the image that is inserted. We can select the darkest black for the ‘blackpoint’ and the brightest white for the ‘whitepoint’ in the space shuttle image. If we then select the darkest black in our footage for the ‘lift’ parameter and the brightest white for our ‘gain’ parameter, Nuke matches the blackpoint and whitepoint. This does need some manual tweaking, but it’s a starting point. To illustrate what this does, it’s hard to see without over-exaggerating the effect. By pushing the gain up on the viewer, you can see a before and after of what the Grade node does.

Fusion: The process is similar to Nuke, but we don’t have the blackpoint/whitepoint feature. Using the Color Gain Tool, we can dial in the values by eye. But if we want to judge the darker levels, we need to use a Color Correct node to increase the gain, because the viewer doesn’t have a slider to do that (like Nuke and After Effects do). It’s important to check the ‘Pre-Divide/Post-Multiply’ option; otherwise, the lifting of the blacks will lift areas outside of our image too.

After Effects: On the layer that holds our image, we can apply a ‘Levels (Individual Controls)’ effect to change the color. It does have a feature similar to Nuke’s with the ‘Input Black’ and ‘Output Black’ parameters, but you can’t use a color picker with those, so it makes less sense to use them. So I just did it by eye in this case.
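The blackpoint/whitepoint match that Nuke automates boils down to a per-channel linear remap, which the other packages reach by eye. A minimal sketch of the underlying formula (our restatement – the real Grade node layers gamma, multiply and offset controls on top), with invented sample values:

```python
def grade_match(value, blackpoint, whitepoint, lift, gain):
    """Remap so the insert's blackpoint lands on the plate's lift and its
    whitepoint on the plate's gain (applied per channel)."""
    normalized = (value - blackpoint) / (whitepoint - blackpoint)
    return normalized * (gain - lift) + lift

# Hypothetical sampled values: darkest/brightest pixels in the shuttle
# photo (blackpoint/whitepoint) and in the plate (lift/gain).
for v in (0.02, 0.50, 0.91):
    print(grade_match(v, blackpoint=0.02, whitepoint=0.91,
                      lift=0.05, gain=0.84))
# -> 0.05, ~0.48, 0.84: the insert's range now matches the footage.
```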

TOP TO BOTTOM: Grading step in Fusion. The ‘Levels (Individual Controls)’ effect in After Effects. Using the Keylight node for roto in Nuke.

Step 5. Greenscreen

We now basically have everything working correctly, but our image covers the hand, so we need to fix that. There are multiple ways to achieve this, but the easiest – and I think best – way is to just layer the hand on top. Luckily we have the green to help us with that, otherwise we would have to use rotoscoping to do it. But sadly, there are markers on the screen, so we still need a little bit of roto.

Nuke: For this example we’ll just use a Keylight node. Simply select the Screen Color and set the view to Final Result. With the Roto node we create a very simple shape around the hand that only needs to be accurate around the thumb area, where the tracking marker is. We Merge the result over the image, using the Mask input to limit the effect.

Fusion: For the keying we’ll use an Ultra Keyer Tool. Using the color picker to select the Background Color quite quickly gives us a satisfying result, after tweaking a few values. The UltraKeyer has a separate Garbage Matte input that we can use with the B-spline Tool to mask out a rough roto for the hand with more accuracy at the thumb, where we need it. Using the Merge Tool, we combine this isolated hand with the image we had.

After Effects: We duplicate our footage layer to composite our keyed hand on top. Then we apply the Keylight effect and select the Screen Color. On this layer we also create a Mask, for which we have to set the Mode to None. The mask is used inside the Keylight effect by selecting it as an Outside Mask and checking the Invert checkbox.
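Whatever the package, Step 5 reduces to one compositing equation: the keyed hand, limited by the roto garbage matte, goes back over the inserted image. A schematic NumPy sketch (simplified – real keyers also despill and soften edges, and the pixel values here are made up):

```python
import numpy as np

def layer_hand_on_top(plate, insert, key_alpha, hand_roto):
    """Put the corner-pinned insert over the screen, then bring the keyed
    hand back on top. key_alpha/hand_roto are 0..1 single-channel masks."""
    a = (key_alpha * hand_roto)[..., None]  # limit the key to the roto shape
    return insert * (1.0 - a) + plate * a   # the hand wins where matted

# Toy 1x2-pixel frame: left pixel is greenscreen, right pixel is the thumb.
plate  = np.array([[[0.1, 0.8, 0.1], [0.7, 0.6, 0.5]]])  # green vs. skin
insert = np.array([[[0.3, 0.3, 0.3], [0.3, 0.3, 0.3]]])  # shuttle photo
key    = np.array([[0.0, 1.0]])  # keyer alpha: 1.0 where the hand is
roto   = np.array([[1.0, 1.0]])  # rough shape drawn around the hand
print(layer_hand_on_top(plate, insert, key, roto))
# left pixel shows the insert; right pixel keeps the hand from the plate
```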



VR/AR/MR TRENDS

BEAT SABER, HALF-LIFE: ALYX AND STAR WARS BOLSTER VIRTUAL REALITY By CHRIS McGOWAN

Virtual reality had its ups and downs in 2020. The pandemic boosted VR home usage, but disrupted headset supply chains in Q2. In addition, theme parks and other location-based entertainment centers shut down, closing many VR attractions for part of the year. SuperData (a Nielsen company) projects total worldwide VR hardware and software revenue for 2020 of $3.2 billion, a slight drop from $3.3 billion the previous year, with that figure rising to $3.9 billion in 2021 and $6.2 billion in 2023.

Among the bright notes: big tech firms continue to invest heavily in VR, hardware keeps improving, and breakthrough titles like Beat Saber, Half-Life: Alyx and Vader Immortal: A Star Wars VR Series, along with a wide variety of new experiences, are luring new users.

BEAT SABER

TOP: A player wearing virtual reality gear slashes the music beats with light sabers in the VR game Beat Saber, created by a small team of Czech video game developers. (Image copyright © 2018 Beat Games, Oculus Studios and Facebook) OPPOSITE TOP: The Azumel bartender Seezelslak will bend your ear with wild stories when you visit his cantina. Concept art for Star Wars: Tales from the Galaxy’s Edge. (Image copyright © 2020 Lucasfilm and ILMxLAB) OPPOSITE BOTTOM: Fortress Vader on the planet Mustafar. From Vader Immortal: A Star Wars VR Series. (Image copyright © 2019 Lucasfilm and ILMxLAB)


Beat Saber has been something of a killer app for more than two years now and continues to inspire people to purchase their first virtual reality headsets. It is an addictive VR “rhythm game” in which the player slashes the cube-like “beats” of various pop songs with a pair of light sabers – kind of like Guitar Hero meets Star Wars. It was initially developed by Czech programmers Ján Ilavský and Vladimír Hrinčár, who had been building games together since high school and had published Chameleon Run, a mobile game that won an Apple Design Award in 2016. After that, they decided to build a game for VR and posted

a demo of Beat Saber on Facebook, which was seen by composer Jaroslav Beck, who was living in Los Angeles. When he discovered that Ilavský and Hrinčár were also Czech, he flew to Prague to see them and convinced them to let him create the soundtrack for the new game. He also pushed them to finish it. Together, they formed Beat Games. They released an “early access” version of Beat Saber in May 2018, with an official “full release” coming a year later (Beat Games was purchased by Facebook through Oculus Studios in November 2019).

Beat Saber comes with 10 songs and can be expanded with downloadable content (DLC) packs containing multiple songs. In addition to Beck, artists include Imagine Dragons, Panic! at the Disco, Green Day, Linkin Park and Timbaland. Beat Saber won many video game awards and in 2019 became the first VR-only game on Steam’s Top Sellers list (in the bronze category). To date, it

has sold some two million copies, along with 10 million songs as DLC, according to Beat Games, and its gamer videos have tallied over two billion total views on YouTube. HALF-LIFE: ALYX

Half-Life: Alyx is another current VR killer app: a virtual-reality, first-person shooter developed and published by Valve Software. It required a four-year production involving 80 people and launched March 23, 2020, to nearly universal acclaim. SuperData claims the title sold 680,000 copies in its first month of release (including pre-orders), generating approximately $40 million in revenue, and Half-Life: Alyx led to a leap of nearly one million users on Steam in April, according to the distribution service.

In the VR game, Earth has been colonized by an alien empire called the Combine. Players control the actions of Alyx Vance, who is in the resistance in the crumbling urban dystopia of City 17 and on a mission to seize the Combine’s superweapon. You interact with the environment and fight enemies, using “gravity gloves” to manipulate objects. The game includes puzzles, combat and survival horror aspects.

Cloudhead Games CEO and Creative Director Denny Unger comments, “Half-Life: Alyx represents the first time an AAA [major publisher] has really put their full weight behind a VR title. It’s not a port, it’s built from the ground up for VR and [for] what makes VR different. It’s beautiful and polished in the ways that consumers care about, and it’s lengthy. This is a signal to other AAAs that the time is now, that the consumer appetite for VR is real, that the technology works and isn’t a gimmick, that it is something groundbreaking. While many AAAs have passed on VR, we’re reaching a critical point that will be impossible to ignore.” Unger’s Cloudhead Games has published the popular Pistol Whip VR game and The Gallery titles.

“As a developer, [I thought that] Half-Life: Alyx was extremely exciting, as it’s the largest VR production I’ve experienced, and truly exemplified a level of production value in VR that we’ve only been dreaming of. My hope is that developers working in the VR space use it as a benchmark of what VR experiences should be,” comments Keith Guerrette, Founder and Studio Director of Beyond-FX, a VFX design studio that focuses on immersive, interactive visual effects for real-time entertainment.

STAR WARS IN VR


Vader Immortal: A Star Wars VR Series brought the film franchise to the format with an impressive high-end production. The three-episode, short-form narrative VR game was created by ILMxLAB. With cinema-quality visuals from ILM, it launched in May 2019 for Oculus, winning many awards, and was released in August 2020 for PlayStation. In Vader Immortal, you step inside your own Star Wars story as a smuggler operating near Mustafar, the fiery world that is home to Darth Vader. When you are unexpectedly pulled out of hyperspace, you find yourself uncovering an ancient mystery at the behest of Vader himself. The series includes a lightsaber dojo mode, where you can spend hours honing your skills.

“Vader Immortal filled a new storytelling niche in VR that people responded well to. We spent a lot of time working on the balance between the interactive parts and the more passive story moments. In the end, it flows well between the two,” says Mark Miller, ILMxLAB Executive Creative Producer and Executive Producer on the project. “Graphics quality – meaning character look and animation, lighting and environments – was also a place where we looked to separate ourselves, both on the Rift and Quest versions, and our team did an amazing job of doing just that.” He continues, “Our success will continue to expand the market and make space for projects that incorporate great interactive game mechanics and compelling narrative. The PlayStation VR platform has allowed us to reach a broad new audience of Star Wars fans who’ve been eager for us to come to that platform.”

Star Wars: Tales from the Galaxy’s Edge is another original VR experience from ILMxLAB, developed in collaboration with Oculus Studios and scheduled to launch soon. It involves adventures, stories and characters connected to the Black Spire Outpost, associated with the Star Wars: Galaxy’s Edge land in Disneyland and Disney World. “I look for a continued rise in adoption rates and hopefully a spike in interest brought by Vader Immortal and Star Wars: Tales from the Galaxy’s Edge,” adds Miller. The Star Wars VR titles, along with Walt Disney Animation Studios’ virtual reality short Myth: A Frozen Tale, which explores a world inspired by the feature film Frozen 2, are venturing into new territory: virtual reality experiences that immerse viewers in settings and stories associated with popular films and – in the case of Galaxy’s Edge – theme parks.

THE MANY VARIETIES OF VR EXPERIENCES

There may not be enough killer game/entertainment apps currently available in VR to make it mainstream, but the format is expanding steadily into a variety of uses. Social VR platforms like VRChat, Bigscreen VR and AltspaceVR (owned by Microsoft) are creating unique experiences that could attract millions of users who want to take video conferencing into a new dimension, engage in diverse activities in VR or just hang out with others in virtual reality. VR also seems likely to be heavily used in education, creativity, medicine, research, training, travel, real estate and other businesses. It could turn out that VR goes mainstream in both predictable and as yet unimagined ways.

VR 2021

Strong demand for existing popular headsets should push the format briskly forward this year, as may interest in new headsets like the HP Reverb G2 (scheduled for 2020) and PlayStation VR 2 (release TBA). Apple appears to be working on its own AR/VR headset, possibly launching this year. And the advent of 5G, the next generation of wireless technology, is expected to provide lower latency and more stable connections that will greatly improve the performance of wireless VR headsets. Amazon’s Prime Video VR app, available for Oculus Quest and Samsung Gear, should also significantly expand the audience for virtual reality. Initially, Prime Video VR will migrate the company’s existing content into VR headsets, as well as launch original 360-degree videos.

Tuong Nguyen, Senior Principal Analyst for Gartner, Inc., a research consulting firm, sees three “Cs” as necessary before VR goes mainstream: “Content – more breadth and depth of content. Convenience – accessible, affordable, multipurpose devices to experience VR. Control – improvement in the interface and experience. Things like 6DOF for VR experiences. Best practice/combinations of UI for interaction in VR.”

There was still more promise than performance with VR in 2020. Yet despite a sluggish year for the industry as a whole, Beat Saber, Half-Life: Alyx, Vader Immortal, social VR and apps in diverse fields are showing us that virtual reality may soon have its day in a bigger spotlight – indeed in many bigger spotlights.

OPPOSITE TOP TO BOTTOM: Seezelslak is not just a bartender and owner of a cantina – he will also give you missions to complete. Concept art for Star Wars: Tales from the Galaxy’s Edge. (Image copyright © 2020 Lucasfilm and ILMxLAB)
The power-hungry pirate Tara Rashin leads a cell of the fearsome Guavian Death Gang. Concept art for Star Wars: Tales from the Galaxy’s Edge. (Image copyright © 2020 Lucasfilm and ILMxLAB)
The alien Combine governs the Earth from atop this Citadel in City 17, which you climb in the high-production VR game Half-Life: Alyx. (Image copyright © 2020 Valve Software)
THIS PAGE TOP TO BOTTOM: Half-Life: Alyx is a VR first-person shooter in which you guide the actions of the heroine Alyx as she fights the alien Combine that has taken over Earth. (Image copyright © 2020 Valve Software)
In Half-Life: Alyx, you must battle relentless alien Combine soldiers wearing hazmat suits and riot gear. (Image copyright © 2020 Valve Software)
Cross swords – or light sabers in this case – with the legendary Darth Vader in the VR experience Vader Immortal: A Star Wars VR Series. (Image copyright © 2019 Lucasfilm and ILMxLAB)


[ THE VES HANDBOOK ]

Real-Time Motion Capture
By JOHN ROOT

Edited for this publication by Jeffrey A. Okun, VES. Abstracted from The VES Handbook of Visual Effects – 3rd Edition, edited by Jeffrey A. Okun, VES and Susan Zwerman, VES.

Real-time motion capture refers to the capture, processing and visualization of motion capture data as it happens. It usually manifests as one or more large screens displaying solved – and possibly retargeted – data on a digital character in a digital environment. Because real-time motion capture forgoes complex post-processing, it usually comes with some limitations and/or artifacts. Most often, real-time is used as a powerful previsualization tool that immerses directors and DPs in the virtual world.

Real-time motion capture at House of Moves. (Image courtesy of House of Moves)

Real-Time Uses
Real-time can be a very useful tool on the motion capture stage. Its uses include, but are not limited to, the following:

Character Identification
When using motion capture, an actor is usually playing a character very different from themselves in proportion and style. For this reason it can be invaluable for actors to see their performance as it will look on the character. The actor can then work with the director to account for the differences and incorporate them into the performance.

Virtual Cinematography
By deploying a virtual camera and/or pointing device, the director or DP can virtually move around a 3D set and experiment with lenses, composition and lighting. The freedom of the virtual environment means that cameras can be placed and moved in ways not possible on a live-action set. Issues with a set can be discovered and fixed. Light, color, time of day and a variety of other factors are all completely under the control of the director while the performance takes place.

Character Integration
A motion capture stage is often sparse with regard to props and set decoration. For this reason, it can be useful for actors to see themselves in the virtual environment. This allows for a sense of space and an awareness of one’s surroundings.

Live Performances
Real-time capture can also be used as a final product. Real-time puppeteering of characters and stage performances are becoming common.

Real-Time Limitations
Real-time motion capture gets better every year. To date, however, significant compromises must be made to achieve real-time.

Line of Sight
Optical motion capture is computer vision-based, meaning the cameras must have an unobstructed view of the markers. Because real-time MoCap forgoes post-processing, it is common to reduce the number of actors, limit the props and govern the performance to maximize marker visibility. It would be very hard, for instance, to generate real-time data for 13 actors rolling around on the floor.

Markers
Trying to make sense of the markers on the stage in real-time is a daunting task. Beyond simply using fewer markers, introducing asymmetry into the marker placement will make markers easier to identify.

Solving
Solving methods that are currently possible in real-time often involve approximations and limitations, yielding a solve that does not look as good as an offline solve.

Visualization
Real-time rendering technology is catching up with offline rendering technology at a radical pace. Some games look close to feature-film quality. Using real-time game rendering technology to previs a film is becoming common practice, but one that requires significant investment. Off-the-shelf solutions, such as Autodesk’s MotionBuilder, offer a complete turnkey package for visualizing motion capture in real-time.

Alternate Technologies
If real-time is a requirement, passive optical technology might not be the best choice. Passive optical is known for its accuracy, not for its real-time ability. Some optical systems, such as Giant and Motion Analysis, are quite good at real-time considering the limitations inherent in the technology. At the expense of quality, alternate technologies can deliver more reliable real-time. (See page 286 of The VES Handbook of Visual Effects – 3rd Edition, “Which Technology is Right for a Project?”, for alternate technologies that may be better suited to real-time capture.)
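To make the capture, solve, retarget and visualize loop concrete, here is a minimal, self-contained Python sketch. Everything in it is toy data and deliberately naive placeholder logic – no specific MoCap SDK or solver is implied; a real pipeline would replace each function with vendor calls and a proper skeletal solver.

import time

# Fake "capture": each frame is a dict of labeled marker positions (x, y, z).
def capture_frame(t):
    return {"hand_L": (0.5, 1.0 + 0.01 * t, 0.0),
            "hand_R": (-0.5, 1.0, 0.0),
            "head": (0.0, 1.7, 0.0)}

# Naive "solve": map each marker straight onto a joint. A real solver fits
# a full skeleton to many markers and copes with occlusion and mislabeling.
def solve(markers):
    return dict(markers)

# Trivial "retarget": uniformly scale the actor's pose onto a character
# with different proportions (here, a character 1.25x the actor's size).
def retarget(skeleton, scale=1.25):
    return {joint: tuple(c * scale for c in pos) for joint, pos in skeleton.items()}

for t in range(3):                      # three "frames" of the live loop
    character_pose = retarget(solve(capture_frame(t)))
    print("frame", t, character_pose)   # stand-in for the render/display step
    time.sleep(1 / 60)                  # hold roughly to a 60 fps budget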



[ VES SECTION SPOTLIGHT ]

VES Sections Go Virtual
By NAOMI GOLDMAN


During this challenging year, VES members have come together to keep our global community strong. VES Sections have adapted with great creativity, delivering compelling online experiences and content to replace live events. Programs this year have included virtual happy hours, educational programs, networking opportunities and screenings – including an inspired series of virtual group film screenings. Sections have also compiled resources to keep members abreast of COVID-19 updates, highlighted regional resources and developed new interactive communication tools. Here are some highlights from Section happenings.

In early spring, VES Georgia thought outside the box and collaborated with VES New York and VES members in Florida and Louisiana to host a virtual screening – and the series of East Coast Netflix Parties was born. Mid-summer, they held an educational webinar, “Capturing the Power of the Cloud,” with Amazon Web Services, and in the fall they worked to develop tools to provide additional forums for communication, including a Discord channel, which also provides resources and assistance for job-seekers. “We have been hosting webinars and virtual screenings for our membership to keep a sense of camaraderie and reinforce that we are there for one another, and the events have been well received,” says Rob Wright, VES Georgia Co-Chair. “In conjunction with a number of our sister sections, the East Coast Netflix Parties have been a lot of fun and very successful in helping keep everyone’s spirits up – and as an opportunity to network and invite in prospective new members. Thanks to VES HQ, there was a component built in where the members voted on the films, and the ones we showed included The Matrix, Raiders of the Lost Ark and Jurassic Park, which was great, because these were the very films that inspired some of our members to get into the industry.”

VES Germany was one of the first to put forth a comprehensive COVID-19 preparedness resource page for members, with financial aid resources, work-from-home tips and industry news. In the spring, they held an interactive online meeting with TRIXTER, Royal Penguins, Scanline VFX, Rodeo FX, D-Facto VFX, RISE Visual Effects Studios, LAVAlabs, bEpic, ARRI Media and Storz & Escherich, which focused on adapting to working remotely, business continuity and industry support. As restrictions allowed some things to open up, VES Germany held an end-of-summer BBQ in Stuttgart and a Summer Feast in Berlin. In late October, the Section held its virtual “MegaBrain Masterclass” Volume 4 educational series, featuring top industry professionals from around the globe.

In early spring, VES Washington modified their “Creative Spotlight Series” into an interactive webinar tackling how COVID-19 has affected the industry, locally and globally. Subsequent webinars provided educational content for both the membership and future artists through their long-standing partnership with The Academy of Interactive Entertainment (AIE). “We plan to continue to offer webinars and work with AIE to add value for their students,” states Todd Perry, VES Washington Board of Managers. “We believe in mentoring future VFX professionals, so through this partnership we’re providing portfolio reviews and other mentorship opportunities. As we continue, we are looking to bring in VFX professionals who work on our chosen films, to talk about aspects people may not have known about.”

VES Montreal was also among the first to put forth a comprehensive COVID-19 resources webpage for members. The Section was a proponent of offering training on unconscious/implicit bias, and was integral to VES global delivering an insightful webcast conversation this summer featuring Tessa Blake and Nancy Malone, Director of Inclusion Initiatives at the American Film Institute. “When we were able to, we coordinated some impromptu meetups in parks to offer some social time,” remarks Philipp Wolf, VES Montreal Co-Chair. “But as COVID-19 has had an accordion effect with the environment opening and closing, we are looking into drive-in movies to supplement our fun East Coast Netflix Parties with our colleagues in New York, Georgia, Florida and Louisiana until theaters can safely open. We’re all working to keep our members connected and feeling supported however we can.”

VES Toronto resumed their Speaker Series with special guest, Oscar-winning Visual Effects Supervisor and Director Ian Hunter, discussing his phenomenal career, his passion for practical effects and their marriage with digital techniques. In early fall, they took virtual meetups to a whole new level with their HiFi Pub Night hosted on High Fidelity.

VES New Zealand revamped its series of Teq Talks, a tequila-infused version of innovation jams where technology, entertainment and design converge, featuring an eclectic array of creative thinkers; the first in the series was “Virtual Production in Unity.” The Section also took advantage of the country safely reopening and secured a new screening venue just in time for the opening weekend release of Tenet.

In October, the Bay Area Section held “FAANG 2” (named for the five most popular and best-performing American technology companies), highlighting what VFX artists can expect when they cross over into tech. In November, they held a portfolio review in which seasoned VES members reviewed the portfolios of graduating or about-to-graduate students from area VFX and animation programs, giving them connections and face time with companies who may be future employers.

VES Australia continued its collaboration with The Australian Cinematographers Society, holding a series of online educational forums, including “ACS Webinar: On the Applebox” with Dylan River and Ari Wegner, ACS, and a special Q&A discussion with Cinematographer Charlie Sarroff and Director Natalie Erika James on their debut feature film, Relic. As Australia’s conditions have allowed some in-person events, the Section held in-theater screenings of The Eight Hundred and Tenet.

VES London hosted a series of educational webinars in conjunction with CAVE Academy, including “Developing a Character,” featuring motion performance expert Ace Ruele; “Unity for Animated Storytelling” and “Unity Visual Effect Graph: Create Beautiful VFX in Real-time,” both with Ben Radcliffe; “Virtual Production with Treehouse Digital,” sharing their in-house solution to virtual production using the Unreal game engine; and “Introduction to Onset Data Acquisition” with Daniel Gilligan and Jahirul Amin.

VES New York was one of the hardest hit by the pandemic, but they’ve come out strong. In addition to the East Coast Netflix Parties, they have held a virtual “Hub Night Pub Night” and a series of educational webinars, such as “Common VFX Workflows with CityEngine and Houdini,” with a talented group of panelists from Esri and One Of Us who discussed the use of CityEngine for Netflix’s The Witcher. The Section also aligned with the Post NY Alliance to hold a two-part webinar, “VFX Solutions to COVID Problems Part I & Part II,” where VFX experts discussed the role of visual effects in helping production and post in a post-COVID landscape.

VES Los Angeles was one of the first to see the value of delivering online content to replace in-person educational events, as well as offering virtual meetups to retain a sense of community. Starting in the spring, the Section held a series of educational webinars. The roster included “Creating Realistic Digital Humans in Film and Television,” “Digital Visualization – Behind the Scenes with The Third Floor,” “LED Stage Production,” “The Sound of VFX” and “Professional Virtual Production from Home.”

“A big part of the VES family is understanding that while we may work with different companies and compete for business, we are part of one big community with shared issues and shared hopes,” says Frederick Lissau, VES Los Angeles Co-Chair. “We’ve been holding online happy hours almost monthly, and the appearances by dogs and cats, kids and family members on the screen have been really fun and helped normalize the situation. We’ve been adding special themes to liven up our virtual meetups, including ‘Zombie Octoberfest,’ a special Halloween Happy Hour, a Pop Culture Trivia Night and giveaways of The VES Handbook of Visual Effects. Meeting up with our friends and colleagues in the Vancouver, Washington and Bay Area Sections to Netflix and chill together has been a silver lining in bringing us even closer together.”

We thank and applaud our Sections from around the world for keeping the flame going and ensuring that our global community remains connected and supported, educated, inspired and entertained as we venture through this new landscape – together.



[ VES NEWS ]

VES Inducts 2020 Honorees at Special Celebration

By NAOMI GOLDMAN

This past December, the VES recognized its 2020 class of special honorees at a virtual celebration hosted by VES Board Chair Mike Chambers. The distinguished practitioners included the Society’s newest Lifetime and Honorary members, inductees into the VES Hall of Fame, the recipient of the 2020 Founders Award, and the VES Fellows – venerated visual effects professionals bestowed with the post-nominal letters “VES.”

VES Fellow: Warren Franklin, VES. Franklin is a global leader of the animation and visual effects industry. As the founder and former CEO of Rainmaker Entertainment, he helped establish Vancouver as an industry hub. Franklin was a key member of George Lucas’ creative and management team, serving as Group Vice President and managing six divisions, including Industrial Light & Magic, LucasArts and Skywalker Sound. During his tenure as Vice President and General Manager of ILM, the company won nine Academy Awards.

VES Fellow: David Johnson, VES. Johnson is the award-winning founder, CEO and Creative Director of Undertone FX, a studio specializing in real-time visual effects for video games and VR/AR. David was the Lead Visual Effects Artist at Activision/Blizzard’s Infinity Ward Studio, creator of the Call of Duty franchise. He is a two-time VES Awards winner, an advisory board member for VFX Voice, a contributing author to Introduction to Game Development and The VES Handbook of Visual Effects, and co-founder of RealTimeVFX.com.

LEFT TO RIGHT: Warren Franklin, VES; David Johnson, VES; Janet Muswell Hamilton, VES; Ken Ralston, VES; Sebastian Sylwan, VES

VES Fellow: Janet Muswell Hamilton, VES. Muswell is currently the Global Director of VFX Production for Netflix and an original member of the VES Board of Directors. With a career spanning several decades, she established her reputation as a VFX Producer and/or Supervisor for cutting-edge visual effects on a wide range of groundbreaking television series and theatrical features, as well as animation, IMAX, commercials and stereoscopic special-venue projects.

VES Fellow: Ken Ralston, VES. Ralston is a VES Lifetime Achievement Award recipient and has earned five BAFTAs and five Academy Awards, including a Special Achievement Oscar for the visual effects in Star Wars: Episode VI – Return of the Jedi and VFX Oscars for Forrest Gump, Death Becomes Her, Who Framed Roger Rabbit and Cocoon. Ralston served as the Creative Head at Sony Pictures Imageworks, and prior to that played a pivotal role in advancing Industrial Light & Magic’s renown over the course of 20 years.

VES Fellow: Sebastian Sylwan, VES. Sylwan is the Chief Technology Officer – Film & Episodic TV at Technicolor. A digital media executive with a passion for technology’s ability to arouse emotion, he develops live-action immersive experiences in addition to the cameras, tools and workflows necessary for their realization. Sylwan is the Chair of the VES Technology Committee and an active member of the Academy’s Sci-Tech Awards Committee, Previsualization Committee and Virtual Production Committee.



[ FINAL FRAME ]

Film Schools – 100 Years of Learning New Ways

Film school studies have taken on a new dimension in the time of COVID, as our article in this issue speaks to. The oldest film school in the world is believed to be the Moscow Film School, founded in 1919 by film director Vladimir Gardin with help from legendary director Sergei Eisenstein. The first film school in the U.S. was launched in 1929 within the University of Southern California with the course “Introduction to Photoplay,” and evolved into the School of Cinematic Arts, which is now ranked as one of the most prestigious film schools in the world, with many notable alumni.

This photo showcases iconic actor Douglas Fairbanks, Sr. leading an early film class at USC. Fairbanks was a key figure in the film school’s founding and in its curriculum development. USC became the first university in the U.S. to offer a Bachelor of Arts degree in film. The school’s founding faculty included Douglas Fairbanks, D.W. Griffith, William C. deMille, Charlie Chaplin, Ernst Lubitsch, Irving Thalberg and Darryl Zanuck. This was at a time when a new technology – ‘talkies’ – was just coming in.

There are now estimated to be more than 300 film schools in the U.S. and several hundred more in Canada, the U.K., Europe and Australia, with many more colleges and universities offering film studies, which means courses in VFX, animation, virtual production and many other related fields. This image captures how it looked 90 years ago at USC, when “learning new ways” meant something very different than it does today – with virtual learning now an essential component of educating the next generation of animators, filmmakers, and creative and technical talent.

Photo courtesy of USC School of Cinematic Arts
