VFX Voice - Fall 2020 Issue


VFXVOICE.COM FALL 2020

TIME AND TENET

WORKING REMOTELY • PRACTICAL EFFECTS • DYNAMIC DUOS • MULAN • NEW TECH • THE EXPANSE • PROFILES: J.D. SCHWALM, SHANNAN LOUIS & MATT WORKMAN




[ EXECUTIVE NOTE ]

Welcome to the Fall issue of VFX Voice! Thank you for being part of the VFX Voice global community. We’re proud to be connected through stories of artistry and innovation all around the world, especially during these difficult times. Filmed entertainment unites us all, especially while we’re apart. So thank you for your inspiring work and demonstrations of support.

In this issue, we go deep inside the resurgence of practical effects. We introduce you to a couple of dynamic duos – Cinematographer Caleb Deschanel, ASC and Visual Effects Supervisor Rob Legato, ASC on their latest collaboration, The Lion King, and Production Visual Effects Supervisor Roger Guyett and Special Effects Supervisor Dominic Tuohy on Star Wars: The Rise of Skywalker. We’ve got TV/streaming features on Watchmen, The Expanse and Solar Opposites, and, on the big screen, we offer features on the much-anticipated films Tenet and Mulan. We profile FatBelly VFX Head of Studio Shannan Louis, Special Effects Supervisor J.D. Schwalm and Cinematographer/Developer Matt Workman.

Our Tech & Tools section looks at case studies in working remotely during the pandemic, as well as how remote technology has swiftly adapted and evolved. We examine Social VR trends and the surge in video games, and we continue to shine a light on our regional sections with our profile of VES India and VFX in the heart of Bollywood.

VFX Voice is proud to be the definitive authority on all things VFX. We continue to bring you exclusive features between issues that are only available online at VFXVoice.com. You can also get VFX Voice and VES updates by following us on Twitter at @VFXSociety.

Mike Chambers, Chair, VES Board of Directors

Eric Roth, VES Executive Director




[ CONTENTS ]

FEATURES
8 VFX TRENDS: PRACTICAL EFFECTS – Inside the major resurgence in the demand for physical effects.
14 FILM: DYNAMIC DUOS – DP AND VFX – DP Caleb Deschanel, ASC and VFX Supervisor Rob Legato, ASC.
18 FILM: DYNAMIC DUOS – SFX AND VFX – SFX Supervisor Dominic Tuohy and VFX Supervisor Roger Guyett.
22 PROFILE: J.D. SCHWALM – Oscar winner followed his father’s career path to new heights.
26 COVER: TENET – Backwards and forwards live together in time-bending thriller.
32 PROFILE: SHANNAN LOUIS – Launching a new VFX studio in the year of the pandemic.
36 FILM: MULAN – VFX furthers storytelling in Disney’s latest live-action remake.
42 TV/STREAMING: THE EXPANSE – Deep space is even more fertile VFX territory in Amazon upgrade.
48 VFX TRENDS: WORKING REMOTELY – Meeting the challenges of relying entirely on a remote workflow.
56 TECH & TOOLS: WORKING FROM HOME – VES Technology Committee assesses the industry’s shift to remote.
60 TECH & TOOLS: NEW TECH – A roundup of recently released innovative tools for VFX artists.
66 PROFILE: MATT WORKMAN – Inventive filmmaker freely shares his passion for virtual production.
70 TV/STREAMING: WATCHMEN – Epic story with surreal effects inspires a highpoint in TV history.
78 VR/AR/MR TRENDS: SOCIAL VR – ‘Hanging out’ virtually offers new possibilities for entertainment.
82 TV/STREAMING: SOLAR OPPOSITES – Bright colors, dark humor play well together in adult cartoon.
86 GAMES: VIDEO GAME SURGE – Gaming enjoys a sharp spike during the pandemic lockdown.

DEPARTMENTS
2 EXECUTIVE NOTE
90 VES SECTION SPOTLIGHT: INDIA
92 THE VES HANDBOOK
94 VES NEWS
96 FINAL FRAME: BITZER & GRIFFITH

ON THE COVER: John David Washington is “The Protagonist” in Christopher Nolan’s Tenet. (Image courtesy of Warner Bros. Pictures)



FALL 2020 • VOL. 4, NO. 4

WINNER OF THREE FOLIO AWARDS FOR PUBLISHING EXCELLENCE.

VFXVOICE

Visit us online at vfxvoice.com

PUBLISHER Jim McCullaugh, publisher@vfxvoice.com
EDITOR Ed Ochs, editor@vfxvoice.com
CREATIVE Alpanian Design Group, alan@alpanian.com
ADVERTISING VFXVoiceAds@gmail.com
SUPERVISOR Nancy Ward

CONTRIBUTING WRITERS Ian Failes, Naomi Goldman, Trevor Hogg, Kevin H. Martin, Chris McGowan

ADVISORY COMMITTEE David Bloom; Andrew Bly; Rob Bredow; Mike Chambers; Neil Corbould, VES; Irena Cronin; Paul Debevec, VES; Debbie Denise; Karen Dufilho; Paul Franklin; David Johnson; Jim Morris, VES; Dennis Muren, ASC, VES; Sam Nicholson, ASC; Lori H. Schwartz; Eric Roth

VISUAL EFFECTS SOCIETY
Eric Roth, Executive Director

VES BOARD OF DIRECTORS
OFFICERS Mike Chambers, Chair; Lisa Cooke, 1st Vice Chair; Emma Clifton Perry, 2nd Vice Chair; Laurie Blavin, Treasurer; Rita Cahill, Secretary
DIRECTORS Neishaw Ali; Brooke Breton; Kathryn Brillhart; Colin Campbell; Bob Coleman; Kim Davidson; Rose Duignan; Camille Eden; Michael Fink, VES; Dennis Hoffman; Jeff Kleiser; Kim Lavery, VES; Tim McGovern; Karen Murphy; Susan O’Neal; Jeffrey A. Okun, VES; Jim Rygiel; Lisa Sepp-Wilson; Katie Stetson; David Tanaka; Richard Winn Taylor II, VES; Cat Thelia; Bill Villarreal; Joe Weidenbach; Susan Zwerman, VES
ALTERNATES Gavin Graham, Bryan Grill, Arnon Manor, Andres Martinez, Dan Schrecker, David Valentin

Visual Effects Society
5805 Sepulveda Blvd., Suite 620
Sherman Oaks, CA 91411
Phone: (818) 981-7861
visualeffectssociety.com

VES STAFF
Nancy Ward, Program & Development Director
Chris McKittrick, Director of Operations
Ben Schneider, Director of Membership Services
Jeff Casper, Manager of Media & Graphics
Colleen Kelly, Office Manager
Debbie McBeth, Global Coordinator
Jennifer Cabrera, Administrative Assistant
P.J. Schumacher, Controller
Naomi Goldman, Public Relations

Follow us on social media

Tom Atkin, Founder
Allen Battino, VES Logo Design

VFX Voice is published quarterly by the Visual Effects Society.
Subscriptions: U.S. $50; Canada/Mexico $60; all other countries $70 a year. See vfxvoice.com
Advertising: Rate card upon request from publisher@vfxvoice.com or VFXVoiceAds@gmail.com
Comments: Write us at comments@vfxvoice.com
Postmaster: Send address change to VES, 5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411.
Copyright © 2020 The Visual Effects Society. Printed in the U.S.A.




VFX TRENDS

THE STATE OF PLAY IN PRACTICAL EFFECTS

You might be able to do anything in CG, but in recent times there has been a major resurgence in the audience and filmmaker demand for physical effects. VFX Voice looks at the state of play in this field of filmmaking with a range of creature and makeup effects practitioners, special effects supervisors and model and miniature makers.

THE RESURGENCE AND REVITALIZATION OF PRACTICAL EFFECTS

By IAN FAILES

TOP: Legacy Effects crafted a robotic arm for the character George Willard (Paul Schneider) in Tales from the Loop. (Photo by Jan Thijs, courtesy of Amazon Prime Video) BOTTOM: The Jakob robot was both a practical on-set puppet built by Legacy Effects and a digital creation made in VFX for Tales from the Loop. (Photo courtesy of Amazon Prime Video)


The diverse range of work covered by practical effects – animatronics, prosthetics, puppetry, mechanical effects, pyrotechnics and more – means that artists need to rely on an equally varied range of approaches to achieve them. Several innovations have led the charge in the past few years.

“3D printed clear materials are fantastic for visors and helmets, and sintered metals have now become cost-efficient,” identifies Legacy Effects Co-owner and Animatronics Effects Supervisor Alan Scott, whose recent experience includes overseeing robotic arms and robots for Tales from the Loop. “Also, new robotic servos have really transformed a lot of what we’re able to do with animatronics. We could have 12 of them working and moving for the robotic arm in Tales and it didn’t affect dialogue once.”

Weta Workshop Co-founder and Creative Director Richard Taylor concurs, stating that “over 60% of all that we make today utilizes robotic manufacturing technology.” The studio has had a hand in several practical robot and costume builds in recent films such as Ghost in the Shell, I Am Mother and The Wandering Earth.

Special Makeup Effects Artist and Supervisor Jason Hamer of Hamer FX, which delivered full-scale and miniature water creature builds for Wendy, also agrees. “I think the use of digital artwork and 3D printing has been the biggest game-changer. The level of detail and precision that can be achieved is astounding. Mechanical parts that used to take weeks to machine can be produced at a fraction of the time and labor cost.”

Silicone, a makeup effects material used for some time, has come a long way as another go-to material, adds Amalgamated Dynamics, Inc. (ADI) Co-founder and Character Effects Artist Alec Gillis, who counts Bright and It among the studio’s latest highlights. “For a while, there weren’t really paints that would stick to silicone, but now the quality has just skyrocketed.”

Gillis’ ADI partner, Tom Woodruff, Jr., offers up a slightly different change in the state of play in creature effects, where studios with extensive experience in the area have been called upon to tackle direct digital design work. “Director Michael Dougherty had us and other studios do just the designs for Godzilla: King of the Monsters, which started as clay sculpts and ended in ZBrush.”

What has changed, too, is the use of digital techniques to drive what gets built practically, as Creature and Special Makeup Effects Supervisor Neal Scanlan’s studio undertook for several recent Star Wars films. “I decided the thing to do was to flip it the other way around and to use digital technology as a huge assistance in the builds. It liberated us in a way that we could never have done during the ‘analog’ period.”

At KNB EFX Group, partners and Special Makeup Effects Artists Greg Nicotero and Howard Berger have been busy with large projects like The Orville, The Walking Dead and Space Jam: A New Legacy, while also approaching the management side of the business in new ways. “I wanted to diversify a little bit about nine years ago,” says Berger, “so I started department heading, where I run the whole makeup and effects department. It’s been a great and new experience.”

A feature of the creature effects experience is being able to have

TOP: For Hitchcock, Howard Berger helped transform Sir Anthony Hopkins into the famous director. (Image courtesy of Howard Berger) BOTTOM: Lead Application Artist Christopher Nelson, left, actor Joel Edgerton and Makeup Effects Artist Joe Badiali work on Edgerton’s orc makeup for Bright. (Image courtesy of Amalgamated Dynamics, Inc.)






TOP: The Rise of Skywalker character Babu Frik was a rod puppet operated on set by greenscreen-clad puppeteers, and also operated by its voice-over artist, Shirley Henderson. (Image copyright © 2019 Lucasfilm Ltd.) BOTTOM: Creature and Special Makeup Effects Supervisor Neal Scanlan, left, with Star Wars: The Rise of Skywalker director J.J. Abrams discussing the mono-wheel droid D-O. (Image copyright © 2019 Lucasfilm Ltd.)


something on set during production, with DDT Efectos Especiales Co-owner and Special Effects Makeup Artist David Martí noting that directors have made a concerted effort to incorporate practical creations further into the actor experience. “On A Monster Calls, J.A. Bayona had the creature there and he would even do a special introduction to it for everyone. It wasn’t just another animatronic thing, it was another actor.”

Like the creature effects specialists, The Grand Budapest Hotel and Isle of Dogs modelmaker Simon Weisse has been utilizing 3D printing in his miniatures work, as well as 2D milling. “We are using 3D printing with liquid resin, and also doing 2D milling. You can very quickly cut shapes in metal, wood or whatever you need.” Weisse notes, however, that he uses 3D printing and 2D milling only as part of a wider range of available tools among other handcrafted techniques.

Fellow modelmaker Mike Tucker from The Model Unit, with credits on projects such as Good Omens, Doctor Who and Gerry Anderson’s Firestorm, identifies improvements in high-speed digital cameras as a major change, too. “We were constantly being pressured to move away from film, but the fact was that until ARRI came up with the Alexa, there was no camera that really did the job the way that my DP and I wanted. The Alexa was a game-changer.”

For special effects supervisors, leaps and bounds have come in areas such as computer control. “We use computers to control hydraulics, pneumatics, computerized winches – right across the board,” says Special Effects Supervisor Chris Corbould, who recently worked on No Time To Die, one of several James Bond films on which he has overseen the special effects. “Computer control gives us consistency. Once you press the button, it’s going to do exactly the same thing every time.”

The use of a rotisserie rig on Terminator: Dark Fate by Special Effects Supervisor Neil Corbould, VES, exemplifies that same precision in engineering now common in practical effects. “A lot of those rigs just rotate on one axis, but I wanted to make it a little bit harder for myself and have it tilt in as well,” he says. “It took eight months to design and build and weighed upwards of 85 tons.”

Special Effects Supervisor Jeremy Hays even had to craft an actual flowing river for The Call of the Wild after he had previously supervised a raging storm on The Equalizer 2. “The river contained approximately 355,000 gallons of water,” he says, “and had a water circulation rate of approximately 75,000 to 80,000 gallons per minute.”

Water was also a requirement for Special Effects Supervisor Joel Whist on Bad Times at the El Royale. Here he was called upon to deliver a large indoor rain sequence. “We filled the entire soundstage with 42 rain heads. We were dropping 2,000 gallons a minute on the floor, and for all that water we had to design the set so it was sloped and [the water] re-circulated.”
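The figures Hays and Whist quote invite a quick sanity check. The back-of-the-envelope Python below uses only the numbers from the two quotes above; taking the midpoint of the 75,000-80,000 gallons-per-minute range is our own assumption.

```python
# Back-of-the-envelope check using only the figures quoted above.
river_gallons = 355_000                # water held in The Call of the Wild river set
pump_gpm = (75_000 + 80_000) / 2       # midpoint of the quoted circulation rate

turnover_minutes = river_gallons / pump_gpm
print(f"River fully recirculated roughly every {turnover_minutes:.1f} minutes")  # ~4.6

rain_heads = 42                        # Bad Times at the El Royale soundstage rig
total_rain_gpm = 2_000
print(f"About {total_rain_gpm / rain_heads:.0f} gallons per minute per rain head")  # ~48
```

In other words, the river set’s entire volume turned over roughly every four and a half minutes – a sense of the scale of pumping involved.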

WHERE DIGITAL AND PRACTICAL MEET

Certainly, digital visual effects have edged their way into being the predominant effects solution today. But that doesn’t mean practical has lost its place in filmmaking. Instead, the common theme noted by the experts VFX Voice talked to was that early

TOP: For The Call of the Wild, Jeremy Hays orchestrated the building of an actual flowing river set for shooting scenes involving panning for gold, swimming and canoeing. (Image courtesy of Jeremy Hays) BOTTOM: Weta Workshop Co-founder and Creative Director Richard Taylor works on Geisha molds for Ghost in the Shell. (Image courtesy of Weta Workshop)






TOP: Special Effects Supervisor Chris Corbould on set during the making of Spectre. (Image courtesy of Chris Corbould) MIDDLE: Mike Tucker’s The Model Unit provided a large-scale model of a missile silo, missile and a surface hatchway for Good Omens. (Image courtesy of Mike Tucker) BOTTOM: For Bad Times at the El Royale, a completely interior stage was rigged with rain machines by Joel Whist’s team to provide continual on-set rain. (Image courtesy of Joel Whist)


discussions between the visual effects and practical effects teams always lean towards simply what works best for the story and the shot. Practical effects practitioners have also been conscious of a feeling out there that CG could accomplish the things that they had been doing for so many years, and cheaper. However, in the past few years it has become clearer that both approaches have different aesthetic and cost benefits.

“I have always seen the strong want and desire to use practical effects in filmmaking, and that the trend seems to swell and fade with the various styles and genres of movies that are being created at the time,” observes Hays. “With any creative, ever-evolving industry, fluctuations in the methods of telling the story are to be expected, and really should be welcomed.”

“Some areas have certainly changed over the years, such as miniatures,” acknowledges Taylor. “This was the largest department at the Workshop for the first 20 years of our company, but due to the versatility of digital solutions for architecture in film today our miniature department has become smaller, relative to our other departments. Thankfully, many directors continue to enjoy working with practical effects on set, and I believe that this will be an ongoing aspect of filmmaking for many years to come.”

“I think we’re probably stronger now than we have been in many years,” declares Neil Corbould, whose recent projects include the latest Mission: Impossible films. “To work on these films is a practical effects person’s dream; they want Tom Cruise to do as much in-camera as possible. There is a lot pushed into practical effects, and my input is greatly appreciated by them and the visual effects supervisor as well.”

“I’m excited anytime I hear somebody say that there is a resurgence in practical effects,” offers Woodruff, Jr. “Well, it’s not really a resurgence, some people say, it’s just a balancing out, but I think it is a resurgence.”

Scanlan is also adamant about the advent of a new wave of practical effects. “I was of the opinion that in so many ways practical effects had run their course. But then when I got that call from J.J. Abrams, it was clear to me that there was a way to make a number of magical ingredients come together for the new Star Wars films.”

In fact, the on-set collaboration between visual effects and practical effects is incredibly close, attests Chris Corbould, who has also recently been concentrating on second unit directing during production. “I always get together with the visual effects people and we discuss what the best way to do it is and what would look best rather than just, ‘Oh, it’s got to be CG.’”

Scott shares that belief, and Legacy Effects has certainly borne witness to that collaboration recently on projects such as The Mandalorian, where both a puppet version and a digital version of baby Yoda were used, and on Tales from the Loop, where the VFX team, says Scott, “embraced what we could provide knowing full well that there were going to be times where we’re not going to be able to do what is required.”

“You know,” shares Berger, “my first question is always, ‘Who’s your VFX supervisor?’ Sometimes they have to call the shots on everything, and I want them to. I might be on a project for six months, but they’re often on for a lot longer and they have to live with the decisions that get made.”

Meanwhile, Whist sees practical work as being the source of authenticity in a scene, an approach he followed for the blood squib hits in season one of Altered Carbon. “There is just something about the weirdness of shooting on set that makes a blood hit look a bit more random, but of course they are often augmented as well.”

“My first job on any production is to point out what miniatures can offer that digital can’t, whether that be the quantity of shots that we can deliver, or just the raw, organic nature of a practically-achieved effects sequence,” outlines Tucker, saying his experience is that sometimes people see miniatures and just assume they’re digital, even when they’re not.

Hamer argues that a practical approach can also bring unexpected results, such as the solution found to illuminate an area of the ‘Mother’ creature in Wendy. “To do this we attached a series of LED panels on a fiberglass core and sealed in caulking. The wires were sealed in silicone tubes and ran up the control rods to the surface where they connect to a laptop running a lighting program.”

For Weisse, and several other modelmakers, his miniature work often has significant crossover with the physical production on a film, as it did on the stop-motion Isle of Dogs. “They built around 200 sets at seventh scale and then we had about 35 sets done at much smaller scale. They both worked in different ways.”

Martí’s experience on A Monster Calls was that an original push to construct things practically segued into a mostly CG solution. However, what had been built served as extensive practical reference for the actors. “It helped create the mood for the scenes, and it was great they could get in there and touch it. We were happy to be helping to make that magic on set, but it is the case that 95% of the shots in the film were CG.”

“Most people just want it to be awesome,” comments Gillis. “They want it to be big and cool. People do get really excited if you say, ‘We used practical,’ but then if you also say, ‘It’s embellished with digital,’ people also get very excited about that. I think the blend is really where everything elevates beyond the limitations of any one technique.”
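Hamer’s description of LED panels driven from “a laptop running a lighting program” suggests a simple control loop. The Python below is a minimal, hypothetical sketch of such a program; the panel count, pulse pattern and print-based transport are all invented for illustration and are not Hamer FX’s actual rig.

```python
import math
import time

NUM_PANELS = 8     # hypothetical number of LED panels on the creature's core
FRAME_RATE = 30    # brightness updates per second

def glow_frame(t: float) -> bytes:
    """One brightness byte (0-255) per panel: a slow pulse, staggered per panel."""
    levels = []
    for panel in range(NUM_PANELS):
        phase = t * 2 * math.pi * 0.25 + panel * 0.6  # 0.25 Hz pulse, offset per panel
        levels.append(round(127.5 + 127.5 * math.sin(phase)))
    return bytes(min(255, max(0, v)) for v in levels)

def send(frame: bytes) -> None:
    # Stand-in transport; a real rig would push frames over serial, DMX or similar.
    print(frame.hex(" "))

start = time.time()
for _ in range(FRAME_RATE * 2):        # run the pattern for two seconds
    send(glow_frame(time.time() - start))
    time.sleep(1 / FRAME_RATE)
```

The appeal of running the pattern from a laptop rather than baking it into hardware is the one Hamer implies: the look of the glow can be re-cued or reshaped on set without opening up the creature.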

TOP LEFT: Neil Corbould’s rotisserie rig designed for the aerial fight sequence in Terminator: Dark Fate. (Image courtesy of Neil Corbould) TOP RIGHT: Modelmaker Simon Weisse in front of the 18th-scale hotel constructed for The Grand Budapest Hotel. (Image copyright © 2014 Fox Searchlight) MIDDLE: Special Makeup Effects Artist and Supervisor Jason Hamer works on a makeup effects appliance for a Die Antwoord music video. (Image courtesy of Jason Hamer) BOTTOM: DDT Efectos Especiales’ Montse Ribé and David Martí work on a character from Crimson Peak. (Image courtesy of David Martí)





FILM

DYNAMIC DUOS: DP AND VFX – CALEB DESCHANEL AND ROB LEGATO

By IAN FAILES

On a traditional effects-driven film, a cinematographer and a visual effects supervisor might tend to collaborate predominantly on set on individual visual effects shots, and rarely into post-production. That pattern was reversed on Jon Favreau’s The Lion King, a photorealistic, computer-generated re-imagining of the classic 2D-animated Disney feature, where Director of Photography Caleb Deschanel, ASC and Visual Effects Supervisor Rob Legato, ASC worked together on every single shot of the film. Such a collaboration was possible owing to the virtual production nature of the shoot, in which virtual cameras and equipment (which matched real-world versions), real-time rendering, motion capture and virtual reality (VR) scouting and reviews became the filmmaking tools.
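To make the idea of virtual gear that “matched real-world versions” concrete, here is a minimal, hypothetical Python sketch of encoder-driven camera handles feeding a virtual camera; the class, scale factors and limits are our own invention, not the actual Magnopus toolset.

```python
from dataclasses import dataclass

@dataclass
class VirtualCamera:
    x: float = 0.0        # meters along the virtual dolly track
    height: float = 1.5   # lens height in meters, set by the boom
    pan: float = 0.0      # degrees

# Hypothetical calibration: how far one encoder tick moves each axis.
DOLLY_M_PER_TICK = 0.002
BOOM_M_PER_TICK = 0.001
PAN_DEG_PER_TICK = 0.05

def apply_operator_input(cam: VirtualCamera, dolly: int, boom: int, pan: int) -> None:
    """Move the virtual camera exactly as the physical handles moved, with
    boom travel clamped so the rig keeps the limits of real equipment."""
    cam.x += dolly * DOLLY_M_PER_TICK
    cam.height = min(4.0, max(0.3, cam.height + boom * BOOM_M_PER_TICK))
    cam.pan = (cam.pan + pan * PAN_DEG_PER_TICK) % 360.0

cam = VirtualCamera()
apply_operator_input(cam, dolly=500, boom=200, pan=40)  # one frame of wheel/dolly input
print(cam)  # approximately VirtualCamera(x=1.0, height=1.7, pan=2.0)
```

The per-tick calibration and clamped travel echo the point Deschanel makes below: the virtual rig behaves like the lenses, dollies and cranes a crew already knows.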

All images copyright © 2019 Disney Enterprises, Inc.


GOING VIRTUAL

Over several film experiences, including on Bad Boys II, The Aviator, Avatar, Hugo and The Jungle Book, Legato (a three-time Oscar winner and five-time nominee) has been a pioneer in the use of virtual production techniques. It was something he looked to take to a new level on The Lion King.

“I wanted to make sure,” says Legato, “that the working environment that we were in was collaborative enough that Caleb could say, ‘I have this crazy idea, I want to do this, let’s go do that,’ as opposed to saying, ‘It can’t be done.’ All of a sudden, in my experience, you will find a shot that’s even better.”

“What made it really easy for me,” comments Deschanel (a six-time Academy Award-nominated cinematographer for films including The Right Stuff and The Natural), “was that Rob and the people at Magnopus [the studio behind the virtual production tools] designed things that were so much like what I’m used to. I didn’t need to reinvent things that I didn’t know. It was lenses, it was dollies, it was cranes. The two of us could then just concentrate on thinking, ‘How do we tell this story?’”

“And the great thing about the world that we were in, this virtual world, was that you could try infinite things,” continues Deschanel. “You didn’t have to wait for the wranglers to take the wildebeest back to the first position, because you just pressed the button and they were there. So this really gave us a lot of freedom that really made it much more exciting than I expected it to be.”

“The big thing in making the movie was the fact that the tools were very familiar and were designed to mimic a hundred years of filmmaking and all the things that we’ve learned over that period of time to make movies. Beyond that it was up to your imagination to see how you could piece those things together in interesting ways to go beyond that and get out of your comfort zone and try different things. And if it failed it didn’t matter because you could press a button and start over again and try something else.”
—Caleb Deschanel, ASC, Director of Photography

The intention of those tools was to replicate the kind of spontaneous filmmaking that occurs on a live-action set and ultimately produce a closely-followed template of action and lighting for editorial and for visual effects studio MPC to complete the final shots. The result was that The Lion King could be shot with all the principles – and restrictions – of live-action filmmaking in mind, but without having to train animals or film in challenging environments.

“It was quite a bit different than I expected it to be,” Deschanel

OPPOSITE TOP: The Lion King Director of Photography Caleb Deschanel, ASC, left, and Visual Effects Supervisor Robert Legato, ASC, right, during a reference photography trip in Africa. TOP: Deschanel on location for the Africa shoot. BOTTOM: The Lion King director Jon Favreau, Deschanel, Production Designer James Chinlund, Animation Supervisor Andy Jones and Legato discuss the film.






says of his The Lion King experience, which involved collaborating with Legato and the filmmaking team largely on a purpose-built capture stage in Playa Vista. “I guess I had really no idea what I was getting myself into. It turns out it was like making a regular movie in the most wonderful way, without the dangers of being attacked by lions or getting heat stroke or any of the other things that are the by-products of shooting out in the real African savanna.”

Legato notes that one of the key areas of collaboration came from being able to scout proxy landscapes and shots in VR. This could be done with multiple people donning VR goggles at the same time. “We could go into VR, take a quick peek at the environment, and then as we were operating the camera or making suggestions, we’d know where we were because we’d seen it.”

“I’ve been doing this for a long time, but I’m still learning from the masters how to light something, how to compose something, how to shoot something, how to operate something and how to create an emotional effect with cinematography. So with Caleb I could be a student, and I’m a collaborator at the same time and appreciate both spectrums. It was a perfect storm for me.”
—Rob Legato, ASC, Visual Effects Supervisor

CINEMATIC SOLUTIONS, VIRTUALLY

TOP: Young Simba, Timon and Pumbaa. Deschanel found new ways to film some of the scenes with these characters virtually. MIDDLE: Scar gets a visit from Mufasa and Zazu. Key lighting decisions were made during the virtual cinematography stage. BOTTOM: MPC’s final elephant graveyard shot.


The new virtual production paradigm gave Legato and Deschanel ample opportunity to make scenes that followed traditional cinematography techniques, or use them in different ways. One example was being able to experiment with virtual Steadicams, or even repurposing pre-animated characters in the scene as ‘cameras’ themselves.

“The big thing in making the movie was the fact that the tools were very familiar and were designed to mimic a hundred years of filmmaking and all the things that we’ve learned over that period of time to make movies,” outlines Deschanel. “Beyond that, it was up to your imagination to see how you could piece those things together in interesting ways to go beyond that and get out of your comfort zone and try different things. And if it failed, it didn’t matter because you could press a button and start over again and try something else.”

Another example was sun position, something that could have always been kept in a realistic spot between different shots. But, notes Deschanel, “we literally moved the sun in every shot. What I discovered was that if you just left the sun in the same place, you’d come around to a different angle and it would never feel right. So we would adjust the sun in every shot to feel like the other shot. It was never to destroy the sense of reality, but it was actually to enhance the sense of reality.”

A further discovery was the importance of realistic focus pulling and dolly moving in the virtual cinematography. Although those kinds of things could be achieved ‘in the computer,’ Deschanel and Legato ultimately sought to have them done on the capture stage with trained focus pullers and dolly grips. It also ensured that seemingly simplistic filmmaking methods prevailed, despite the fact that the camera and characters could literally do anything in CG.

“For example,” describes Legato, “there’s an early introduction of Scar that I particularly liked where the staging is outrageously simple. It’s a mouse on a vine and Scar just walks forward. But what makes it magical – what makes it cinematic – is depth of field. Scar is merging from dark to light and the light hits him exactly right. The focus being pulled at exactly the right time creates this piece of cinema that belies the simplicity of the staging.

“Here, the operator is the audience, the focus puller is the audience, the grip is the audience,” attests Legato. “If the filmmaker is doing it correctly, you automatically look where you’re supposed to, even if it feels invisible. It’s an intangible thing that you see in good movies with good filmmakers.”

This also played into one of the duo’s goals for The Lion King – to try and avoid the perfection that could easily be achieved in a CG film. “In fact,” says Legato, “whenever you’re shooting something, you try for perfection and you can never get it. The sun might not behave or the actor might miss their mark or the dolly grip might be a little late. But that’s human and that’s life. In the computer, you get everything perfect, but you don’t want it.”

So, the idea of the virtual camera tools on the stage was, in part, to provide for the ‘happy accidents’ that happen on a real set. Even though the filmmakers could do multiple takes to try and get the best shot, they invariably would return to the take that felt the most natural. “All those elements added to that feeling that the film was handmade,” suggests Deschanel, “that the focus was there and the decisions were made based on what you’re seeing on the screen and not just based on some computer deciding that, okay, now it’s time to switch to the next character.”

LEARNING FROM EACH OTHER

Both Deschanel and Legato say they were able to push each other further as filmmakers during the making of The Lion King, both in a technical and aesthetic sense.

“I’ve been doing this for a long time, but I’m still learning from the masters how to light something, how to compose something, how to shoot something, how to operate something and how to create an emotional effect with cinematography,” says Legato. “So with Caleb I could be a student, and I’m a collaborator at the same time and appreciate both spectrums. It was a perfect storm for me.”

Deschanel returns the sentiment. “Part of what makes Rob amazing, aside from his visual sense and wonderful sense of storytelling, is also his understanding of all the tools and all the things that we use to make movies. It just became this wonderful, embryonic fluid of creation. It was really amazing. It was really a lot of fun.”

TOP: At right, Deschanel and Legato review using Magnopus-built tools to capture a scene. MIDDLE: The Lion King director Jon Favreau, right, with Director of Photography Deschanel. BOTTOM: Deschanel grips a hand-held virtual camera on the shooting stage.



FILM

“I’ve been doing this for a long time, but I’m still learning from the masters how to light something, how to compose something, how to shoot something, how to operate something and how to create an emotional effect with cinematography. So with Caleb I could be a student, and I’m a collaborator at the same time and appreciate both spectrums. It was a perfect storm for me.” —Rob Legato, ASC, Visual Effects Supervisor says of his The Lion King experience, which involved collaborating with Legato and the filmmaking team largely on a purpose-built capture stage in Playa Vista. “I guess I had really no idea what I was getting myself into. It turns out it was like making a regular movie in the most wonderful way, without the dangers of being attacked by lions or getting heat stroke or any of the other things that are the by-products of shooting out in the real African savanna.” Legato notes that one of the key areas of collaboration came from being able to scout proxy landscapes and shots in VR. This could be done with multiple people donning VR goggles at the same time. “We could go into VR, take a quick peek at the environment, and then as we were operating the camera or making suggestions, we’d know where we were because we’d seen it.” CINEMATIC SOLUTIONS, VIRTUALLY

TOP: Young Simba, Timon and Pumbaa. Deschanel found new ways to film some of the scenes with these characters virtually. MIDDLE: Scar gets a visit from Mufasa and Zazu. Key lighting decisions were made during the virtual cinematography stage. BOTTOM: MPC’s final elephant graveyard shot.

16 • VFXVOICE.COM FALL 2020

The new virtual production paradigm gave Legato and Deschanel ample opportunity to make scenes that followed traditional cinematography techniques, or use them in different ways. One example was being able to experiment with virtual Steadicams, or even repurposing pre-animated characters in the scene as ‘cameras’ themselves. “The big thing in making the movie was the fact that the tools were very familiar and were designed to mimic a hundred years of filmmaking and all the things that we’ve learned over that period of time to make movies,” outlines Deschanel. “Beyond that, it was up to your imagination to see how you could piece those things together in interesting ways to go beyond that and get out of your comfort zone and try different things. And if it failed, it didn’t matter because you could press a button and start over again and try something else.” Another example was sun position, something that could have always been kept in a realistic spot between different shots. But, notes Deschanel, “we literally moved the sun in every shot. What I discovered was that if you just left the sun in the same place, you’d come around to a different angle and it would never feel right. So we would adjust the sun in every shot to feel like the other shot. It was never to destroy the sense of reality, but it was actually to enhance the sense of reality.” A further discovery was the importance of realistic focus pulling

and dolly moving in the virtual cinematography. Although those kinds of things could be achieved ‘in the computer,’ Deschanel and Legato ultimately sought to have them done on the capture stage with trained focus pullers and dolly grips. It also ensured that seemingly simplistic filmmaking methods prevailed, despite the fact that the camera and characters could literally do anything in CG. “For example,” describes Legato, “there’s an early introduction of Scar that I particularly liked where the staging is outrageously simple. It’s a mouse on a vine and Scar just walks forward. But what makes it magical – what makes it cinematic – is depth of field. Scar is merging from dark to light and the light hits him exactly right. The focus being pulled at exactly the right time creates this piece of cinema that belies the simplicity of the staging. “Here, the operator is the audience, the focus puller is the audience, the grip is the audience,” attests Legato. “If the filmmaker is doing it correctly, you automatically look where you’re supposed to, even if it feels invisible. It’s an intangible thing that you see in good movies with good filmmakers.” This also played into one of the duo’s goals for The Lion King – to try and avoid the perfection that could easily be achieved in a CG film. “In fact,” says Legato, “whenever you’re shooting something, you try for perfection and you can never get it. The sun might not behave or the actor might miss their mark or the dolly grip might be a little late. But that’s human and that’s life. In the computer, you get everything perfect, but you don’t want it.” So, the idea of the virtual camera tools on the stage was, in part, to provide for the ‘happy accidents’ that happen on a real set. Even though the filmmakers could do multiple takes to try and get the best shot, they invariably would return to the take that felt the most natural. “All those elements added to that feeling that the film was handmade,” suggests Deschanel, “that the focus was there and the decisions were made based on what you’re seeing on the screen and not just based on some computer deciding that, okay, now it’s time to switch to the next character.” LEARNING FROM EACH OTHER

Both Deschanel and Legato say they were able to push each other further as filmmakers during the making of The Lion King, both in a technical and aesthetic sense. “I’ve been doing this for a long time, but I’m still learning from the masters how to light something, how to compose something, how to shoot something, how to operate something and how to create an emotional effect with cinematography,” says Legato. “So with Caleb I could be a student, and I’m a collaborator at the same time and appreciate both spectrums. It was a perfect storm for me.” Deschanel returns the sentiment. “Part of what makes Rob amazing, aside from his visual sense and wonderful sense of storytelling, is also his understanding of all the tools and all the things that we use to make movies. It just became this wonderful, embryonic fluid of creation. It was really amazing. It was really a lot of fun.”

TOP: At right, Deschanel and Legato review a scene captured with Magnopus-built tools. MIDDLE: The Lion King director Jon Favreau, right, with Director of Photography Deschanel. BOTTOM: Deschanel grips a hand-held virtual camera on the shooting stage.



FILM

DYNAMIC DUOS: SFX AND VFX – DOMINIC TUOHY AND ROGER GUYETT By IAN FAILES

When Star Wars: The Rise of Skywalker Visual Effects Supervisor Roger Guyett first sat down with Special Effects Supervisor Dominic Tuohy and director J.J. Abrams to discuss the centerpiece Rey and Kylo Ren lightsaber duel in the film – which happens on a section of the fallen Death Star in the middle of a raging ocean – they all had one common goal in mind. This was, relates Tuohy, “trying to find something that lets the actors act as if they were in that environment,” with an added goal, says Guyett, of “convincing the audience that there is an integrity to whatever they’re seeing on the screen.”

The idea for the duel, then, was to supply as much ‘live’ effects interaction as possible during filming, covering the actors in sprays of water to replicate the waves that would be hitting the section they were on. That practical water was down to Tuohy, who won a visual effects Academy Award this year for 1917 and has been Oscar-nominated for two other films. Visual effects artists from Industrial Light & Magic (overseen by Guyett, himself a six-time Oscar nominee) then used that in-camera water as a starting point for their digital simulations. It was just one of the many times during the making of Rise of Skywalker that the duo had to collaborate closely, relying on multiple effects methods for various final shots in the film – others included the dramatic Pasaana desert speeder chase and the moment the heroes sink into black desert sand.

All The Rise of Skywalker images copyright © 2019 Lucasfilm.


AN OCEAN… IN A PADDOCK

The Death Star lightsaber duel is a demonstration of the power of Rey and Kylo Ren, as well as further insight into their deep connection. Shooting took place on Pinewood Studios’ outdoor paddock lot, surrounded by bluescreen in order to acquire the right quality of light. For it, Tuohy’s team designed and built a system of water cannons to generate wave and spray effects, having earlier sold Abrams on the concept by showing the director some similar but smaller previous work and some early tests. “The cannons would push the water out and vertically up in the air and then it would fall down in a straight line,” Tuohy explains. “We were pumping water nearly 40 or 50 feet up into the air. We had 14 water cannons on one side and eight on the other. They were timed with the amazing stunt choreography of the fight to match its ferocity.”

Originally the filming had been planned for the U.K. summer, but it had to be moved to late fall, which meant extremely cold outdoor temperatures – and not covering the actors in as much water as originally planned. Occasionally, however, the wind would still get hold of the water spray and drift it onto the performers. “Daisy was there in an outfit with bare arms, and I must admit there were a few times when I heard her shout ‘Dominic!’” shares Tuohy. “The thing is, Daisy would physically flinch when that water hit her – you can see her tense up in the footage when the water is covering her. That’s what makes it feel real, because that’s exactly what you would do.

“Plus, every time a water cannon went off, there was a big roar of air,” continues Tuohy. “That almost always gives the actors something to react to, to remind them of where they are. We had big wind machines running at the same time. So the ambiance of noise that surrounded them is something that really made them feel that they were on the top of this Death Star piece.”
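Timing 22 cannons to the beats of a fight is, at heart, a cue sheet played against a clock. Here is a purely illustrative sketch – the cue times, cannon IDs and fire() stand-in are invented for this example, not Tuohy’s actual rig:

import time
from dataclasses import dataclass

@dataclass
class CannonCue:
    t_sec: float           # seconds from the start of the take
    cannon_ids: list[int]  # which cannons fire on this beat

# Hypothetical cue sheet roughly matched to the fight choreography:
# cannons 0-13 on one side of the set, 14-21 on the other.
CUE_SHEET = [
    CannonCue(2.0, [0, 1, 2]),
    CannonCue(4.5, [14, 15]),
    CannonCue(6.0, [3, 4, 5, 16]),
]

def fire(cannon_id: int) -> None:
    # Stand-in for the real trigger (a relay, valve or show-control call).
    print(f"FIRE cannon {cannon_id}")

def run_take(cues: list[CannonCue]) -> None:
    """Play the cue sheet in real time from the start of the take."""
    start = time.monotonic()
    for cue in sorted(cues, key=lambda c: c.t_sec):
        delay = cue.t_sec - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)
        for cid in cue.cannon_ids:
            fire(cid)

run_take(CUE_SHEET)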


OPPOSITE LEFT: Dominic Tuohy, Special Effects Supervisor, The Rise of Skywalker. (Image courtesy of Dominic Tuohy) OPPOSITE RIGHT: Roger Guyett, Visual Effects Supervisor, The Rise of Skywalker. TOP: A view of the practical set at Pinewood Studios, where wave machines provided physical water splashes.




TAKING IT FURTHER

TOP TO BOTTOM: The scene in progress. ILM simulated waves and pieces of geometry to build out the environment. The live-action actors composited into a wider digital ocean setting. Kylo Ren and Rey duel on a section of the downed Death Star. The scene, shot in the U.K. in late fall, was filmed with cold water, adding an extra layer of authenticity to the actors’ reactions when they were hit with splashes.


“If we hadn’t shot it the way that we had shot it,” suggests Guyett, “I just don’t believe it would have looked anywhere near as convincing. Because they wouldn’t actually be there. Daisy wouldn’t be cold. They wouldn’t be covered in spray. They wouldn’t be wet. A spray or splash is changing the lighting constantly. So all of that interactivity is adding in layers to help convince the audience.”

For ILM, of course, the approach to shooting the sequence meant that VFX artists had an authentic base to start with, even if the studio did end up replacing much of the spray and generating the undulating ocean digitally. “A tremendous amount of the work that Dominic did was very atmospheric,” observes Guyett, “in that it wasn’t actually exactly what a wave would do when it hit a surface like that Death Star section. But what it did do was create atmosphere, which was amazing for the actors – these explosive waves were going off all around them. They didn’t have to imagine that because it was really happening and they really were getting wet.”

Guyett notes that ILM re-wrote its water simulation pipeline for the challenging Rise of Skywalker sequence. The new pipeline allowed artists to lay out and control large blocks of waves to iterate on, and to treat the water as more of a character in the sequence. The studio also had to deal with the extraction of the actors and partial set from that bluescreen Pinewood set – always an intricate task – but it’s one, again, that Guyett says benefited from the practical water interaction. “You’re replacing the dynamics of a real water splash with the dynamics of a digital water splash, but they’re occupying the same kind of area in the screen. The level of realism that all of that adds is well worth the headache in terms of the extraction. I’m just always thinking, ‘What is the best version of this that’s going to end up on the screen?’”

Guyett marvels, in particular, at Tuohy’s efforts in conjuring the wave and spray arch that Kylo Ren emerges from at one point in the duel. “That was almost working in-camera,” he states. Adds Tuohy, “We did that several times to get it right with Adam [Driver], who just got drenched time after time, and he never complained once.”
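ILM’s rewritten pipeline is proprietary, but the idea of art-directable ‘blocks’ of waves can be pictured with a classic Gerstner-style wave sum, where each block is one directed wave train confined to a region of the ocean surface. A toy sketch with invented parameters – this is not ILM’s code:

import math
from dataclasses import dataclass

@dataclass
class WaveBlock:
    """One art-directable block: a directed wave train over a region."""
    direction: tuple[float, float]  # unit vector in the ocean plane
    wavelength: float               # meters
    amplitude: float                # meters
    center: tuple[float, float]     # where the block is strongest
    radius: float                   # falloff radius of the block

def gerstner_offset(b: WaveBlock, x: float, z: float, t: float):
    """Displacement contributed by one block at surface point (x, z)."""
    k = 2.0 * math.pi / b.wavelength   # wave number
    c = math.sqrt(9.81 / k)            # deep-water phase speed
    dx, dz = b.direction
    phase = k * (dx * x + dz * z) - k * c * t
    dist = math.hypot(x - b.center[0], z - b.center[1])
    w = max(0.0, 1.0 - dist / b.radius)  # confine the block to its region
    a = b.amplitude * w
    return (dx * a * math.cos(phase),  # horizontal push (Gerstner 'lean')
            a * math.sin(phase),       # vertical lift
            dz * a * math.cos(phase))

def ocean_point(blocks, x, z, t):
    """Superpose every block's contribution at one surface point."""
    ox = oy = oz = 0.0
    for b in blocks:
        gx, gy, gz = gerstner_offset(b, x, z, t)
        ox, oy, oz = ox + gx, oy + gy, oz + gz
    return (x + ox, oy, z + oz)

blocks = [WaveBlock((1.0, 0.0), wavelength=40.0, amplitude=2.5,
                    center=(0.0, 0.0), radius=60.0)]
print(ocean_point(blocks, 10.0, 0.0, t=1.5))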

SELLING THE REALISM

Earlier scenes of Rey heading to the downed Death Star on a sea skiff were also planned and shot, although for editorial reasons they do not appear in the film for as long as originally intended. Again, the moment was imagined as a combination of practical and digital effects, telling the story of Rey having to struggle across the rough water. “We thought there should be a skiff on a motion base that allowed us to closely read the movements, to see Rey fight and hold onto this ship,” details Tuohy. “We built a computer-controlled hydraulic platform in that same outdoor paddock lot environment, with smaller water cannons, and the entire skiff. Even one of the outriggers was made to swing completely out over the top while we made it rock and roll.”

ILM was intricately involved in the operation of the skiff gimbal, too, with Animation Supervisor Paul Kavanagh pre-animating specific behaviors on rough water geometry, which would then be matched by Tuohy’s operation on set. Overall, the physical structure and the water interaction were aimed at helping to sell [Daisy] Ridley’s performance. “What you’re trying to do is convince the actor that whatever is happening is actually happening,” attests Guyett. “It’s not just a greenscreen and her sitting on a plank of wood with us saying, ‘Well, you’re on a raging ocean. Just imagine it’s freezing cold and you’re wet and you’re trying to steer this thing to the Death Star.’ We weren’t doing that.

“Instead, Daisy’s actually on a real skiff. The special effects guys could talk to her about pulling the levers or the handles and operating this thing. So now she’s not having to work so hard as an actor because she’s actually on the skiff, and now you’re hitting her with water. So her performance elevates because she’s really on that machine and it’s really moving. She doesn’t have to pretend she’s cold because she’s actually freezing! All of those things I think give this a better sense of realism.”
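Matching a pre-animated performance with a physical motion base usually comes down to sampling the animation into setpoints the rig can follow. A minimal sketch of that idea – the Pose fields, key times and 50 Hz rate are assumptions for illustration, not the production’s actual setup:

from dataclasses import dataclass

@dataclass
class Pose:
    roll_deg: float
    pitch_deg: float
    heave_m: float

def sample_curve(keys: list[tuple[float, Pose]], t: float) -> Pose:
    """Linearly interpolate the pre-animated motion at time t (seconds)."""
    keys = sorted(keys, key=lambda k: k[0])
    if t <= keys[0][0]:
        return keys[0][1]
    for (t0, p0), (t1, p1) in zip(keys, keys[1:]):
        if t0 <= t <= t1:
            u = (t - t0) / (t1 - t0)
            lerp = lambda a, b: a + u * (b - a)
            return Pose(lerp(p0.roll_deg, p1.roll_deg),
                        lerp(p0.pitch_deg, p1.pitch_deg),
                        lerp(p0.heave_m, p1.heave_m))
    return keys[-1][1]

def bake_setpoints(keys, duration_s: float, rate_hz: float = 50.0) -> list[Pose]:
    """Bake the animation into fixed-rate setpoints a motion base can follow."""
    return [sample_curve(keys, i / rate_hz) for i in range(int(duration_s * rate_hz))]

keys = [(0.0, Pose(0.0, 0.0, 0.0)), (2.0, Pose(8.0, -3.0, 0.4)), (4.0, Pose(-5.0, 2.0, -0.2))]
setpoints = bake_setpoints(keys, duration_s=4.0)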

THE EFFECTS RELATIONSHIP

Both Guyett and Tuohy are at pains to point out that their work for Rise of Skywalker was carried out in collaboration with scores of other practitioners (the duo ultimately received a visual effects Oscar nomination for the film, alongside Special Creature Effects Supervisor Neal Scanlan and ILM Visual Effects Supervisor Patrick Tubach). Importantly, they say, no single effects approach was ever pushed in favor of another during production. “The trick is getting that balance right,” points out Tuohy. “And when you get that combination of special and visual effects right, you believe it.” “We’re all pragmatic about what we think we can achieve,” adds Guyett. “It’s about the collaboration and what’s on-screen. It’s not about me just trying to use a piece of technology for its own sake. You’ve always got that shot in mind. And I think that’s what we did for every aspect of the process.”

TOP TO BOTTOM: ILM’s simulation artists had to deal with many different types of water in the scene. The behavior of the waves often echoes the intensity of the duel. The skiff navigates the treacherous waves as Rey pilots it toward the Death Star. A wide shot of the skiff entering the Death Star area.





PROFILE

J.D. SCHWALM: FOLLOWING HIS FATHER’S PLAYBOOK TO THE MOON AND BEYOND By TREVOR HOGG

Images courtesy of J.D. Schwalm.


Following in the career footsteps of his father, Jim Schwalm, who received a VES Award nomination for Spider-Man 2, J.D. Schwalm stood on the stage of the Dolby Theatre to receive the Oscar for Best Visual Effects for First Man. Along with being a respected artist, he is also an astute businessman who founded Innovation Workshop in Los Angeles and subsequently branched out to Atlanta, where he currently lives with his family. “At the age of 19, I was working at a high-performance automotive shop filled with race cars,” says Schwalm. “My dad called and asked if I would like to go work on a movie in Hawaii over the summer. I said, ‘yes’ and never looked back.”

Visual effects have increased, rather than diminished, the need for practical elements, notes Schwalm. “The first Spider-Man film is when I first realized that visual effects were actually going to make our job more important. When doing movies like Hobbs & Shaw, it’s important to embrace the visual effects element because the audience is expecting massive-scale explosions and gags. We try to do everything practically in-camera and then hand it over to the CG guys for enhancement. When you shoot a car into a building or flip a car at high speeds practically, it gives the CG guys a template to follow and allows them to make it bigger, better and, in the end, feel more realistic.”

The early pre-production meetings for First Man, helmed by Damien Chazelle, were intimidating. “We had a tight budget and Damien wanted to do everything practically. The team that the producers put together, with [Production Designer] Nathan Crowley and [Visual Effects Supervisor] Paul Lambert, was critical in making that movie work. Nathan could think completely outside of the box, as could Paul. Nathan crafted sets that had a hugeness to them, but he built only what the camera needed to see. There was little to no waste. This, paired with extensive planning by Paul and Damien, allowed the film to have a blockbuster feel on a modest budget. First Man proved that if everyone studies the shots, an epic special effects movie can be made affordably.”

High-tech equipment has been embraced by Schwalm. “The advancement in technology has given my company the ability to have a full-fledged manufacturing facility right down to fully-automated computerized milling machines, robotic arms, 3D printers and laser cutters. Essentially, when we create a gag, we’re building giant prototypes. It’s a one-off thing that has to be built fast. Technology has enabled us to build with rapid turnaround. For The Right Stuff television series, we had to build a gimbal in two weeks. In the old days, a gimbal like that would have taken eight to 10 weeks to construct. My engineers drew the entire thing from top to bottom in a computer and it took them five or six days. We were sending drawings to an outside engineering company daily to keep them in the loop. This allowed us to have approval on the build at the time design was finished. From there, individual pieces were broken apart in the software and loaded directly into the CNC machines. After pieces were cut, the welders assembled the gimbal as if they were building LEGOs. There is no grinding and trimming here and there. Each piece that goes into these builds is perfectly machine-cut and has a laser-precision fit. This means completing the job quicker, which in turn allows the stunt team and our crew to get on the rig earlier, leaving them more time to work out kinks and to make any necessary changes.”

Special effects can be an extremely dangerous profession and requires proper training. “Every day you run the risk of injuring yourself or the people around you,” observes Schwalm. “It’s an array of talented employees. We have engineers who design in CAD, licensed pyrotechnicians from the mining and fireworks industries, CNC machine operators from the automotive industry, hydraulic technicians from large-scale plant operations. You can’t exactly go to school for special effects. They tried it for a bit in California and it fizzled. I try to look at my job as a contractor putting the properly qualified people in the right places. Typically, it’s difficult to find people who do a little bit of everything. You end up with larger crews where each member has specialized skills and we try to optimize their skillsets.”
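The build discipline Schwalm describes – every part designed in CAD, signed off by an outside engineering firm, and only then cut – amounts to a hard gate in the shop’s pipeline. A toy sketch of that gate follows; the stages and part names are invented for illustration, not any real shop system:

from dataclasses import dataclass
from enum import Enum, auto

class Stage(Enum):
    DESIGNED = auto()   # drawn in CAD
    REVIEWED = auto()   # signed off by the outside engineering firm
    CUT = auto()        # machined on the CNC
    ASSEMBLED = auto()

@dataclass
class Part:
    name: str
    stage: Stage = Stage.DESIGNED

def send_to_cnc(part: Part) -> None:
    """Gate the build: nothing gets cut without an engineering sign-off."""
    if part.stage is not Stage.REVIEWED:
        raise RuntimeError(f"{part.name}: no sign-off yet, not cutting")
    part.stage = Stage.CUT

gimbal_parts = [Part("ring_outer"), Part("ring_inner"), Part("cradle")]
gimbal_parts[0].stage = Stage.REVIEWED  # drawings came back approved
send_to_cnc(gimbal_parts[0])            # proceeds
# send_to_cnc(gimbal_parts[1])          # would raise: not yet reviewed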

OPPOSITE TOP: J.D. Schwalm. TOP: Visual Effects Supervisor Paul Lambert and J.D. Schwalm after their Oscar win for First Man. (Photo: JB Lacroix)




TOP LEFT: At Innovation Workshop. TOP RIGHT: J.D. Schwalm, Jim Schwalm and Richie Schwalm, J.D.’s younger brother, a special effects coordinator and technician. MIDDLE: In front of an explosion for The Fate of the Furious. BOTTOM: Jim Schwalm, J.D. Schwalm and Richie Schwalm on location for The Fate of the Furious.


Tight production schedules can pose safety concerns. “A strict rule around my crew is, ‘Do not ever let production rush you,’” states Schwalm. “It is always safer to not do the gag than to risk getting hurt. It’s extremely important to never get too comfortable with what you’ve built and the knowledge that you’ve amassed over the years – that’s how you get bit. To avoid that, we try to ensure that everything that rolls out of our shop door has a CAD file that has been reviewed by an outside engineer. Only then do we go ahead and build. That’s an important message that should be known around the industry. When you develop a relationship with a good engineering firm you can work around each other’s schedules. We’ve spent six years working with McLaren Engineering out of upstate New York. We send them three to four drawings a day for them to review, from little to big things.”

“The fun part of the work is figuring out a way to build the desired gag in the time they want it, with the tools and resources that you have,” remarks Schwalm. “R&D is part of the workflow. There is never idle time at the shop. If we’re not working on a gag then we’re developing something. We’re doing a lot of stuff right now with digital control systems. We have made the switch from analog to 100% digital communication on all of our motion control equipment. In the past, if you had six hydraulic cylinders to move, you had to run multiple wires to each cylinder. We’re developing new systems now that run on a fiber optic network. Currently, we have one fiber cable and breakout box that runs on an Ethernet network. The process of building breakaways has not changed much; it has just been perfected throughout the years. Now, they work better and go faster. A lot of the stuff we do involves old-school systems with a new-wave technique.”
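The shift from per-cylinder analog wiring to one shared digital network can be pictured as addressing each cylinder’s controller over a single backbone. A heavily simplified sketch – the addresses, ports and JSON message format are invented here; real motion control rigs use dedicated industrial protocols:

import json
import socket

# Hypothetical addressing: every cylinder controller hangs off one shared
# Ethernet/fiber backbone instead of its own analog wire run.
CONTROLLERS = {cyl_id: ("10.0.7.10", 9000 + cyl_id) for cyl_id in range(6)}

def send_setpoint(sock: socket.socket, cyl_id: int, position_mm: float) -> None:
    """Send one cylinder a position setpoint as a small datagram."""
    msg = json.dumps({"id": cyl_id, "pos_mm": position_mm}).encode()
    sock.sendto(msg, CONTROLLERS[cyl_id])

def move_all(positions_mm: list[float]) -> None:
    """Command all six cylinders over the single shared network."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        for cyl_id, pos in enumerate(positions_mm):
            send_setpoint(sock, cyl_id, pos)

move_all([120.0, 118.5, 121.0, 90.0, 92.5, 91.0])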

Wires are also harder to hide on high-resolution productions. “In the old days we could use piano wire because it was essentially invisible to the camera,” notes Schwalm. “Those days are gone. If you’re hanging stuff you might as well use larger, safer lines because the CG guys are going to have to paint it out no matter what. As technology goes, it makes the job more fun. The advancement in our industry has only made it easier for us to connect and tie in with other departments. For example, on First Man, the camera, the gimbal and the video playing on the LED wall were all running through a common network. They were triggering each other and telling each other where they were. The further we advance, the more opportunity we are going to have to work together and sync all of these technologies up.”

Getting an opportunity to work with James Cameron on Avatar 2 was a thrill for Schwalm, as the maverick filmmaker helped to inspire him to pursue a career in special effects. “If I had to name a single movie that drew me into the industry, I would say Terminator 2: Judgment Day. I saw what quality visual and mechanical effects can do together. And still to this day it holds up. When I started on Avatar 2 I expected Jim to be unreachable because of who he was and what he has accomplished, but the opposite was true. There was a lot more collaboration than I expected. Jim likes intelligent ideas. He has surrounded himself with people who contribute and innovate. Avatar 2 was extremely fun and challenging.”

Schwalm is also a fan of Dwayne Johnson. “I feel extremely fortunate to be able to collaborate with Dwayne and his production company on so many projects, from Hobbs & Shaw to Jumanji and Red Notice. All of his movies are action-packed and all have super unique gags. Dwayne’s work ethic and compassion are his characteristics that I look up to most.”

A tremendous respect exists between father and son. “As I got older, I realized how good my dad was at dealing with people,” observes Schwalm. “That, coupled with the fact that he is a real-life MacGyver, made him a master special effects man. My dad came up in an industry where you had to do everything with your hands. I learned all of my pyro skills from him, I learned how to build from him, I learned how to tinker from him. From a young age, I was given such a good playbook to go by.” When it comes to his own success, Schwalm notes, “I dream stuff up, budget, organize and try to keep everyone safe. I’m not one of the people who actually builds the gags. I have really good people around me who have allowed me to succeed.”

TOP LEFT: Behind the barrel of a minigun. TOP RIGHT: With Second Unit Stunt Coordinator Jack Gill and Stunt Coordinator Spiro Razatos, discussing an action scene for The Fate of the Furious. MIDDLE: Driving a mini truck with Dyna-Fog fogger. BOTTOM: With an auger on the ice.





COVER

IN-CAMERA MAGIC HELPS TENET SLIP THE BONDS OF TIME By KEVIN H. MARTIN

All images courtesy of Warner Bros. Pictures. TOP: Enigmatically identified only as ‘The Protagonist,’ John David Washington’s character is one of the agents tasked with averting a global disaster in Tenet. OPPOSITE TOP: John David Washington is aided by Robert Pattinson.


“Even after reading the script four times, I was still working out the complexities of it,” Tenet Visual Effects Supervisor Andrew Jackson admits. “Tenet was a case of just when you think you’ve gotten things clear in your mind, you catch yourself and realize, ‘Oh no!’ And so you’ve got to think a bit harder. It’ll keep audiences wondering as they work things out.”

Tenet is writer/producer/director Christopher Nolan’s latest thrill ride, shot almost entirely using 65mm and IMAX film cameras. The premise – featuring spies combating a mysterious global threat, with the use of time inversions – has characters facing confounding visual contradictions that leave them wondering if they are coming or going. Starring John David Washington [his character enigmatically identified only as ‘The Protagonist’], Robert Pattinson and Kenneth Branagh, Nolan’s return to technothriller territory a la Inception boasts James Bond-sized full-scale set pieces while not stinting on effects magic – though with the focus primarily on in-camera work. Editor Jennifer Lame estimates there are only some 300 VFX shots in the whole picture, while director Nolan says the level of VFX – created at DNEG, which has worked on Nolan’s films since Batman Begins in 2005 – is less than what would be found in most romantic comedies.

Visual Effects Producer Mike Chambers began working with Nolan on Inception. “With Chris being the writer/director/producer he is, you can be sure after doing several shows with him that he knows not only what he wants but also how he would prefer to have things done,” Chambers says. “He’s very tech-savvy with all aspects of production, and sees VFX as just one tool in the toolbox. He has always been happy with DNEG and likes the idea of avoiding multiple vendors unless something unusual comes up. Organizing early on for a Nolan project starts with knowing the ideal is to get as much in-camera as possible, but then to plan alternate routes that can get us to where we need if the in-camera approach doesn’t get us all the way.”

For Jackson, an Oscar nominee for George Miller’s Mad Max: Fury Road and a veteran of Nolan’s Dunkirk, Tenet offered a chance to participate on a level deeper than the norm. “One of the best parts of working on a Nolan movie,” he reveals, “involves getting into essential aspects of storytelling and filmmaking problem-solving, going well beyond traditional visual effects concerns. We go into Chris’ films with the view of filming as much as possible in-camera. Sometimes those attempts are used as elements for a visual effects shot or even just as reference, and others survive into the final even without enhancement. I come from a practical effects background, having run that kind of business. It is part of my roots, so to speak, and I really love working with the physical effects unit.”

Jackson’s SFX counterpart Scott R. Fisher’s ongoing association with Nolan dates back a full decade to Inception. “Collaborating with Scott made for a very successful association,” Jackson reports. “We could talk things out while speaking the same language, then determine which processes might prove viable.”







TOP: John David Washington and Elizabeth Debicki. MIDDLE: Tenet marks filmmaker Christopher Nolan’s return to sci-fi technothriller territory a la his Inception. As is his preference, in-camera effects were employed whenever possible. BOTTOM: Special Effects Supervisor Scott Fisher inspects his department’s bullet hits. Achieving such effects practically on set is less and less common on other productions, where the speed of shooting prohibits time to reset and leaves such effects for post. (Photo: Melinda Sue Gordon)


“Then it might be a matter of getting the art department and stunts, or some other departments, in the room for additional input,” Jackson continues. “The synthesis of ideas and experience when you’ve got three or four departments all teaming up is very enjoyable for me to be a part of.”

Fisher’s long-held dictum for what he calls ‘special effects 101’ meshed with that of Nolan and Jackson. “You need to think like a magician,” he states. “If you use the same trick over and over, the audience will get wise and figure you out. So things need to keep changing up. Chris obviously feels the same way, so you try to achieve a similar look with different methodologies, which can span from practical effects to CGI to model work.

“When you break down a script for a Chris Nolan movie, it’s a different process than when you work on any other film,” he continues. “There are aspects where, for anybody else, the default solution today would be to go CG, but that’s not necessarily the case with Chris – almost the opposite is true, which I find refreshing and exciting and really gives me and everyone in my crew a sense of challenge. In this day and age, with so much effects work being done and VFX as the go-to mindset, I couldn’t be in a better spot with a more willing boss than Chris when it comes to practicing my craft properly, getting the time to do all these tests and get things right for the camera.”

Fisher learned his craft alongside his father, Thomas L. Fisher, whom he assisted on Terminator 2: Judgment Day. “I think part of what puts me in a good spot today relative to some other SFX guys is my experience during that era when the work was very well-integrated, before post could be considered viable for delivering enough of the appropriate look,” he observes. “What’s nice now is that the factor of newer technologies can enhance practical effects approaches, letting us raise the bar when doing many of the things we used to do back in the ’90s, but in bigger ways, with much greater control, including repeatability.

“On a film like Bumblebee,” Fisher states, “I know my role is going to be more limited, though you still try to offer options to enhance what is achieved on set, or can provide some interactivity that will help the CG integrate. When I did X-Men: First Class, there was a scene with two bullet hits on the beach. Director Matthew Vaughn thought it would take forever to get them done practically and indicated he would leave it for CG. I told him, ‘If you think I can’t rig two bullet hits right away, you should fire me. If you don’t like the results and don’t want take two, I’ll understand, but let me at least try.’ Maybe he had a film where they had some bad luck, or there was a lazy effects guy who didn’t offer up a practical solution, but offering options that save money and time and look better is a big part of the job for me.”

Involving Jackson at an early stage allowed him to weigh in and suggest ideas that helped with realizing Nolan’s preference to do shots in-camera if possible or as a simple composite, only resorting to CG when absolutely necessary. Jackson breaks down the general thrust of pre-production planning as follows: “The hierarchy of work preference is pretty straightforward. Number 1: Get it shot in-camera. Number 2: If you can composite two practically-shot elements with minimal intervention, that’s the next best ideal. Then as you go down the line, there’s always some kind of full-on CG VFX solution – which we do have to utilize a certain amount of the time – but it isn’t the first solution we look to.”

Jackson finds Nolan’s working style unique in the modern film world. “It’s certainly apart from any other high-end Hollywood movie,” he acknowledges, “and it is interesting to me that while he focuses on some aspects with extreme detail, others don’t concern him at all. I found myself worrying about certain bits on Dunkirk, but it turned out his instincts to ignore those were correct, so this time I’m settling in more with thinking the way he does, feeling more tuned into his processes, which has made for a very enjoyable shoot. There are scenes in the film where some of the action is seen happening backwards as the main characters are continuing ahead. It was a major challenge to figure out how to shoot this and get the forwards/backwards parts to live together in the final. That occupied a lot of time for the on-set VFX team, figuring out what we could film [with a camera running in reverse] versus what was impossible and would have to be created in post.”

The car chase is a prime example. “There was a technical previs, which used the computer to visualize aspects that would be more difficult to shoot,” Jackson clarifies. “The whole 3D animation toolset is very useful with these shots, as it lets you scroll right through a scene quickly in either forward or reverse to see how those parts fit together while testing the timing and beats of the events in a shot.”

More conventional VFX work was used on a scene featuring principals on a sailing yacht sporting a hydrofoil. “There were issues with getting the cast members on an actual working boat, which is the equivalent of being on a Grand Prix racer,” says Jackson. “So we did some shots with the real actors on the boat as it was towed, with no sail on it. Then we added the rigid sail, which was like a wing, as a full CG element. For shots of the real boat with a regular crew operating it at sea, we did face replacements, putting our actors’ features over the guys doing the work. Then, for the opera house, we had to do a comp to put people in the audience beneath an explosion, because it wouldn’t have been safe otherwise. There are films where an issue of safety arises and VFX needs to create a CG character to stand in for the performer, but that is really the exception to the rule with Chris’ films.”

Fisher’s end of the opera house scene, which was shot in Estonia, involved extensive testing for blowing out the windows. “It was an unused location, which sounds like a good candidate for film work,” he reports. “But it was also a local landmark, so we couldn’t just do whatever we wanted and restore it afterwards. We knew there was a significant potential fire hazard, so that meant doing a lot of testing and cleaning, removing dust. Then, for the big jet plane scene, production built a set around an available plane, and our end involved towing it and devising a control system for starting and stopping it in a repeatable and safe fashion.”

While Nolan had initially thought the plane crash would involve model work, that not-quite-lost art did feature in aerial scenes. “We built all those model planes for Dunkirk,” recalls Fisher, “which involved me digging up a lot of guys to do those miniatures and then to fly them with remote control. Nobody ever questions those shots, because they looked good and moved credibly, so going that route again for the Tenet helicopters made good sense.”
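The forward/reverse scrubbing Jackson describes is, at its core, a bookkeeping problem: inverted elements run their own clocks backwards against scene time. DNEG’s previs tools are proprietary, so the following is only a toy model with invented beats, but it shows the mapping:

from dataclasses import dataclass
from typing import Optional

@dataclass
class Beat:
    label: str
    start: float    # scene time, seconds
    end: float
    inverted: bool  # True if this element experiences time in reverse

BEATS = [
    Beat("hero car drives forward", 0.0, 20.0, inverted=False),
    Beat("inverted car un-crashes", 5.0, 15.0, inverted=True),
]

def local_time(beat: Beat, scene_t: float) -> Optional[float]:
    """Map scene time onto the element's own clock (None if inactive).

    A normal element's clock runs with the scene; an inverted element's
    clock runs backwards, so its action plays in reverse as scene time
    advances."""
    if not (beat.start <= scene_t <= beat.end):
        return None
    return (beat.end - scene_t) if beat.inverted else (scene_t - beat.start)

def scrub(scene_t: float) -> None:
    """Report where every element sits at one scrubbed frame."""
    for beat in BEATS:
        t = local_time(beat, scene_t)
        if t is not None:
            print(f"{beat.label}: local t = {t:.2f}s at scene t = {scene_t:.2f}s")

scrub(10.0)  # inspect the midpoint of the chase in either direction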

TOP: Elizabeth Debicki and Kenneth Branagh. MIDDLE: The car chase required a technical previs in order to work out the various beats and story points. DNEG’s 3D animation toolset facilitated planning of forward and backward action for the reversed time moments. BOTTOM: Christopher Nolan, Visual Effects Supervisor Andrew Jackson, Fisher and Production Designer Nathan Crowley block out the plane crash sequence with a tabletop model. (Photo: Melinda Sue Gordon)

FALL 2020 VFXVOICE.COM • 29


COVER

TOP: John David Washington and Elizabeth Debicki. MIDDLE: Tenet marks a return on filmmaker Christopher Nolan’s part to sci-fi technothriller territory a la his Inception. As is his preference, in-camera effects were employed whenever possible. BOTTOM: Special Effects Supervisor Scott Fisher inspects his department’s bullet hits. Achieving such effects practically on set is less and less common on other productions, where the speed of shooting prohibits time to reset and leaves such work for post. (Photo: Melinda Sue Gordon)

28 • VFXVOICE.COM FALL 2020



COVER

“What’s nice now is that the factor of newer technologies can enhance practical effects approaches, letting us raise the bar when doing many of the things we used to do back in the ’90s, but in bigger ways, with much greater control, including repeatability.” —Scott R. Fisher, Special Effects Supervisor

Jackson’s team helped Fisher with the construction of these large-scale models, which – along with the occasional CG copter – augmented production’s four actual Chinooks. “The flying R/C miniatures didn’t wind up in too many shots, but there are some very nice ones,” he states. “I was operating them from another boat as they seemed to be hovering over a real full-sized seagoing vessel. It was all forced perspective, with the models lined up in the foreground.”

With a post-production workflow that eschews the digital intermediate process in favor of a photochemically-timed finish, Nolan’s films utilize actual filmouts from VFX vendors. “For his post pipeline, we output the negative, and that gets color-timed and cut in with the rest of the film,” Jackson explains. “Also, Chris prefers to avoid scanning negative except when absolutely necessary. So, as was the case on Dunkirk, we again didn’t use any bluescreen or greenscreen, and didn’t have to remove tracking markers, which is definitely preferable from his point of view. There’s still a fair amount of roto, including incidental cleanup of rigs. For example, that was pretty much all of our end of things on the jumps and ascents done on the building in Mumbai.” [Stunt coordinator George Cottle built a structure on the outside of the building with computer-controlled winches that worked like motion-control to raise and lower the performers.]

While the scale of many modern blockbusters can seem daunting, the methods used to achieve these effects can change their actual impact. “These days, moviegoers can become spoiled by all the spectacle, because there’s literally nothing you can’t create anymore in a big VFX-heavy film,” declares Jackson. “But viewers know things in Nolan films are being done for real. Seeing something done without our typical brand of trickery creates a sense of awe, harkening back to a different era.” Fisher firmly agrees. “Chris has really integrated a sense of showmanship into his brand, so people expect a really big show, and he hasn’t disappointed yet. He gives people a desire and reason to go see movies on a big screen.”

Finishing on film has to date been an important ingredient of Nolan’s films. “That’s becoming harder to do given the lack of sustaining infrastructure,” Chambers relates. “He’s a believer in the theatrical cinema-going experience, trying almost single-handedly to keep it alive. The pandemic certainly impacted post for us and how we finish. Not detrimentally – since it didn’t hurt the quality of the final product – but in methodology. The million-dollar question right now is, how are we all going forward?”

TOP: Originally conceived as a showcase for miniatures, the plane crash effect was achieved using an actual jet plane, with destroyable buildings fabricated for its impact. (Photo: Melinda Sue Gordon) MIDDLE AND BOTTOM: A large explosion devastates an opera house. Fisher executed various blasts for the interior and exterior views.
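Fisher’s forced-perspective setup rests on simple geometry: the camera records angular size, which scales with an object’s width divided by its distance, so shrinking both by the same factor leaves the silhouette in frame unchanged. The quick check below is purely illustrative Python with invented numbers – the production’s actual model scales and distances aren’t on record here.

import math

# Illustrative forced-perspective check (invented numbers, not the
# production's real figures). Angular size in frame depends only on
# width / distance, so a scale model at a proportionally reduced
# distance subtends the same angle as the full-size aircraft.

def angular_size_deg(width_m, distance_m):
    return math.degrees(2 * math.atan((width_m / 2) / distance_m))

full_size = angular_size_deg(30.0, 400.0)  # ~30 m helicopter, 400 m away
quarter = angular_size_deg(7.5, 100.0)     # 1/4-scale model, 100 m away
print(f"full-size: {full_size:.3f} deg, model: {quarter:.3f} deg")  # equal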

30 • VFXVOICE.COM FALL 2020




PROFILE

SHANNAN LOUIS: TAKING ON THE CHALLENGE OF STARTING A VFX STUDIO IN 2020 By IAN FAILES

All images courtesy of FatBelly VFX. Photos by Sean Coonce. TOP: Shannan Louis started FatBelly VFX in early 2020, unaware of the enormous impending impact of COVID-19. The studio was able to continue working from home.

32 • VFXVOICE.COM FALL 2020

When Shannan Louis launched a visual effects outfit in Vancouver at the start of this year, she had no idea what challenges her new business venture would bring. It wasn’t only the challenges of crewing up, gearing up or pitching for work in what is generally considered a tough industry; it was also, of course, battling through the unexpected outbreak of COVID-19. The worldwide coronavirus crisis severely impacted live-action filmmaking and forced VFX studios to have artists work remotely. But Louis’ new studio, FatBelly VFX, was engineered with a ‘working in the cloud’ approach in mind from the beginning, and that enabled it to adapt quickly to the changes that have swept through the industry. Louis shares FatBelly’s and her own VFX origin story with VFX Voice.

Louis did not start out in the creative industries. She had an earlier career in the non-profit sector before shifting to film and television, “an area that I had always been passionate about,” she says. It was actually acting that Louis had originally pursued as a potential career, along with producing and directing theater productions. “I had always admired the world of TV and film as a way to create emotions or share a story. Television was definitely a way for me to escape when I was a child. I studied documentary film production in 2013 and continue to have an avid interest and involvement in doc making, particularly editing.”

An Australian native, Louis made British Columbia her home and entered the visual effects industry with a role as studio manager at Psyop Film & TV in Vancouver in 2015. “This was back in the early days of the studio, when it had a startup feel and you often wore multiple hats. On any given day, I could be helping out with editorial cuts, or working in a production capacity, in addition to my normal day. For me that was great, providing exposure across the board and a way to learn and absorb hands-on. As the studio grew I was promoted to head of studio operations and my role homed in on higher management and operations, including overseeing IT, HR and recruitment.”

From Psyop, Louis broadened her VFX studio experience, first spending time with FuseFX on a project in a production management position and then moving on to CoSA VFX as Head of Studio. At CoSA, Louis says that “by implementing strategic leadership, establishing targeted hiring and focusing on advancing technology and streamlining work processes, I took the Vancouver studio from being an internal outsource facility to becoming a stand-alone, high-performance studio for the global CoSA company.”

While gaining that studio experience was vital, there was an entrepreneurial spirit rising. Louis calls this a “desire to create something unique that hadn’t yet been satisfied.” This set in motion a decision to create a new visual effects company, with a particular focus in mind. “FatBelly VFX focuses on personalized client relationships, engaging and investing in the crew and creating opportunities for women in creative positions,” says Louis.

“Tax credits are available in multiple provinces [in Canada], creating enticing opportunities for international productions. And we are fortunate to have such an incredible breadth of talented and accomplished artists available to us at our fingertips. This doesn’t mean we can rest on our laurels, though. There is always something new to learn and ways to improve. Technology is constantly evolving, and adapting to those changes is how we’ll continue to be leaders in the industry.” —Shannan Louis, Founder and Head of Studio, FatBelly VFX

What makes Louis’ bold move to open a visual effects studio this year perhaps even bolder is that she admits she was not really aware of how prevalent VFX in film and television was prior to her first job in the industry. “I think people assume that VFX is only explosions or dragons, but it’s the subtlety of the work that is incredible,” she says. “VFX is everywhere and people aren’t conscious of it. I’m constantly amazed at the work that is created.”

“Booming!” is how Louis describes the Canadian visual effects industry, which houses many large, medium and smaller-sized studios. That means there is already a large pool of VFX artists working here, in a convenient time zone for Hollywood production. Vancouver, therefore, was where Louis decided to start FatBelly – a name, by the way, conceived to match the idea of working to keep one’s belly full.

In setting up in Vancouver – where both significant live-action production and post-production occurs – Louis notes that “tax

TOP: FatBelly VFX’s office space on West Pender in downtown Vancouver. BOTTOM: The FatBelly VFX crew re-group following the coronavirus crisis.

FALL 2020 VFXVOICE.COM • 33




PROFILE

“FatBelly VFX focuses on personalized client relationships, engaging and investing in the crew and creating opportunities for women in creative positions.” —Shannan Louis, Founder and Head of Studio, FatBelly VFX

TOP LEFT: Taylor LeBlanc works on a shot at the FatBelly VFX offices. TOP RIGHT: FatBelly VFX crew member Florian Schuck. BOTTOM: Ben Case talks with FatBelly VFX Founder and Head of Studio Shannan Louis.

34 • VFXVOICE.COM FALL 2020

credits are available in multiple provinces [in Canada], creating enticing opportunities for international productions. And we are fortunate to have such an incredible breadth of talented and accomplished artists available to us at our fingertips.

“This doesn’t mean we can rest on our laurels, though,” adds Louis. “There is always something new to learn and ways to improve. Technology is constantly evolving, and adapting to those changes is how we’ll continue to be leaders in the industry.”

Indeed, Louis saw technology as a crucial part of her start-up VFX studio. “I wanted to ensure that FatBelly had a cost-effective, scalable and current solution. This is where the cloud platform we implemented through Arch Platform Technologies really shines. Arch is a plug-and-play, cloud-based, highly secure VFX platform.” Arch enabled FatBelly artists to access virtual workstations in the cloud, something that could be utilized both when working from home and in a studio environment.

“As for setup,” continues Louis, “we opened the doors to our satellite office back in February. After fleshing out a unified vision with our core team, and stress testing the pipeline, we moved into our permanent location on West Pender in downtown Vancouver.”

On March 17 of this year, a public health emergency was declared in British Columbia, followed by various states of emergency. As with just about all other locations where visual effects are crafted, businesses in Vancouver began remote-working preparation around this time. The impact was obvious. Whereas crews generally spent time in offices together to work on and discuss VFX shots, now they had to do so in a distributed fashion, from home. This brought up issues of access to files, sharing data and security for the brand-new company.

“Like many other studios,” attests Louis, “FatBelly was definitely affected by the shutdown. But due to our cloud platform we were able to seamlessly commence working from home immediately, without interruptions. Our focus was, and will always be, the health of the crew and our community. And although there’s no way we could have predicted something like this, the crew embraced the

challenge of the situation with positivity.”

So positive, in fact, that Louis notes some of her crew made several innovations during lockdown. “When not working on FatBelly projects, our comp supervisor, Florian Schuck, designed and 3D printed a flight control replica to use for PC flight simulation. And keeping the team connected socially, we delved into the virtual world via games. I’ve got to admit, they even got me slightly addicted to Stardew Valley.”

As a young company, FatBelly has been specializing in 2D visual effects, but is not limited to just that kind of work. In terms of the clientele, Louis advises that “given current market trends, we’re currently focused on streaming content. We are also making connections with independent film companies, ad agencies and local Canadian productions. We believe that we can provide a phenomenal service across multiple media and entertainment platforms.

“One of our active shows is Another Life, Season 2, for Netflix,” details Louis, noting that the show is about the aftermath of an alien ship that lands on Earth. “We’re working with a great VFX producer and supervisor on the client side and have established an amicable partnership with them.”

In these early stages, Louis says one of the most challenging sides of establishing a new company is business development. “Our artists are masters in their domains. However, it’s difficult to showcase their talent in a public way until you have a complete company showreel. That being said, our reputation speaks for itself – the work is coming in and I know it’s just the beginning.”

Louis acknowledges she has developed a certain kind of resilience, something that she’s looking to use in navigating the VFX industry as a company owner/manager as well. “I’ve learned so much over my career. Establishing trust and maintaining integrity with clients is imperative, as is transparency with employees. Businesses don’t just thrive on their own. It takes the input of every single employee, from production assistants to producers, roto artists to supervisors, to ensure success.”
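The virtual-workstation model Louis describes has become a common pattern across the industry: GPU machines live in the cloud, artists connect over a remote display protocol, and capacity scales with the crew. Arch’s own tooling isn’t documented here, so the sketch below is only a generic illustration of that pattern, written against AWS’s public boto3 API rather than Arch’s platform – the AMI, instance type and tags are placeholders.

import boto3

# Generic sketch of the cloud virtual-workstation pattern (NOT Arch
# Platform Technologies' actual API). Written against AWS's public
# boto3 library; the AMI ID, instance type and tags are placeholders,
# and running it requires valid AWS credentials.

ec2 = boto3.client("ec2", region_name="us-west-2")

def launch_workstation(artist, ami_id="ami-0123456789abcdef0"):
    # One GPU instance per artist; studios typically bake their DCC
    # apps and licensing into the machine image and reach it over a
    # remote display protocol such as PCoIP or NICE DCV.
    resp = ec2.run_instances(
        ImageId=ami_id,
        InstanceType="g4dn.4xlarge",  # GPU class suited to comp work
        MinCount=1,
        MaxCount=1,
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "artist", "Value": artist},
                     {"Key": "purpose", "Value": "vfx-workstation"}],
        }],
    )
    return resp["Instances"][0]["InstanceId"]

def stop_workstation(instance_id):
    # Stopping (rather than terminating) keeps the artist's disk state
    # while shutting off most of the hourly cost overnight.
    ec2.stop_instances(InstanceIds=[instance_id])

print(launch_workstation("comp-artist-01"))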

TOP LEFT: Adela Baborova at FatBelly’s premises. TOP RIGHT: Michelle Ross composites a shot in Nuke. MIDDLE: Jenna Sunde and Florian Schuck discuss a scene. BOTTOM: Ben Case and Adela Baborova enjoy some down-time at the FatBelly VFX studio.

FALL 2020 VFXVOICE.COM • 35




FILM

WARRIOR PRINCESS RIDES AGAIN IN LIVE-ACTION MULAN By TREVOR HOGG

Director Niki Caro Images copyright © 2019 Disney Enterprises, Inc.

36 • VFXVOICE.COM FALL 2020

Going from an independent film produced in New Zealand on a budget of $3.5 million to a Hollywood adaptation with a production cost of $200 million, director Niki Caro has kept things in perspective and has retained the same ethos for Mulan, the live-action re-make of the 1998 Disney animated film, that she had when making her international breakthrough Whale Rider (2002). “I got to work with all of the best tools in the toolbox on this one,” says Caro. “I loved every second of it and being able to stretch my filmmaking to this genre and scale. But the storytelling is exactly the same.”

Caro had previously worked with special and visual effects, but nothing at this massive scale before. “No, but, boy, was that latent within me. I was provided an opportunity by Disney to show them my vision, and was given a team of people and previs. The first thing that I did was previs the biggest sequence in the film.”

In order to repel the invading Rourans, the Chinese Imperial Army conscripts one male per family, which causes Hua Mulan (Liu Yifei) to take the place of her father by posing as a man. “I was leery of visual effects,” admits Caro. “I’m an in-camera girl, and the film was built more like Lawrence of Arabia than Marvel. We were out in real landscapes, had real horses charging across real battlegrounds, and real actors and stunt people.”

Recruited to be the Visual Effects Supervisor on the production was Sean Faden (Power Rangers). “The wonderful thing about Sean as a member of the team,” says Caro, “is that he’s an artist, as all of my heads of departments are. Sean appreciated and embraced

the need for visual effects to be subtle and integrated into the real world. In the visual effects realm, you can do so much, and it was up to me and Sean to get our vendors to pull it back in line with the real cinematic approach of the movie.”

In all, 2,046 visual effects shots were produced by Weta Digital, Sony Pictures Imageworks, Image Engine, Framestore, Crafty Apes and an in-house team. “Grant Major [production designer] and Niki chose parts of New Zealand that could be seen as Chinese landscape, and then anything we wanted to look more [authentically] Chinese, especially all of the shooting in the Imperial City, we went to a backlot in China and shot there for three weeks with second unit,” states Visual Effects Producer Diana Giorgiutti.

“This is the only time I’ve had a year in post,” she adds. “Having that extra time for the Phoenix [Mulan’s spiritual guide] was a blessing, because there are versions of the cut that are entirely different to where we landed in the end. The Phoenix is only in 19 shots versus close to 100 of them. The Phoenix now is a much more

OPPOSITE TOP: The rooftop chase was a combination of second unit footage captured at Xiangyang Tangcheng Film and Television Base in China, and the main unit shooting closeups of Mulan (Liu Yifei) against greenscreen at Kumeu Film Studios in New Zealand. TOP: Steam was an important element in depicting a bloodless but visceral battle sequence. MIDDLE LEFT TO RIGHT: Sean Faden, Visual Effects Supervisor; Diana Giorgiutti, Visual Effects Producer; Anders Langlands, Visual Effects Supervisor, Weta Digital; Christian Irles, Visual Effects Supervisor, Image Engine; Hubert Maston, Visual Effects Supervisor, Framestore; Seth Maury, Visual Effects Supervisor, Sony Pictures Imageworks; Rpin Suwannath, Previs Supervisor, Day for Nite; Cody Hernandez, Postvis Supervisor, Day for Nite

FALL 2020 VFXVOICE.COM • 37




FILM

“I was leery of visual effects. I’m an in-camera girl, and the film was built more like Lawrence of Arabia than Marvel. We were out in real landscapes, had real horses charging across real battlegrounds, and real actors and stunt people.” —Niki Caro, Director

TOP: A gully was created by Sony Pictures Imageworks to absorb most of the impact of the avalanche as well as help with the integration of the Double Cone Mountains with the Ahuriri Valley. BOTTOM: Weta Digital replaced most of the dirty stairs and added a full CG palace and city around the young soldier.

38 • VFXVOICE.COM FALL 2020

mythical creature that guides Mulan, which is right. Originally, it was much more comic.”

In wuxia martial arts movie tradition, Böri Khan (Jason Scott Lee) and the Rouran soldiers leap off of their horses and run up the wall of the desert garrison. “There was a lot of wire rig work to get people up on the walls,” notes Faden. “We settled on a 20-degree slope that they ran up against a greenscreen and ended up rolling our world around them. We did a bunch of takes, but the winner was the one that had the best stunt performance for Böri Khan and the guy right in front of him. The other two guys in the back, we ended up doing some adjustments on them. Because they were literally jumping off of moving sleds onto a partial wall, most of that shot was CG aside from their performance. We replaced the wall because it wasn’t as detailed as our digital asset. We ended up doing a face replacement for Jason Scott Lee as well.”

A massive environment build was the Imperial City, based on Chang’an, which was one of the largest cities in the world during the 6th century. “We looked at historical maps of the city to figure out size and shape, how the different districts were broken down into individual wards and how those were populated with buildings, and where the richer and poorer areas would have been,” explains Weta Digital Visual Effects Supervisor Anders Langlands. “We looked at things like where high-ranking government officials were living, deciding that’s where the richer parts of the city were and populating those with bigger and more refined buildings. All of our buildings had a poor, middle-class and rich version. Then we would use those to construct compounds that they sat in, at different sizes, based on how wealthy we assumed the area to be, and that fit into the procedural layout system that our layout team built in Houdini, which was able to quickly fill the whole city with buildings and put walls around the wards, all laid out according to this plan based on the structure of the real city. The system was able to adapt to things, like if you wanted to change the path of the river, then it could re-populate the buildings automatically based on where the river was going, or if we wanted to extend the city.”

Not all of the visual effects were epic in scope, such as the matchmaker scene when a tiny spider interrupts a tea ceremony with comedic results. “The spider was the first sequence that we had to deliver in its entirety for [biennial Disney exposition event] D23 last August,” remarks Image Engine Visual Effects Supervisor Christian Irles. “It helped us get our act together and get everything working. Sean spent tons of time prevising and postvising every single shot, so by the time we got the sequence, we had a good indication of the starting and end points of the spider on every single shot. The way we approached all of those shots was using the postvis as a reference for the onscreen position of the spider. Match move every shot, run it through layouts to make sure it was in the right position in the environment, and hand it to animation to make sure they added a nice polish and character to the spider.”

Framestore was responsible for most of the training sequence, which features Mulan going up a mountain with buckets filled with water on a stick. “That was a lot of world-building trying to create this gigantic wall that is around her,” admits Framestore Visual

Effects Supervisor Hubert Maston. “One of the things that was always a priority for Niki was everything had to look epic. You have to have the feeling that you have thousands of soldiers coming in to train in that camp. They had to be integrated into a large-scale environment and at the same time not constantly be a small speck of color on the screen. We had to make sure that nothing in the background or midground or around Mulan distracted from her. The live element that we had was mostly around the characters. We used a lot of drone footage for the far backgrounds and matte paintings.”

Transporting Mulan in and out of battle is her trusted steed Black Wind. “Sean told me when we first met, ‘I like to do the ‘Pepsi Challenge.’ You put your asset right next to the real thing and if we can’t tell the difference then we’re good to go,’” remarks Sony Pictures Imageworks Visual Effects Supervisor Seth Maury. “There was some second unit footage of a stunt rider on the Black Wind horse and we used that. We rotomated that horse and animated a run cycle to match with what that horse was doing. We have a new hair system called Fiber, which is essentially a more interactive hair placement tool where you can put down guide curves and see the result of all of the other hair in real-time. We chose slow-motion 6K footage of Black Wind running full speed down a canyon and matched the lighting. It went over well. The only way that you could tell it wasn’t a practical horse is that ours didn’t have a rider, because we never built the CG Mulan. We had some new in-house tools to do muscle simulations with and great CFX to simulate the saddle and reins. We paid a lot of attention to the small details.”

Previs and postvis were completed by Day for Nite. “The nice thing about this one is the previs process was director-driven,” states Day for Nite Previs Supervisor Rpin Suwannath. “We would do the previs and have a review with Niki and all of the department heads. It wasn’t like we were trying to anticipate or guess what Niki wanted. She would be there to field all of the questions.”

The lower portion of the bamboo palace was an outdoor set, while the top section was shot on a soundstage. “You always have to think about where they are spinning on the beam because the sun is only one way in the world,” remarks Day for Nite Postvis Supervisor Cody Hernandez. “For me that was a challenge because I had to make sure that the artists were constantly talking to each other about where their shot was and the next shot was so that the sun placement on certain posts was in the right direction. We ended up redoing that sequence three or four times completely.”

‘Cloth Fu’ moments happen when the sleeves of the Witch (Gong Li) grab hold of an opponent. “Because her outfit was designed with big sleeves, stunts saw that and thought, ‘How can we use this?’” recalls Giorgiutti. “They had plastic bags, cloth and dust flipping around with basic After Effects work to pitch it all. Niki loved the idea. That was never in the original script.”

Special effects led by Steve Ingram (Ghost in the Shell) made significant contributions with the avalanche sequence by building a horse rig for Mulan and a blue conveyor belt for when Honghui (Yoson An) gets swept away by an ice chunk flow. “We had to design the avalanche so it would lose some of its deadliness before

“[Visual Effects Supervisor] Sean [Faden] told me when we first met, ‘I like to do the ‘Pepsi Challenge.’ You put your asset right next to the real thing and if we can’t tell the difference then we’re good to go.’” —Seth Maury, Visual Effects Supervisor, Sony Pictures Imageworks

TOP: Along with being responsible for recreating the Imperial City, Weta Digital produced the CG hawk into which the Witch (Gong Li) transforms. BOTTOM: The battle sequence which occurs in the middle of the movie was the focus of a 14-minute pitchvis presented by filmmaker Niki Caro to Disney studio executives.

FALL 2020 VFXVOICE.COM • 39




FILM

“The biggest challenge on this was finding a tone with the visual effects that could support the grand scope of the photography and expand it when necessary. A long time was spent gathering photography and drone reference in China that are actual shots in the movie. We would enhance the China footage to make it feel more period-correct and give it the expanse that it didn’t have in the original photography.” —Sean Faden, Visual Effects Supervisor

it got to the Chinese army that was still running away from it,” reveals Faden. “The solution for that was to create a big gully at the base of the hill that it descends down. When the avalanche reaches the bottom, it shoots up and a lot of that energy is dissipated upward, but there is still flow of ice and debris that has momentum and is pushing forward.”

A particular scene always gets Giorgiutti teary-eyed. “There’s a moment where Mulan realizes that truth is more important and strips off all of her armor as she rides down a canyon on Black Wind back into battle. Mulan doesn’t care about the fact that she is probably going to be expelled from the army but wants to support her comrades. It is amazing.”

Making sure that the visual effects furthered rather than distracted from the storytelling was critical. “The biggest challenge on this was finding a tone with the visual effects that could support the grand scope of the photography and expand it when necessary,” notes Faden. “A long time was spent gathering photography and drone reference in China that are actual shots in the movie. We would enhance the China footage to make it feel more period-correct and give it the expanse that it didn’t have in the original photography. Visual effects wouldn’t have been able to get through what we had to get through without having the trust of Niki and Mandy [Walker].”
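Langlands’ ward-by-ward layout system is proprietary and was built in Houdini, but the pattern he describes – give each ward a wealth tier, fill its cells from a matching tier of building assets, and leave features such as the river clear so a re-route automatically re-flows the surrounding blocks – can be hinted at with a toy sketch. The Python below is purely illustrative; every name and number in it is invented.

import random

# Toy sketch of ward-based procedural city layout (hypothetical; Weta's
# production system was built in Houdini and is far more elaborate).
# Each ward is filled from a wealth-tiered building set, and cells on
# the river path are skipped, so re-routing the river and re-running
# the layout re-flows the surrounding buildings automatically.

BUILDING_TIERS = {"poor": "hut", "middle": "courtyard_house", "rich": "mansion"}

def layout_city(wards, river_cells, seed=7):
    rng = random.Random(seed)  # deterministic, so layouts are repeatable
    placements = []
    for ward in wards:
        asset = BUILDING_TIERS[ward["wealth"]]
        for cell in ward["cells"]:
            if cell in river_cells:
                continue       # leave the river clear
            # Jitter scale so same-tier compounds don't look cloned.
            placements.append({"cell": cell, "asset": asset,
                               "scale": round(rng.uniform(0.8, 1.2), 2)})
    return placements

wards = [
    {"wealth": "rich", "cells": [(0, 0), (0, 1)]},   # near the palace
    {"wealth": "poor", "cells": [(5, 0), (5, 1), (5, 2)]},
]
for p in layout_city(wards, river_cells={(5, 1)}):
    print(p)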
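Likewise, the Fiber grooming workflow Maury describes – lay down a few guide curves and watch the whole coat update in real-time – comes down to interpolating every rendered hair from its nearest guides. A toy version, again hypothetical Python rather than Imageworks’ actual system, might weight the guides by inverse distance from each hair root:

import numpy as np

# Toy sketch of guide-curve hair interpolation (hypothetical; not the
# actual Fiber system). Each rendered hair is an inverse-distance-
# weighted blend of the guide curves, translated to its own root, so
# editing a handful of guides instantly reshapes the whole coat.

def interpolate_hair(root, guides, power=2.0):
    # guides: list of (root_position, curve_points) pairs, where
    # curve_points is an (n, 3) array of points along the guide.
    roots = np.array([g[0] for g in guides])
    dists = np.linalg.norm(roots - root, axis=1) + 1e-6
    weights = 1.0 / dists**power
    weights /= weights.sum()
    curves = np.array([g[1] for g in guides])          # (k, n, 3)
    blended = np.einsum("k,knj->nj", weights, curves)  # weighted blend
    return blended - blended[0] + root                 # grow from this root

guides = [
    ((0.0, 0.0, 0.0), np.array([[0, 0, 0], [0, 1, 0.2], [0, 2, 0.5]])),
    ((1.0, 0.0, 0.0), np.array([[1, 0, 0], [1, 1, -0.2], [1, 2, -0.5]])),
]
hair = interpolate_hair(np.array([0.25, 0.0, 0.0]), guides)
print(hair.round(2))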

TOP TO BOTTOM: Mulan (Liu Yifei) takes the place of her conscripted father in the Imperial Army by posing as his son. Sony Pictures Imageworks settled upon a soft, fluffy look for the snow in the avalanche sequence. Böri Khan (Jason Scott Lee) leads the Shadow Warriors against the Imperial Army. One of the things that was always a priority for filmmaker Niki Caro was that everything had to look epic. Framestore was responsible for doing set extensions and increasing the number of soldiers for the training sequences. The hometown of Mulan (Liu Yifei) was a combination of footage shot at Xiaochun Tulou, China, and set footage captured in New Zealand.

40 • VFXVOICE.COM FALL 2020



TV/STREAMING

NEW VISUAL VISTAS PROPEL THE EXPANSE INTO HIGHER ORBIT By TREVOR HOGG

Images courtesy of Amazon Studios. TOP: From left, Rocinante crewmates Alex Kamal (Cas Anvar), Amos Burton (Wes Chatham), Naomi Nagata (Dominique Tipper) and Jim Holden (Steven Strait) land on Ilus.

42 • VFXVOICE.COM FALL 2020

Human colonization of the solar system has led to a mining colony in an asteroid belt being caught in an escalating political and military conflict between Earth and Mars. Amidst this hostile environment is a conspiracy to weaponize an alien pathogen that has a mind of its own. The Expanse spans eight novels, three short stories and five novellas authored by Daniel Abraham and Ty Franck under the pen name of James S. A. Corey. When the Syfy Network decided not to renew the television adaptation of the science fiction mystery series created by Mark Fergus and Hawk Ostby, Amazon Studios came to the rescue of fans, releasing a fourth season, and is currently producing a fifth. “There were some changes in terms of how we delivered things such as switching to HDR, which was a big plus for us,” notes Executive Producer and Showrunner Naren Shankar. “We were able to develop some much more interesting looks in Season 4, such as all of the stuff on the planet [of Ilus] being in anamorphic 2.39:1.” “One switch that did happen over the course of the seasons was a reliance on Redshift and other GPU renderers that allow us a lot more iterations in a much more efficient way,” remarks Senior Visual Effects Supervisor Bret Culp. “We’re constantly improving everything. We went back to one of our big locations from Seasons 2 and 3, and gave it a big Season 5 upgrade by putting a lot more detail in the geometry.” An extensive amount of bluescreen is utilized to achieve the necessary size and scale. “A whole section of the biggest stage that we use is cordoned off as a pre-walled bluescreen stage. We shoot a lot of the wire stuff there. There are so many sets on this show, we put them on wheels and roll them in and out, like a Tetris game. The height of the Martian construction dome from Season 4 would be a third higher than the CN Tower, so we are clearly building a small portion of that and extending.” A director is responsible for a pair of consecutive episodes, resulting in even and odd blocks which are divided between two

visual effects teams. Visual Effects Supervisor Robert Crowther and Visual Effects Producer Sarah Wormsbecher are responsible for the even blocks, while Visual Effects Producer Krista Allain focuses on the odd blocks. “The books are written with a scientific angle, and Naren has a background in that as well,” states Culp. “We definitely lean heavily on the science and have embraced it.” A multi-Oscar-winning sci-fi thriller by filmmaker Alfonso Cuarón was also a source of inspiration. “We did talk a lot about Gravity in the early years because that was a great model for what we were

TOP: David Strathairn portrays Commander Klaes Ashford, who returns in Season 4 as the second-in-command of the Behemoth. BOTTOM: Cas Anvar portrays veteran Martian Navy pilot Alex Kamal, who is responsible for flying the Rocinante.

FALL 2020 VFXVOICE.COM • 43



TV/STREAMING

trying to achieve,” remarks Shankar. “The Expanse wasn’t going to be Star Trek and we didn’t want to do World War II fighter planes in the Pacific. We wanted these ships to move like real ships and to have the space battles be real.” Season 4 had around 2,400 visual effects shots across the 10 episodes created by Spin VFX, Rocket Science VFX, Krow VFX, Mavericks VFX, MARZ, Torpedo Pictures, Deluxe VFX, Switch VFX and Playfight. “Luckily, we have a group of incredible vendors, ones that have been with us for almost the entire duration of the series,” remarks Allain. “They work closely with us to manage the schedule. A lot of shots start much earlier than the wrap of production.” Sharing of assets and shots among the vendors is part of the post-production process. “We usually allocate the work based on the skillset and background of a vendor,” notes Wormsbecher. “You get to work with a vendor on a certain type of effect or shot for this show, and then it makes sense to go to them for that specific thing. There are some rare cases where maybe three or four vendors might touch one shot.” The Expanse has its own dedicated team of concept artists. “We actually start our concept art team with the writers’ room, so by the time we get to prep for the beginning of the season we have a lot of environments already created partially because there are so many of them in the show,” remarks Culp. “The lead time on getting these done is fairly long because they’re very detailed.” There is also a practical component that assists with shooting scenes. “I find it helpful on set to have something that the director can look at and the actors have an idea, rather than flailing around or trying

44 • VFXVOICE.COM FALL 2020

to imagine what’s going to be there,” states Crowther. “The entire space battle this year was previsualized so the director would know the kinds of movements the ships were going to be making and the actors could react appropriately.” The protomolecule, an alien pathogen, has taken on many forms including that of a deceased detective from Ceres Station. “Season 3 was about the hybrid weapons, and in Season 4 a whole planet was based on it and there was Proto-Miller,” remarks Culp. “The protomolecule is using Joe Miller’s (Thomas Jane) essence to do its work. It was interesting bringing Miller back, but it wasn’t really Miller. There were ways that we manipulated the image in order to show that, such as using a glitchy effect. It manifests itself in everything from Proto-Miller, hybrids and the giant structures on Ilus that become awakened. It’s constant evolution and we’re not done yet.” Abrupt glitches occur in the image of Joe Miller to emphasize his ongoing battle with the protomolecule. “I was inspired by an effect that David Lynch did with the Twin Peaks rebirth. It was a back-and-forth time thing that I had done years before using a piece of SideFX software called Tima. Where the image in the alpha channel was white, you might see four frames into the future; where it was black, you might be four frames in the past; and middle grey would be the current time. You can get into some interesting ideas of manipulating the time within a single image. It ended up becoming more of an editorial thing in many cases than that time filter effect.” Abstract state-of-consciousness moments between Miller and the Captain of the Rocinante, James Holden (Steven Strait),

“We did talk a lot about Gravity in the early years because that was a great model for what we were trying to achieve. The Expanse wasn’t going to be Star Trek and we didn’t want to do World War II fighter planes in the Pacific. We wanted these ships to move like real ships and to have the space battles be real.” —Naren Shankar, Executive Producer/Showrunner

OPPOSITE TOP: Dominique Tipper portrays talented Belter engineer Naomi Nagata, who has never set foot on a planet before arriving on Ilus. OPPOSITE BOTTOM: The previs, bluescreen plate photography and final composite of the tether spacewalk between the Rocinante and Barbapiccola above Ilus. TOP: Shohreh Aghdashloo gets ready on set for her role of Her Excellency Chrisjen Avasarala, Secretary-General of the United Nations.

FALL 2020 VFXVOICE.COM • 45
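The matte-driven time trick Culp describes (white pixels sampling frames from the future, black pixels from the past, mid-grey staying in the present) is essentially a per-pixel time displacement, an effect most compositing packages offer in some form. The sketch below is a minimal, hypothetical illustration of the idea, not the actual SideFX setup he mentions:

```python
import numpy as np

def time_displace(frames, matte, max_offset=4):
    """Per-pixel time offset driven by a grey-scale matte: white pixels
    sample frames from the future, black pixels from the past, and
    mid-grey stays at the current time."""
    n, h, w, _ = frames.shape
    # Map matte values in [0, 1] to signed offsets in [-max_offset, +max_offset].
    offset = np.rint((matte - 0.5) * 2 * max_offset).astype(int)
    ys, xs = np.mgrid[0:h, 0:w]
    out = np.empty_like(frames)
    for t in range(n):
        src = np.clip(t + offset, 0, n - 1)  # which frame each pixel samples
        out[t] = frames[src, ys, xs]         # per-pixel gather across time
    return out
```

Driving the matte with animated noise produces the kind of unstable, glitchy temporal smearing used on Proto-Miller, since neighboring pixels land on different moments of the clip.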



TV/STREAMING

The Expanse of Special Effects

“[The Rocinante landing on the surface of Ilus] was so emotional because this is the first time that we see the Rocinante landing in an atmosphere environment, and Naomi Nagata (Dominique Tipper) has never set foot on a planet before. It felt like all of the seasons were leading up to that moment and the visuals did not disappoint.” —Krista Allain, Visual Effects Producer

TOP TO BOTTOM: The concept that there is a one-kilometer-long space tether between the Rocinante and Barbapiccola, with the latter catching drag, was tough to imagine. OPPOSITE TOP TO BOTTOM: For the tsunami on Ilus, special effects built the set inside an on-set pool and fired off water cannons and dump tanks.

46 • VFXVOICE.COM FALL 2020

originated from the source material. “In Season 4 particularly, we actually get inside Miller’s head and have a protomolecule POV for the first time and see how it views the world,” states Shankar. “The relationship of Miller and his protomolecule version with Holden is one of the big engines that drive the story.” Episode 409 introduces the Miller-Bot, where Miller projects himself inside of a machine and uses it as a tool to move around a room. “My initial idea was of these little learning robot things where you can bend their legs up, so instead of having four or six legs moving across the table, they only have two,” remarks Culp. “How do they adapt and learn to move out of necessity?” The Miller-Bot’s ability to speak was inspired by Chladni plates. “The Miller-Bot was created with these main plates that it used to move around, covered with these fine cilia-like metal structures that Miller would vibrate in order to talk. As that vibration happened, we would plug in clips that would interpret the actual sound of Thomas Jane speaking as Miller into a Rorschach resonance pattern that would then flip up the little cilia. That was a cool scene to do.” A major new setting for Season 4 is the Earth-like planet of Ilus, consisting of one large continent and thousands of islands scattered throughout a giant ocean. “Every season we’re world-building and it feels like a brand-new series,” remarks Allain. “Ilus was a challenge but that was the joy of it too. We shot in a cold quarry out near Hamilton [Ontario] and used a lot of practical locations that were enhanced. During our many discussions in prep we talked about what Ilus looked like. What color are plants? What vegetation is there? Are there animals? What we ended up landing upon is that there would be no tall trees. We had to remove or reduce the treeline in hundreds of shots. That’s something the average viewer would have no idea about. However, those tiny details help to make it feel real.” “The most challenging thing is wrapping your head around things that may not sound that difficult initially, like de-orbiting maneuvers where you slow your forward motion to be able to drop,” states Culp. “We’ve done a good job, and as a result it has been made clear to us that we are favorites with a lot of people at NASA and have an open invitation to visit the JPL [Jet Propulsion Laboratory].” The scene of the Rocinante landing on the surface of Ilus is a personal favorite of Allain’s. “It was so emotional because this is the first time that we see the Rocinante landing in an atmosphere environment, and Naomi Nagata (Dominique Tipper) has never set foot on a planet before. It felt like all of the seasons were leading up to that moment

A member of The Expanse production team from day one is Special Effects Coordinator Tim Barraball. “Since we’ve been doing The Expanse for so many years, there is a real family feeling to the show that makes making it that much more enjoyable. There are a few more things that visual effects do now than we did in the past, like floating objects. I made a passionate speech early on in a Season 5 meeting about actor interaction because they didn’t want to do squibs.” The zero-gravity scenes utilize the expertise of stunts and special effects. “What I created at the beginning of Season 2 was a big teeter-totter gimbal that has several different attachments on one end,” explains Barraball. “You can sit or stand or hang from it. It’s counterweighted on the back and can go on a track and pivot around. Adam Savage was famously on it as a dead spaceman at the end of Season 2.” A cold, wet and muddy quarry doubled for the planet of Ilus. “When the dust storm first hits, James Holden (Steven Strait) and another character turn around as the whole room explodes inwards,” remarks Barraball. “We did some huge air cannons outside the set full of dirt and dust, breakaway glass in the windows, and had all of the set pieces inside on little jerk rams. Everything went at once. Stunt guys were in there. For the tsunami, we built the set inside of an on-set pool. We fired off these water cannons and dump tanks. It was quite the deluge. Steven Strait was in the scene getting nailed by this water.” “For Season 4, we built these moon buggies that drove around on Ilus,” explains Barraball. “We stripped down regular golf carts, did some conceptual art and completely rebuilt them. That took several months.” A space shuttle crashes on Ilus. “We had a lot of spot fires all over the crash scene. Burning piles of steel. There were 20 to 30 characters strewn everywhere impaled by pieces of metal. It was a challenge keeping them safe, yet still creating a sense of peril. I had a great team, and it really does look like hell on Earth.”

and the visuals did not disappoint.” Shankar adds, “We had looked at controlled landing footage of a Blue Origin rocket returning to a platform and actually got to show that sequence Krista is talking about to the designers at Blue Origin in Seattle, and they loved it.” The Rocinante towing the Barbapiccola above Ilus was hard to envision. “Imagine the forces of these huge ships in orbit and the Barb is catching drag,” remarks Crowther. “Just the concept that there is this one-kilometer-long space tether between the ships. It was a tough thing to imagine and it came together well.” “We talk a lot about making visual effects feel like the shots are operated by real human beings with cameras,” remarks Shankar. “That’s not easy to do when you’re talking about realizing things in space, because the distances are big and the way objects move against different stars can be incredibly deceptive. We talk a lot about filmmaking. Everybody here is so fundamentally responsible for this great cinematic look. We have come so far in four seasons and now in a fifth. Every year we come up with new ways to destroy sets or go to places we haven’t been before.”
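Culp’s point about de-orbiting, slowing your forward motion in order to drop, is textbook orbital mechanics: a retrograde burn lowers the far side of the orbit. A quick vis-viva calculation, using Earth values purely for illustration rather than anything from the show, gives a feel for the scale involved:

```python
import math

MU_EARTH = 3.986e14  # m^3/s^2, Earth's standard gravitational parameter

def deorbit_burn(r_orbit, r_target):
    """Delta-v needed to lower the opposite side of a circular orbit:
    burning retrograde slows you down, which drops you inward."""
    v_circular = math.sqrt(MU_EARTH / r_orbit)
    a_transfer = (r_orbit + r_target) / 2            # transfer ellipse semi-major axis
    v_transfer = math.sqrt(MU_EARTH * (2 / r_orbit - 1 / a_transfer))
    return v_circular - v_transfer                   # how much to slow down

# From a 400 km circular orbit, dropping the low point to 100 km
# costs roughly 88 m/s of braking.
print(deorbit_burn(6_778_000, 6_478_000))
```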

“For Season 4, we built these moon buggies that drove around on Ilus. We stripped down regular golf carts, did some conceptual art and completely rebuilt them. That took several months. We had a lot of spot fires all over the [space shuttle] crash scene [on Ilus]. Burning piles of steel. There were 20 to 30 characters strewn everywhere impaled by pieces of metal. It was a challenge keeping them safe, yet still creating a sense of peril. I had a great team, and it really does look like hell on Earth.” —Tim Barraball, Special Effects Coordinator
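The Chladni-plate behavior Culp describes for the Miller-Bot reduces to a simple chain: analyze the dialogue audio, map its dominant frequency to a plate vibration mode, and flip up cilia wherever the standing wave is strong. A toy sketch of that chain follows; the mode mapping is an invented placeholder rather than the production setup.

```python
import numpy as np

def chladni_amplitude(m, n, grid=64):
    """Standing-wave amplitude for a square plate vibrating in mode (m, n);
    cilia would flip up where the amplitude is high, stay flat at the nodes."""
    x, y = np.meshgrid(np.linspace(0, 1, grid), np.linspace(0, 1, grid))
    plate = (np.sin(m * np.pi * x) * np.sin(n * np.pi * y)
             + np.sin(n * np.pi * x) * np.sin(m * np.pi * y))
    return np.abs(plate)

def speech_to_patterns(audio, frames=24):
    """Map each chunk of dialogue to a plate mode via its dominant FFT bin,
    yielding one resonance pattern per animation frame."""
    chunk = len(audio) // frames
    patterns = []
    for f in range(frames):
        spectrum = np.abs(np.fft.rfft(audio[f * chunk:(f + 1) * chunk]))
        bin_idx = int(spectrum.argmax())
        m, n = 1 + bin_idx % 4, 1 + (bin_idx // 4) % 4  # invented mapping
        patterns.append(chladni_amplitude(m, n))
    return patterns
```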

FALL 2020 VFXVOICE.COM • 47



VFX TRENDS

WORKING REMOTELY: HOME RULES FOR THE VFX INDUSTRY By TREVOR HOGG

TOP: A weekly remote meeting being held by SideFX. (Image courtesy of SideFX) OPPOSITE TOP: Lost Boys Studios conduct a remote 3D tracking lecture for compositors in Montreal. (Image courtesy of Lost Boys Studios)

48 • VFXVOICE.COM FALL 2020

Working remotely is part of the regular routine for the visual effects industry; however, with the global lockdown caused by the coronavirus pandemic, the sheer number of individuals requiring offsite access has been unprecedented. Initially, the remote solution was impeded by studio concerns about security, which led to an online petition from 10,000 visual effects artists to the Motion Picture Association of America, and to the Visual Effects Society releasing a statement in support of allowing artists to work remotely. For expert insight into logistical challenges ranging from relying on local Internet providers to balancing domestic and professional lives within the same space, VFX Voice “traveled” to New Zealand, the U.K., Canada and the U.S. via video conferencing, phone and email to learn about the short- and long-term impact of relying entirely on a remote workflow. Decorating his basement office in Atlanta with self-made woodcarvings is Aldo Ruggiero, Visual Effects Supervisor at Crafty Apes. “It wouldn’t work so much if one person was remote and everyone else is in the office, but this strangely works well. We’re pushing 450 shots for a Netflix show. Nobody got furloughed in my office. Crafty Apes is a smart company. They do half movies and half television shows. The companies that are having the biggest trouble are the ones working in TV because it’s week by week. You shoot something, edit, and four weeks later you’re doing visual effects. We are using a system called Teradici and connecting through VPN. I do ask people to be available between 9 a.m. and 7 p.m., and to communicate. We video conference every day. My colleagues and I know more about each other’s lives than we ever did. It actually has become a more intimate type of work. It goes back to artists working by themselves. You miss having the expertise of the people around you, and sharing ideas and opinions.” His backyard in Los Angeles serves as the conference area for Cody Hernandez, Postvis Supervisor at Day for Nite. “I was on

“It wouldn’t work so much if one person was remote and everyone else is in the office, but this strangely works well. We’re pushing 450 shots for a Netflix show. Nobody got furloughed in my office. … We are using a system called Teradici and connecting through VPN.” —Aldo Ruggiero, Visual Effects Supervisor, Crafty Apes

another show on set. I would get up in the morning, go into work, log into my computer, see what the notes are for the day from production, and go from there. It’s really the same thing here. I have an extra hour of sleep and then I start my same routine. At the end of work, instead of driving for an hour I can go for a walk or run with my family. The only trouble my wife and I have is keeping the dogs quiet when we’re in meetings. I give the dogs a bone and they’re good for the next couple of hours! If you have an artist in Spain who is an amazing animator, you can use him now as long as he has an Internet connection. It’s all going to work out in the artist’s favor.” A family expansion occurred in Vancouver for Chris Downs, Visual Effects Supervisor at DNEG. “Our second daughter was born in early March, and we’ve been home between maternity leave and lockdown since then. The biggest challenge is trying to entertain the four-year-old given that with the newborn we’re less mobile to begin with and the playgrounds are shut down. Our garage has been converted into an in-law suite, so I’m set up in there with my own workspace, which is quite nice! Week one was an eye-opener as to where things actually were with the Internet. We ended up upgrading our home Internet so I could review the final comps at an appropriate quality level and reasonable frame rate. We had never planned to be using the garage this frequently, so I had to add a Wi-Fi booster to make sure that we got it all the way out here.”
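Downs’ home-Internet upgrade is easy to motivate with a back-of-envelope calculation: streaming review-quality frames uncompressed vastly exceeds residential bandwidth, which is why remote-desktop protocols such as Teradici’s compress so aggressively. The resolution and bit depth below are illustrative assumptions:

```python
def uncompressed_mbps(width=2048, height=1080, bits_per_pixel=30, fps=24):
    """Raw bitrate of an uncompressed image stream, in megabits per second."""
    return width * height * bits_per_pixel * fps / 1e6

# Roughly 1,593 Mbps for 2K 10-bit frames at 24 fps, orders of magnitude
# beyond a typical home connection, hence compression and Wi-Fi boosters.
print(f"{uncompressed_mbps():,.0f} Mbps")
```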

Residing in the English countryside and London are the Framestore trio of Jonathan Fawkner, Creative Director of Film; Fiona Walkinshaw, Global Managing Director of Film; and Alex Webster, Managing Director of Pre-Production. “From March 16, we had a third of our people already working from home for various reasons, and everyone from March 23,” notes Walkinshaw. “It took about two weeks. For film, that was 1,600 people, but for the company as a whole it’s about 2,500. Our American offices in New York City and Los Angeles operated on Teradici anyway and are smaller, so it was easy for them to take their things and go home. For film, about 60% of people needed individual configurations or kits. Our systems teams were unbelievable and went into military mode. It was one of those instances where, in the horrible circumstances we found ourselves in, something quite positive emerged. Everyone wanted to make it work.” Webster was in the midst of establishing a new department when the lockdown occurred. “We had to push pause on the postvis projects that were shooting at that time. Simultaneously, we were delivering previs, character modeling and development, and lightweight virtual production in terms of virtual location scouting and camera sessions for other projects. We focused on getting those artists working remotely and provided them with Teradici in most instances. What complicated that is we had to get the vis and visual effects networks talking to each other for the first time. At the same time, we’re working in Unreal as well as Maya.”

FALL 2020 VFXVOICE.COM • 49



VFX TRENDS

Facility and remote workflows are quite similar. “I can drive the pictures and get full access to all of the dailies that we’re producing,” states Fawkner. “We use G Suite and have Google Meet. We have permanent meeting rooms which you can jump into. People are sticking much more to schedules because you’re not having to literally walk out of one meeting room and go around the building to another one. If I’m late to a meeting and we can’t start, then everyone else can carry on working. The rhythm of the day hasn’t changed at all. We are keeping to the same time slots that we had before going virtual. The interfacility communication has skyrocketed. I now talk to the other facilities around the world way more than I did before. It’s much more fluid and that has surprised me. It’s something we’ll want to keep going.” Sequestered in Los Angeles with seven dogs is Sam Nicholson, CEO and Founder of Stargate Studios. “We’re seeing a huge upswing of interest in virtual production, which is predictable. It happened after 9/11. We built the whole virtual backlot and started to realize it was a lot cheaper to bring the location to the actors

50 • VFXVOICE.COM FALL 2020

than the actors to the location. It could potentially replace 50% of the greenscreen work, if not more. But be careful if you can’t make up your mind in pre-production as to what you’re going to get on set. There is no alpha channel. We’re working on that. The Mandalorian took care of that by floating a greenscreen behind the people so it looks great, but you have to fix it in post.” With support from the New Zealand government, Weta Digital was able to protect its employees and business. “We were able to shift 1,500 crew to ‘Work from Home’ status with minimal impact to our shot production capabilities,” remarks David Conley, Executive Visual Effects Producer. “We also benefited from some good fortune by having a strong slate of shows that were already in-house and from a bit of hustle on our side to secure additional work based on the strength of our response so far.” The next step is shifting the workforce back to the facility. “We’ve already moved our first group of about a dozen artists back in with proper social distancing and it has worked out well. We want to make sure artists feel comfortable returning and we are able to create a new

“Our systems teams were unbelievable and went into military mode. It was one of those instances where, in the horrible circumstances we found ourselves in, something quite positive emerged. Everyone wanted to make it work.” —Fiona Walkinshaw, Global Managing Director of Film, Framestore

environment that supports a mix of crew who are at home and others who are in the office.” “We are reasonably fortunate to have a good internet infrastructure here in Wellington,” remarks Sean Walker, Sequence Visual Effects Supervisor at Weta Digital. “Fiber is available in most places, and the remote working software from Weta isn’t data-heavy. I’ve found that I can even conduct dailies with little preparation [not pre-downloading clips], and it does not negatively impact our reviews. This is something I thought might affect our workflows, but it hasn’t at all. Last-minute dailies additions are definitely still an occurrence. Having most of our communication through a single app [Microsoft Teams] has made keeping in touch

OPPOSITE TOP: Seven weeks were left in the production of Soul when the entire workforce at Pixar Animation Studios had to switch to a remote workflow. (Image courtesy of Pixar Animation Studios) OPPOSITE BOTTOM: Some shots for Wonder Woman 1984 had to be completed remotely. (Image copyright © 2020 Warner Bros. Entertainment Inc.) TOP: Connected was completed remotely during the quarantine. The animated adventures of the Mitchell Family feature the voices of Maya Rudolph, Abbi Jacobson, Michael Rianda and Danny McBride. (Image courtesy of Columbia Pictures and Sony Pictures Animation) BOTTOM: agora.studio recently completed work on The Witness. (Image courtesy of Netflix and agora.studio)

FALL 2020 VFXVOICE.COM • 51



VFX TRENDS

very easy.” Shifting to a remote workflow occurred during the final weeks of delivery for Black Widow. “Amazingly, we only lost about half a day to a full day for the transition, and in a miraculous fashion we had already made up for the lag by the end of the week.” Straddling the worlds of visual effects and animation is Sony Pictures Imageworks, with the remote strategy coordinated by Michael Ford, Vice President, Head of Systems & Software Development in Los Angeles. “On the animated side it is very much business as usual because there isn’t a dependency on shooting. We’re taking it day by day on the visual effects side, waiting to see what the filmmakers and studios are able to do. You can replace things that were going to be photographed with CG and augment it or get it to a point where you’re ready for whatever the film elements are going to be. One of the unique things about us is that

52 • VFXVOICE.COM FALL 2020

pre-COVID-19 we operated out of a single data center. We were using the same process already. All that we had to do was give those users a ‘portal,’ using Teradici to connect into our data center from their houses. The configuration was the only difference, along with making sure that it was secure and matched the requirements of our client.” Pixar Animation Studios does not have a history of working remotely. Entering into this unknown territory were Jim Morris, President; Pete Docter, Chief Creative Officer; Steve May, CTO; and Dana Murray, Producer. “We got things up and running to 75% efficiency fairly quickly,” remarks Morris. “We have five movies at some level of production at the moment plus different shows for Disney+, and we’re able to keep forward momentum on all of them. Those of us who do our work on computers have an advantage

in this tough situation where we can keep working. Some of the challenges were not the ones I was expecting. People have families while others live by themselves. Just coming up with ways to keep connectivity and esprit de corps around the work we’re doing has taken some thought.” “We’re our own client so that makes things simpler, because I could say we’re only going to worry about supporting artists, animators and production staff who are on our highest priority show, which was Soul,” notes May. “A few years ago, we moved the artists and animators to virtual workstations that consist of Teradici connections to our onsite data center. Fortunately, we already had high-capacity incoming network bandwidth. With Soul, we had over 200 people on the crew at the time this happened, and to get them working remotely when they’re at the

OPPOSITE TOP: Post-production on Over the Moon, directed by Glen Keane, was impacted by the coronavirus. (Image courtesy of Columbia Pictures, Sony Pictures Animation and Sony Pictures Imageworks) OPPOSITE BOTTOM: DNEG worked remotely on the highly anticipated adaptation of Dune, directed by Denis Villeneuve. (Image courtesy of DNEG and Warner Bros. Pictures) TOP: The final two episodes of Space Force were worked on remotely by Crafty Apes. From left: Jimmy O. Yang (Dr. Chan Kaifang) and John Malkovich (Dr. Adrian Mallory). (Photo by Aaron Epstein courtesy of Netflix) BOTTOM: The post-production process for Black Widow (Scarlett Johansson) was impacted by the lockdown. (Photo by Film Frame copyright © 2020 Marvel Studios)

FALL 2020 VFXVOICE.COM • 53



VFX TRENDS

TOP: Pixar Animation Studios Producer Dana Murray supervises the final sound mix for Soul while working remotely at Skywalker Ranch. (Image courtesy of Pixar Animation Studios and Dana Murray) MIDDLE: With the help of Teradici, Mike Accettura, a lead compositor for Crafty Apes, was able to work remotely. (Image courtesy of Crafty Apes) BOTTOM: Pixar Animation Studios Chief Creative Officer Pete Docter has been impressed by the resourcefulness of his colleagues. (Image courtesy of Pixar Animation Studios and Pete Docter)


“It went so well that within two weeks we had all of the other productions up and running. We had to change the ways that some people were used to working. Editorial had used Avid on the Mac since the beginning of time, and in order to remote in we had to ask them to switch to Windows.”

Docter has been impressed by the resourcefulness of his colleagues. “People are finding ways to record actors in their closets and to do these big editorial sessions that we normally do in a room with everybody connected separately. Animation dailies are tough. Animators can work well from home, but when it comes time to show other people, we usually sit in a screening room and turn to each other and say, ‘What do you think of this?’ There are a lot of suggestions and solutions. That’s a lot harder and slower this way. If you’re just starting a project, working from home might actually be an advantage because you’re less distracted by people knocking on your door and talking to you. But if you’re in the middle, as we are with two or three films, that’s the most difficult because of the lack of being able to work together.”

Seven weeks were left in the production of Soul. “Sound and post have been delayed, but since our release date has been shifted, it’s all fine,” remarks Murray. “What was cool is that Jon Batiste actually recorded our end credit song in his home.” Fortunately, Soul composers Trent Reznor and Atticus Ross, as well as sound designer Ren Klyce, have home studios as well. “The hardest part is some people have good Wi-Fi and others don’t. Also, people with kids have to figure out teaching and cooking while also still managing the work on their plate, so we’ve been flexible. For the most part, people appreciate having something to focus on.”

Classrooms became a virtual experience for visual effects schools and programs. Such is the case for Lost Boys Studios in Vancouver and Montreal, as well as Sheridan College in Oakville, Ontario. “Every student’s workstation was rebuilt to operate independently of our network,” explains Mark Bénard, Founder and VFX Director/Supervisor at Lost Boys Studios. “Software, licensing and data all had to be localized. Additional storage was installed. Wi-Fi adapters compatible with Linux were sourced and tested. In tandem with our workstation prep, our other instructors began researching the best tools for remote instruction and student support. We are creating more pre-recorded lessons, which can be more efficient than large group lectures that are prone to tangential discussions. One-on-one support has become a larger priority, with sessions taking more time than in person. Since many of our alumni are also working from home, this has provided an opportunity to involve them in guest artist discussions, allowing more interaction between students and professional artists.”

“This is an ongoing process. We turned around effective technical solutions for emergency remote teaching within a couple of weeks of being told we were not allowed on campus,” remarks Noel Hooper, Program Coordinator/Professor of Computer Animation, Visual Effects, Digital Creature Animation at Sheridan College. “We were in the later stages of production in our programs, so it was primarily the review/critique process that we had to move online, which has methodologies already established in the industry.

“As a contingency plan, we are preparing all the curriculum for online delivery in case we’re still remote this fall. The challenge is to now truly convert our programs into effective online learning. I anticipate that if we are back on campus there will be recommended social distancing in place. In that case, fewer students may be allowed in each space, so we’ll have to develop a hybrid system to keep class sizes the same.”

Software developers are implementing measures to ease the transition, whether it be Ziva Dynamics in Vancouver or SideFX in Toronto. “At Ziva, we have always had a significant contingent of remote workers, and we have prided ourselves on our ability to work effectively with a distributed team,” remarks James Jacobs, Co-Founder and CEO at Ziva Dynamics. “Amidst all of this, we are also trying to recognize the unique challenges our users may be facing as they transition out of the studio to home-based work environments. As such, we’ve made our academic licenses free, waived the wait time between free trials, and offered free remote licenses for all of our studio customers. Ziva is proud to say that many of our customer relationships have been strengthened as a result of these measures.”

“As developers of software, we have been less impacted than our customers,” notes Kim Davidson, President and CEO at SideFX. “We’ve worked closely with our customers to help them with remote licensing. Additionally, we’ve focused on delivering more learning and training to all customers including studios, schools and self-learners. The biggest change has been the suspension of in-person industry and Houdini community events. Instead, we have reorganized to run online Houdini events, training and summits, such as the Gamedev Hive 2020 and the Houdini Hive Worldwide. These have been successful as they allow for a broader range of presenters and participants and still allow for extensive two-way interactions. We still like to meet in person with our customers, but connecting more online may be the most significant adjustment that will continue for SideFX post-pandemic.”

An interesting postscript is whether there will be a rise in, and growing acceptance of, virtual companies. “We’re mostly doing character animation and rigging,” remarks David Hubert, Founder and Creative Director at agora.studio, from Montreal. “Data is transferred on Nextcloud servers. We have a programmer who builds custom tools for the pipeline, such as in Maya. We’re also using software like Slack and Zoom that allows us to properly manage all of this. Everyone has their own computer.” Reviews and notes are conducted through SyncSketch. “We push an update to the cloud and it automatically goes to their machine,” states Jacob Gardner, Animation Director at agora.studio, who is based in Chicago. “When people are working locally, they’re not having any issues with fiber optics or Internet connections.” A financial hurdle to the virtual process is that cloud rendering remains expensive. “I don’t think that everyone is going to go back to the studio or everyone is going to work from home,” notes Hubert. “Having a computer at home that you can work on and one at the studio will become the norm.”
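Gardner’s “push an update to the cloud” flow can be pictured as a small directory watcher that mirrors saved files to shared storage. Below is a minimal Python sketch of that pattern – the watchdog library is real, but the Nextcloud WebDAV URL, credentials and folder are hypothetical placeholders, not agora.studio’s actual tooling.

    import pathlib
    import requests
    from watchdog.events import FileSystemEventHandler
    from watchdog.observers import Observer

    WEBDAV_URL = "https://cloud.example.com/remote.php/dav/files/artist"  # hypothetical
    AUTH = ("artist", "app-password")  # hypothetical credentials
    WATCH_DIR = "renders"

    class PushOnSave(FileSystemEventHandler):
        def on_modified(self, event):
            if event.is_directory:
                return
            local = pathlib.Path(event.src_path)
            # Mirror the saved file to the cloud so reviewers' machines pick it up.
            with open(local, "rb") as f:
                requests.put(f"{WEBDAV_URL}/{local.name}", data=f, auth=AUTH)

    observer = Observer()
    observer.schedule(PushOnSave(), WATCH_DIR, recursive=True)
    observer.start()
    observer.join()  # runs until interrupted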

TOP: An example of a remote copy of Ziva VFX being utilized. (Image courtesy of Ziva Dynamics) MIDDLE: One of the projects that Weta Digital Visual Effects Supervisor Sean Walker had to finish remotely was Black Widow. (Image courtesy of Weta Digital and Sean Walker) BOTTOM LEFT: Sony Pictures Imageworks Animation Supervisor Alan Hawkins gets creative in finding an appropriate office space to finish his work on Connected. (Image Courtesy of Sony Pictures Imageworks and Alan Hawkins) BOTTOM RIGHT: Sheridan College student Xiao Yang balances home and academic life. (Image courtesy of Sheridan College and Xiao Yang)





TECH & TOOLS

WORKING FROM HOME: HOW THE COVID-19 PANDEMIC HAS IMPACTED THE VFX INDUSTRY AND WILL CHANGE IT FOREVER

By THE VES TECHNOLOGY COMMITTEE

WHAT HAPPENED

This year our industry faced an unprecedented health crisis that presented challenges to our safety, technology and security. Many of our members’ businesses and employers were forced to take radical action. The VES responded by helping encourage a dialogue between members, studios, vendors and facilities to move us all forward. One of the groups leading the effort was the VES Technology Committee. The committee is composed of professionals from across the industry who represent a diverse set of technical backgrounds, and it works with other industry organizations like the Academy Software Foundation to develop and promote common standards and best practices across the industry. Recent projects led by the committee include the VFX Reference Platform, Cinematic Color Pipeline and Camera Report specification. Acting as a technical conduit to software vendors and helping push initiatives to benefit the VFX industry as a whole are also part of our core mission.

In response to the COVID-19 pandemic, the VES Technology Committee began meeting on a weekly basis to share ideas and give updates on successful attempts to start solving the engineering problems presented by Working From Home (WFH) during the pandemic. The lockdown of non-essential commercial activity came swiftly in the areas where VFX facilities are clustered. Most facilities only had a few days to set up initial WFH infrastructure before the lockdowns became effective. These weekly committee meetings created a platform to develop a best-practices document that was shared across the industry and is open to anyone who wishes to contribute additional ideas. The document (available at: https://tinyurl.com/VES-Covid19) continues to evolve and represents an excellent reference point for developing WFH procedures and strategy. Additional resources for working remotely can be accessed on the VES website at: https://www.visualeffectssociety.com/ves-covid-19-resources/.

WHAT DID WE LEARN?

OPPOSITE TOP: The VES Technology Committee in action.


The majority of VFX work requires on-premises infrastructure for some combination of security, technical and economic reasons. Few companies in our industry had plans in place to meet the challenge of a global pandemic under these constraints. The wholesale conversion of facilities to a WFH scenario was not something that was previously considered, given the physical security requirements necessary to ensure the safety of intellectual property.

As an industry, we needed to think quickly, adapt our methods, and invent new procedures to continue our creative journey. The main solution to tackle the challenge involved extending the desktop workstation experience to the user’s home. This required remote display solutions, so that no actual production files needed to leave the facilities’ networks. This approach transfers the display of secure office workstations directly to home computer screens over an encrypted data stream.

Some facilities were already leveraging these remote access solutions internally in order to gain flexibility in electrical requirements, improve cooling, reduce the physical labor of moving heavy workstations around the facility, and manage the requirements for physically secure locations for end users. Others were using this technology to support a hybrid model where some users worked remotely. For these facilities already using remote display solutions, the most straightforward approach was to extend this infrastructure to allow everyone to work remotely. Thus infrastructure initially deployed to meet on-premises flexibility goals turned out to be a crucial part of a disaster mitigation plan.

Owners and managers of facilities without existing infrastructure had to quickly decide on a solution they could deploy in a very short time. This meant assessing technologies the in-house teams were already familiar with and exploring what expertise and services were offered by industry-specific resellers, consultants and system integrators. The animation and VFX technology community came together on various forums to share as much expertise as possible. This helped avoid duplication of effort at the start of the crisis and quickly spread information focused on remote access technologies. The industry also needed to seek software-based solutions, since supply chains had already been impacted.
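To make the “pixels leave, files stay” idea above concrete, the sketch below uses a plain SSH tunnel as a generic stand-in for commercial remote display protocols such as PCoIP. Hostnames and ports are hypothetical; real deployments use dedicated clients and connection brokers rather than hand-rolled tunnels.

    import subprocess

    # Forward local port 5901 to a VNC server (port 5900) on an office
    # workstation. Only the encrypted display/input stream crosses the home
    # connection; production files never leave the facility network.
    subprocess.run([
        "ssh", "-N",                                # tunnel only, no shell
        "-L", "5901:ws042.facility.internal:5900",  # local:remote display port
        "artist@bastion.facility.example.com",
    ])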

Licensing models and costs of commercial solutions needed to fit within the technical, operating and business plan of each company. In addition, some facilities preferred to leverage capabilities already included in client operating systems or to use free or open-source technologies.

The workflows for VFX and animation also created unique challenges. Many users required higher resolution, frame rate and/or more color accuracy than typical remote office work. Remote access protocols specifically targeting the media and entertainment industries typically have better support for these requirements. VFX facilities often use a mix of Windows, Linux and Mac systems. Users were often required to use their personal equipment at home to access their workstation back at the office (and in some cases, mobile devices running iOS/Android were part of the equation). At the time, few remote access solutions supported all of these platforms on both the server side (the workstation at the office) and the client side (the device used at home). So, in many cases, more than one remote access system had to be deployed to efficiently support the different platforms required for different workflows.

Additional challenges were presented by the latency and bandwidth of home internet connections. Any extra delay directly affects tasks that depend on immediate feedback from a mouse or tablet, such as storyboarding, production design, modeling and texture painting. This remains a challenge today, especially as not all combinations of operating systems and remote display technology are compatible with input accessories like tablets. The Tech Committee worked to voice these issues to software and hardware vendors and has seen responses in the form of fixes, new releases and planned improvements.
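A rough worked example shows why these display streams must be aggressively compressed – raw HD pixels alone would swamp a home connection:

    # Back-of-envelope bitrate for an uncompressed 1920x1080, 24-bit, 24 fps feed.
    width, height, bytes_per_px, fps = 1920, 1080, 3, 24
    bits_per_second = width * height * bytes_per_px * 8 * fps
    print(f"{bits_per_second / 1e9:.2f} Gbps")  # ~1.19 Gbps, far beyond typical home uplinks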






More than anything, though, it has become clear there is not an industry-standard alternative to artists sitting together in a calibrated room, viewing and discussing high-resolution, high-dynamic-range, uncompressed, color-correct content in real-time. Most teams cobbled together workflows by combining commodity video-conferencing software solutions. Reviewers were forced to accept imperfect synchronized playback or approaches that required pre-staging media on every user’s work-provided device, adding delay to the creative approval process. A few cloud and streaming solutions are beginning to emerge, but they are not yet fully vetted from a security point of view. This is an area that will likely see significant evolution over time.

Digital content security is incredibly important to everyone, from client studios to VFX and animation facilities, with strict and precise requirements that are often mandated in order to work on certain projects. Facilities doing this work were compelled to coordinate with the security teams of the content owners to ensure that the remote-access infrastructure met the necessary security requirements and best practices. Previously, this involved a strict separation between machines accessing pre-release content and the open internet. The necessity of remote work required a significant departure from these security postures, which barred any access to content from employees’ homes. A common approach to address this was to leverage VPN technology, which encrypts all traffic between a user and their company’s datacenter. Since allowing production content to be broadly accessed by employees from home wasn’t a typically permissible working condition in most facilities, no standardization existed around these requirements. It was only after several iterations that facilities derived a practical checklist to which employees were asked to adhere.


Helping users walk through the huge variety of home setups, and troubleshooting the problems they encountered, was a challenge for already strained IT departments. Home users, in many cases, were required to upgrade the security of their Wi-Fi by reconfiguring their routers. Additional security requirements included a minimum precaution that all monitors face away from doors and windows in order to minimize exposure to additional family members. There was another potential challenge for those who share their home space with individuals employed at other facilities.

While pre-existing workflows have undoubtedly been impacted by the aforementioned challenges, new opportunities and workflows have evolved to allow productivity to continue at levels close to those from before the disruption. Reports started flowing in about the advantages of checking a render after dinner, managing time more flexibly, and not having to wait for a meeting room when a client review runs long. Collaboration through video conferencing was forced on the industry, but some of the workflow changes are viewed as positive.

LOOKING FORWARD

Having made the leap to allow Working From Home, studios and VFX partners are recognizing this as a time to completely reimagine workflows. There is an expectation that WFH and distributed collaboration will be the new norm for at least a portion of the workforce. We are learning what safety looks like in a post-COVID-19 world. Potential benefits of artists working remotely include rapid scaling of resources based on current needs and easing the need for costly infrastructure build-outs. It’s a great time for experimentation and rethinking the way we work, including an evaluation of cloud solutions that might enable more elastic resources to leverage the global ecosystem.

While visual effects is a highly competitive sector of content creation, the community recognizes we must come together for the health and safety of our talent and clients. New protocols will lead to shorter shooting days and longer shooting schedules. There will be more extensive visual effects and a need for digi-doubles where extras once stood. Producers will need to carefully evaluate what can happen remotely versus what can be achieved in post-production. Previsualization will likely become an even more helpful tool to plan and execute shoots efficiently and safely. Methods of virtual production may offer the potential for a reduced footprint, but will not be able to replace top talent or fully address the needs of collaborative filmmaking. Remote-friendly collaborative tools that simplify or abstract the more technical parts of the creative process and allow artists to focus on vision will be in high demand.

One positive side effect of the crisis for our industry is that content consumption grew. As everyone stayed safe at home, they increasingly wanted new things to watch and play. Streaming service adoption accelerated, which continues to help drive demand for more content creation. New platforms, including AR and VR, are emerging that will require even more services from our industry. Many video games set new sales records as gamers used online services to stay connected with friends and fill more free time.

With a very bright future ahead, how do we apply the learnings from this experience and evolve? How do we hold on to the benefits we gained in the COVID-19 response? As an industry, the WFH transformation has accelerated new approaches to secure remote workflows, but there is more work to be done, and more opportunities to be discovered. It is the opinion of the VES Tech Committee that industry practices would benefit from a more formal exchange of ideas and a broader discussion of lessons learned.

We would like to advocate that the industry choose a method to foster dialogue and create permanent change, perhaps utilizing some form of an ongoing working group to build upon the efforts that have taken place. Many thanks are owed to the very small group of dedicated individuals who helped capture the information, prepare the document and coordinate the efforts of the VES Tech Committee. This greatly helped with the emergency transition that occurred, but going forward we see a place for a broader industry focus on best practices, with more voices participating.

OPPOSITE TOP: Dave Burgess, Animation Supervisor at Animal Logic, working from his home in Vancouver. TOP: Max Sachar, producer of Pixar’s OUT, working from his home in San Francisco. BOTTOM: VES Technology Committee member Darin Grant’s setup at home in Southern California.





TECH & TOOLS

NEW TECH IN VFX: FALL EDITION

By IAN FAILES

Keeping up with the latest in visual effects tools can be daunting. These past few years have seen major innovations in technology solutions for VFX artists and, as VFX Voice found out, there are many more to come. Here’s a roundup of new or recently launched products that you might already be using, or that you may soon find part of your visual effects production workflow.

SOLUTIONS IN THE REAL-TIME SPACE

Facial motion capture company Faceware Technologies released Faceware Studio this year as a real-time facial animation platform. A re-engineering of Faceware Live, Studio incorporates a real-time streaming workflow that allows users to track any face, and includes machine learning techniques to produce better facial tracks. “We currently use neural networks on the ‘hard to track’ parts of the face, like determining jaw position,” says Faceware Vice President, Product and Development Jay Grenier. “We’ve had huge success with it so far, and we’re in the process of evaluating the benefits to the user of tracking the entire face with these new techniques.”
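For readers curious about the general shape of real-time facial tracking, the sketch below uses the open-source MediaPipe face mesh – an illustration of the workflow only, not Faceware’s technology or pipeline.

    import cv2
    import mediapipe as mp

    face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1)
    cap = cv2.VideoCapture(0)  # webcam
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV captures BGR.
        results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_face_landmarks:
            landmarks = results.multi_face_landmarks[0].landmark
            # Stream a normalized landmark (152 = chin) to a rig or engine here.
            print(landmarks[152].x, landmarks[152].y, landmarks[152].z)
    cap.release()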

TOP: A demonstration of the shading capabilities in Unreal Engine 4.25. The tool is now widely used in virtual production, visualization and, of course, game production. (Image courtesy of Epic Games) BOTTOM: The Faceware Studio interface, with a link into Unreal Engine. (Image courtesy of Faceware Technologies)


TOP: Epic Games’ Unreal Engine 5 real-time demo called ‘Lumen in the Land of Nanite,’ running live on PlayStation 5. (Image courtesy of Epic Games)


Game engine maker Epic Games made an early reveal this year when it demonstrated the photoreal capabilities of Unreal Engine 5 (due for release in late 2021), including features such as Nanite virtualized micropolygon geometry and the Lumen global illumination tool. “With these new features,” says Epic Games Business Development Manager Miles Perkins, “there will be less of a concern to design assets with a game engine GPU rendering budget in mind. You won’t have to cut corners on asset fidelity because you’ll be able to work with the same level of detail in-engine as those built for a traditional VFX feature film pipeline.” Meanwhile, Unreal Engine has become one of the mainstays of virtual production workflows, and Epic has been pitching the game engine as more than just a rendering portal, with Perkins noting, in particular, that Unreal Engine’s physics abilities can match the virtual world to what happens in the real one.






“There is so much more to come that we’re excited about,” adds Grenier. “From here, we’ll move on to features for improving the quality of the animation so that users can not only create rapid content, but also refine the data and produce even higher-quality results with the tool.”

USD TAKES THE STAGE

TOP TO BOTTOM: Faceware Studio enables tracking of faces captured with facial capture cameras to produce facial animation on a corresponding CG model. (Image courtesy of Faceware Technologies) Autodesk Maya’s USD Layer Editor. (Image courtesy of Autodesk) Foundry’s HEIST app, which came out of the EIST project with the BBC R&D. (Image courtesy of Foundry) OPPOSITE TOP TO BOTTOM: A frame from NVIDIA’s Omniverse playable ‘Marbles’ game. (Image courtesy of NVIDIA) Houdini’s new Topo Transfer tool. (Image courtesy of SideFX) The final CG plant asset by SO REAL, intended for use in AR/VR, game or VFX projects. (Image courtesy of SO REAL)


Pixar’s Universal Scene Description (USD) framework, which is being adopted in several different tools, is currently receiving significant attention in VFX. Epic Games’ Unreal Engine was one of the first applications to integrate USD, back in 2017 with UE 4.16, and has continued that support through the latest release. “Scenes can be read and modified in Unreal Engine with changes reflected in the USD data immediately,” says Ryan Mayeda, Product Manager for Unreal Engine Virtual Production at Epic Games. “4.25 improves large scene performance and offers complete Python support for pipeline developers. Going forward, Epic is fully committed to USD, and each release in our roadmap will bring new features that focus on building connected asset workflows.”

Meanwhile, Autodesk has been developing a USD for Maya plug-in to provide translation and editing capabilities for USD. “Our goal is to teach Maya’s tools and UI how to talk to native USD data,” outlines Autodesk Senior Software Architect Gordon Bradley. “You can load USD data into Maya, edit it naturally using Maya’s standard tools, and save it out.” The plug-in has been developed as a fully open source project, following on from early individual work done by Pixar, Animal Logic, Luma Pictures and Blue Sky Studios. Adds Bradley, “We’re excited to include this out of the box once it’s ready so artists can just install Maya and start working with USD.”
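The layered, non-destructive workflow Mayeda and Bradley describe is easy to see in Pixar’s own Python API. A minimal sketch follows; file and prim names are illustrative:

    from pxr import Usd, UsdGeom

    # Author an asset layer.
    stage = Usd.Stage.CreateNew("asset.usda")
    ball = UsdGeom.Sphere.Define(stage, "/Asset/Ball")
    ball.GetRadiusAttr().Set(1.0)
    stage.GetRootLayer().Save()

    # A shot layer sublayers the asset and overrides it without editing it.
    shot = Usd.Stage.CreateNew("shot.usda")
    shot.GetRootLayer().subLayerPaths.append("asset.usda")
    over = UsdGeom.Sphere(shot.OverridePrim("/Asset/Ball"))
    over.GetRadiusAttr().Set(2.0)  # the stronger opinion wins at compose time
    shot.GetRootLayer().Save()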

Another key USD adopter is NVIDIA. The company has been pushing ahead with Omniverse, which draws upon USD and NVIDIA’s RTX technology and allows teams to work together interactively in different pieces of creative software. NVIDIA’s Richard Kerris, Industry General Manager for Media and Entertainment, notes that NVIDIA has been collaborating with several major VFX studios on Omniverse and pursuing more virtual production uses, too. “The feedback we’ve been getting is, how can we incorporate Omniverse into virtual sets, and how can we use that to interact with devices or objects from other studios we might be working with, rather than the antiquated import/export model.” Omniverse was unveiled for the architecture, engineering and construction (AEC) industry earlier this year, and also featured in a key ‘Marbles’ demo showcasing a playable game environment with real-time physics and dynamic lighting. Kerris says more will be happening soon with its media and entertainment customers, a group he says is “in our DNA.”

Meanwhile, Foundry has utilized USD as part of research done during the EU-funded Enabling Interactive Story Telling (EIST) project, a collaboration with BBC R&D. Here, an interactive ‘branching’ AR/VR storytelling tool was developed that Foundry determined could be extended to VFX and post-production. The team built an application that allowed for sequencing of USD files on a timeline. “We realized in the project that USD could be used to solve other problems, too,” shares Foundry Head of Research Dan Ring, noting that the EIST project also led to the development of a real-time scene layout and playback review tool. “The big thing we’re looking at with it all,” says Ring, “is how you might capture data during production, particularly virtual production or on set. We want to have a timeline of truth – a single source of truth for a production where you collect all of your data.”






NEW TOOLS AND NEW FEATURES TO TRY OUT

TOP TO BOTTOM: A look at the latest Foundry’s Katana user interface. (Image courtesy of Foundry) Clarisse’s procedural scene layout. (Image courtesy of Isotropix) User interface for Substance Alchemist, where users build material libraries. (Image courtesy of Adobe) Users can mark up and scrub through media using SyncReview to review the content together. (Image courtesy of Foundry)

The latest features in SideFX’s procedural software Houdini center on three areas and further its focus on USD. First, there are a number of developments in Solaris, Houdini’s USD-based context for procedural lookdev, layout and lighting, and in the Karma renderer that integrates into Solaris. The second area is interactive physics, where aspects such as pyro sims and Vellum brushes are being solved essentially in real-time, providing direct interaction for the artist in their scenes. The third area involves character creation within Houdini. Here, motion retargeting, procedural rigging, a Topo Transfer tool and other developments are leading the charge into new character workflows. Part of the push with each of these areas, says Cristin Barghiel, Vice President of Product Development at SideFX, is a response to customers asking the software maker to enable more and more development to occur inside Houdini. “This has been a running string through the history of Houdini – we strive to make it easier for artists to make beautiful work, and make sure our software works well with others to ensure a performant pipeline.”

If you’re looking for new ways to digitally scan assets both inside and out, SO REAL Digital Twins AG has launched a service aimed at doing exactly that, primarily for AR/VR, but also for VFX and other areas of CG. The service works using CT scans. “CT uses X-rays,” explains SO REAL’s Head of Film & AR Raffael Dickreuter. “The X-ray photons penetrate the entire object. We capture the inside and outside at the same time. We work with the raw data produced by the scanner and then can deliver many formats.” “We already knew how precise CT scans can be, down at the level of microns,” adds Dickreuter. “We knew that many physical parameters, CG for example, can be extracted from the volume data. The question was: could that be converted from the volume domain to the polygon domain? So we tried it and it worked.” (A minimal version of this volume-to-polygon step is sketched below.)

For Katana users, the software’s latest incarnation offers up a number of enhancements, including streamlined workflows, shading network improvements, new snapping UX built on top of USD, dockable widgets and network material editing. Jordan Thistlewood, Director of Product – Pre-production, Lookdev & Lighting at Foundry, observes that “the scale of productions is not getting smaller, and it’s been getting harder and harder to manage massive amounts of information. So with tools like Katana what we’re trying to do is look at what are the core performance aspects, the core workflow aspects, the core interoperability – what are the other programs being used and how is it part of the pipeline? Also, what does an artist do? How do they sit there and consume this mass amount of information? That’s led to the latest changes.”
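As promised above, here is a minimal sketch of the volume-to-polygon conversion Dickreuter describes, using scikit-image’s marching cubes on a synthetic density volume – real CT data and the SO REAL pipeline are, of course, far more involved.

    import numpy as np
    from skimage import measure

    # Synthetic stand-in for CT data: a solid sphere as a 64^3 density field.
    z, y, x = np.mgrid[-32:32, -32:32, -32:32]
    volume = (np.sqrt(x**2 + y**2 + z**2) < 20).astype(np.float32)

    # Extract the polygonal iso-surface where density crosses 0.5.
    verts, faces, normals, _ = measure.marching_cubes(volume, level=0.5)
    print(f"{len(verts)} vertices, {len(faces)} triangles")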

For Katana users, the software’s latest incarnation offers up a number of enhancements, including streamlined workflows, shading network improvements, new snapping UX built on top of USD, dockable widgets and network material editing. Jordan Thistlewood, Director of Product – Pre-production, Lookdev & Lighting at Foundry, observes that “the scale of productions is not getting smaller, and it’s been getting harder and harder to manage massive amounts of information. So with tools like Katana what we’re trying to do is look at what are the core performance aspects, the core workflow aspects, the core interoperability – what are the other programs being used and how is it part of the pipeline? Also, what does an artist do? How do they sit there and consume this mass amount of information? That’s led to the latest changes.”

SHOT PRODUCTION FOCUS

Visual effects artists already use Isotropix’s Clarisse iFX as a way of working with complex CG scenes, as well as the newer BUiLDER toolset with its nodal scene assembly and nodal compositing features. Isotropix CEO and Co-founder Sam Assadian has been watching artists take full advantage of the extra power in BUiLDER and says more features are coming soon. “We are always continuing to simplify collaborative workflows, improving rendering performance and extending our support of industry standards such as USD, Autodesk Standard Material and MaterialX. The latter point is very important, since collaboration and asset sharing are becoming very common in the industry,” continues Assadian. “We will also be publicly releasing the Clarisse SDK to boost the third-party ecosystem of tools already gravitating around Clarisse. We’ve also recently published a new API designed to simplify the integration of any third-party renderer into Clarisse.”

VFX artists have also been generating complex texturing detail with Adobe’s Substance products. One of the latest developments has been on the machine learning side with the material authoring tool Substance Alchemist, via an algorithm called Materia. “It takes any photograph of a surface and converts it into a full physically-based material,” details Jérémie Noguer, Adobe’s Principal Product Manager – Entertainment. “To get the output material right, we trained the algorithm on thousands of scanned materials for which we had the ground truth data – photogrammetry – and various renders of that data with different lighting conditions.” Noguer says the application of Materia to VFX workflows would be to generate accurate materials from photos taken on a physical set. “Materia excels at generating architecture-type materials and organic grounds, stone walls and such, so it could be useful in many cases where photogrammetry would be overkill, too pricey, time-consuming or straight-up impossible.”

VFX studios themselves are often responsible for the creation of bespoke tools for use in production. One such tool is Framestore’s Fibre dynamics solver, used on projects like Lady and the Tramp to make hair and fur move realistically when interacting with water, wind, clothes and other characters. “At the core of Fibre,” states Framestore Lead R&D Technical Director Alex Rothwell, “we needed a robust solver algorithm, something that could deal with the wide range of element interactions required. We opted for a Physically Based Dynamics approach, facilitated by the intra-hair constraints implemented using Cosserat-type rods. Our previous workflows had required the creation of low-resolution proxies and other manual setup in order to guide collisions between hair and geometry. Fibre uses an optimized collision engine, effectively eliminating the need for any preprocessing.”
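To make that flavor of solver concrete, here is a toy position-based strand simulation – distance constraints and a pinned root only, so a loose cousin of what Rothwell describes rather than Fibre’s Cosserat-rod model. Every number in it is invented.

```python
# A toy position-based dynamics hair strand: Verlet prediction plus
# iterative distance-constraint projection, with the root pinned.
import numpy as np

def step_strand(pos, prev, rest_len, dt=1 / 24.0, iters=10):
    """Advance one strand; pos/prev are (n, 3) arrays of positions."""
    gravity = np.array([0.0, -9.8, 0.0])
    # Predict new positions from the previous two frames (Verlet).
    pred = pos + (pos - prev) + gravity * dt * dt
    pred[0] = pos[0]                      # root stays pinned to the scalp
    for _ in range(iters):                # project segment-length constraints
        seg = pred[1:] - pred[:-1]
        length = np.linalg.norm(seg, axis=1, keepdims=True)
        corr = 0.5 * (length - rest_len) * seg / np.maximum(length, 1e-9)
        pred[:-1] += corr                 # pull/push both endpoints equally
        pred[1:] -= corr
        pred[0] = pos[0]                  # re-pin after each sweep
    return pred, pos                      # new (pos, prev) pair

n = 12
pos = np.stack([np.zeros(n), -np.arange(n, dtype=float), np.zeros(n)], axis=1)
prev = pos.copy()
rest = np.full((n - 1, 1), 1.0)           # one unit between points
for _ in range(48):                        # two seconds at 24 fps
    pos, prev = step_strand(pos, prev, rest)
print(pos[-1])                             # tip position after settling
```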

TOOLS TO KEEP WORKING

With remote work becoming the norm this year, a common solution among studios has been Amazon Web Services’ (AWS) Studio in the Cloud, a mix of cloud-based virtual workstations, storage and rendering capabilities. “Rather than purchase high-performance hardware, as well as house and maintain physical machines, users can spin up the resources they need, when they need them, and spin them down, paying only for the time used,” offers Will McDonald, Product and Technology Leader at AWS. “In addition to shifting capital expenditure to operational expenditure, creating digital content on the cloud also provides greater resource flexibility.”
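That spin-up/spin-down pattern looks roughly like the boto3 sketch below. The AMI ID, instance type and region are placeholders, and this is the generic EC2 workflow rather than the Studio in the Cloud product itself.

```python
# A minimal sketch of the pay-for-what-you-use pattern McDonald
# describes, using boto3 against EC2. All identifiers are invented.
import boto3

ec2 = boto3.client("ec2", region_name="us-west-2")

# Spin up a virtual workstation for the working session...
resp = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder workstation image
    InstanceType="g4dn.4xlarge",       # a GPU instance class
    MinCount=1,
    MaxCount=1,
)
instance_id = resp["Instances"][0]["InstanceId"]

# ...and spin it down when the artist logs off, so billing stops.
ec2.stop_instances(InstanceIds=[instance_id])
```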

McDonald attests, too, that virtual workstations are a way of maintaining compute power securely, and have “enabled studios to continue to work effectively to the point where several are evaluating how to continue to work in this way to maximize their flexibility to hire the best talent regardless of where they reside.”

Remote collaboration is also the theme of another Foundry tool, SyncReview, for Nuke Studio, Hiero and HieroPlayer. The idea is to run Nuke Studio or Hiero sessions in different locations – say, where a VFX studio has offices in separate countries – and have the full media synchronized in high fidelity between them. Juan Salazar, Senior Creative Product Manager, Timeline Products at Foundry, explains the idea and how it came about. “With SyncReview, you can run a session of Nuke Studio and have everything syncing in both places. It’s something that hasn’t been done before completely like this. SyncReview came from a bigger project we are doing to improve the review process for the VFX pipeline. One of the main issues in review sessions is the lack of visibility over the whole timeline, and also seeing total image and color accuracy.”
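Conceptually, a synchronized review session needs little more than a relay that re-broadcasts timeline events to every connected site. The stdlib sketch below is purely illustrative: Foundry has not published SyncReview’s protocol, so the event shape and port are invented.

```python
# A conceptual event relay: every client sends timeline events
# (scrubs, marks) as JSON lines, and the relay echoes them to all
# other peers so the sites stay in lockstep. Hypothetical protocol.
import asyncio
import json

peers = set()

async def handle(reader, writer):
    peers.add(writer)
    try:
        while line := await reader.readline():   # b"" at EOF ends the loop
            event = json.loads(line)              # e.g. {"type": "scrub", "frame": 1042}
            assert "type" in event                # minimal sanity check
            for peer in peers:
                if peer is not writer:            # echo to everyone else
                    peer.write(line)
                    await peer.drain()
    finally:
        peers.discard(writer)
        writer.close()

async def main():
    server = await asyncio.start_server(handle, "0.0.0.0", 9100)
    async with server:
        await server.serve_forever()

asyncio.run(main())
```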

TOP TO BOTTOM: Framestore’s Fibre tool was used for the hair/fur of the characters in Lady and the Tramp. This image shows the clumping stage. (Image copyright © 2019 Walt Disney Pictures)
Simulation QC render automatically generated by Framestore’s pipeline. (Image copyright © 2019 Walt Disney Pictures)
Final render. (Image copyright © 2019 Walt Disney Pictures)



PROFILE

MATT WORKMAN: SHARING HIS VIRTUAL PRODUCTION JOURNEY AS IT’S HAPPENING By IAN FAILES

All images courtesy of Matt Workman unless otherwise noted.
TOP: Matt Workman in 2010, using the Red One camera – an early leader in digital cinema cameras that could capture up to 4K resolution – during the filming of a music video in Manhattan for Def Jam Recordings.


There’s no doubt that virtual production, and the associated area of real-time, is the hot topic in filmmaking and visual effects right now. Since virtual production is a relatively new area of VFX, information, training and advice on the latest techniques have not always been easy to find for those eager to learn. But one person, cinematographer Matt Workman (http://www.cinematographydb.com), has become something of a go-to source. He regularly shares – often daily – his own experiments in virtual production with followers on social media, also turning to the burgeoning virtual production community for advice and inspiration.

Workman worked as a commercial cinematographer in New York for more than a decade, eventually looking to transition into larger visual effects projects. This also sparked an interest in doing his own previs. “When I was working as a live-action cinematographer, the bigger projects required weeks of pre-production and planning for a one-to-three-day shoot,” Workman remarks. “Many of these projects had storyboards and previs already done, so I wanted to be able to contribute and share my ideas for camera work and lighting. I started by building tools in Maya, and I actually had a product for a short time called ‘Virtual Cinematography Tools’ that some people still use, a mix of rigged models and a simple Python/MEL plug-in.”

That early tool segued into the development of a real-time cinematography simulator called Cine Tracer, after Workman had started a company called Cinematography Database and begun selling a Cinema 4D previs plug-in known as Cine Designer. The plug-in allowed for very accurate technical previs with rigged 3D models of cameras, lights and other film industry equipment. It caught the eye of Unreal Engine maker Epic Games. “One day Epic Games asked me if I would develop something similar for Unreal,” relates Workman. “So I started to learn Unreal Engine 4 and I made a quick prototype of a video game. I posted the video online and it went viral. Since then I’ve kept adding to what is now called Cine Tracer as I learn Unreal Engine, and it’s turned into its own ecosystem that includes set building and digital humans on top of the cameras and lights from my Maya/Cinema 4D work.”

Having worked with Epic since the beginning of Cine Tracer, Workman has maintained that relationship. He was awarded an Unreal Dev Grant and has been part of the Unreal Engine 4 Live Training stream. When Epic Games became involved in helping productions with real-time footage delivery on LED walls (the kind popularized by The Mandalorian), they asked Workman to help demo the tech on a purpose-built stage featuring Lux Machina LED screens. Specifically, he was part of an LED wall shoot as ‘director/DP’ during SIGGRAPH 2019. “I was already going to be presenting twice with Unreal Engine at SIGGRAPH, so I was up for one more project,” says Workman. “I had no idea how massive the LED project would be, and I spent a full month on the stage. My primary role was to be the cinematographer and consult on the camera, lens, lighting, grip, equipment, and the general approach to shooting the project. I also had to consult on crew and other typical cinematographer responsibilities to make a smooth professional shoot.

“The team on the project had already worked together on The Mandalorian, so I was the one catching up,” adds Workman. “But I contributed ideas on how I wanted to control the CG lighting environment at run time. I was also responsible for finding shots that best showed off the interaction of the LED reflections of the wall on the live-action motorcycle and actor they had there.”

Since the LED wall experience, Workman has continued experimenting in real-time and showcasing his results. For example, earlier this year he built his own virtual production setup at home, a fortuitous step for when the COVID-19 crisis arrived. “The main tool I’ve employed here is the HTC Vive Tracker, which allows me to track the position of a real-world camera,” explains Workman. “I bought a Blackmagic URSA Mini Pro 4.6K G2, and using a Blackmagic DeckLink 8K Pro I can bring the live footage into Unreal Engine 4.
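A small piece of glue such a setup needs is turning the tracker’s pose into a camera transform. The sketch below decomposes a 3x4 row-major pose matrix (the form OpenVR reports) with numpy; the sample matrix is made up, and axis conventions differ between tracking systems and engines.

```python
# Convert a tracker's 3x4 pose matrix into a position plus yaw/pitch/
# roll Euler angles suitable for driving a virtual camera. Generic
# math only -- not Workman's actual rig or any engine's API.
import numpy as np

def pose_to_camera(m):
    """m: 3x4 row-major pose -> (position, (yaw, pitch, roll) in degrees)."""
    m = np.asarray(m, dtype=float)
    position = m[:, 3]
    r = m[:, :3]
    # ZYX (yaw-pitch-roll) extraction from the rotation block.
    pitch = np.degrees(np.arcsin(-r[2, 0]))
    yaw = np.degrees(np.arctan2(r[1, 0], r[0, 0]))
    roll = np.degrees(np.arctan2(r[2, 1], r[2, 2]))
    return position, (yaw, pitch, roll)

# An identity rotation with a translation: camera at (0.5, 1.6, -2.0).
identity_pose = [[1, 0, 0, 0.5], [0, 1, 0, 1.6], [0, 0, 1, -2.0]]
print(pose_to_camera(identity_pose))  # position [0.5 1.6 -2.], angles ~0
```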

TOP: Workman during a demo on Epic Games’ LED wall setup. (Image courtesy of Epic Games)
BOTTOM: A ‘Cyber Punk Car Scene’ made with Unreal Engine, in which Workman used the tool’s Sequence Recorder to act out the scene and actually drive the car.





TOP LEFT: Workman, working as a cinematographer, shot commercials for brands such as BMW, Kodak and General Mills in New York for several years.
TOP RIGHT: Some of the camera gear and rig Workman has for shooting in his virtual stage.
MIDDLE: Workman films, from home, with a virtual camera.
BOTTOM: Early crane-operating experience on a commercial shoot.


“I use UE4.25 Composure to live-composite footage into a 3D Unreal Engine environment, and I can do basic camera movement using the Vive Tracker. This same ‘indie’ approach can be used on an LED wall using nDisplay, and there are several smaller studios using this workflow today.”

Workman says that even before the coronavirus pandemic he had been experimenting with how to produce content remotely with virtual production techniques. To prove it could be done, he teamed up with Eric Jacobus from SuperAlloy Interactive to produce a short film. “Eric is a professional stunt/action actor and has a mocap studio and team. He wrote the script and we did a remote mocap session with an Xsens suit over Zoom. I could see multiple video witness cameras as well as the live Xsens preview of the scene. The mocap from the day was then cleaned up by [mocap artist] Mike Foster and sent to me.”

Workman has also been writing a new ‘Virtual Production Tools’ framework for Unreal Engine 4, and was able to streamline the live filming of the mocap data. “We actually streamed four hours of the shoot to YouTube. The amazing part was that it felt very similar to a director and DP working on a live-action set. I recently saw the rough cut of the film and it’s turning out great,” says Workman. “Our next step is to further refine the mocap based on the edit and do a final art and lighting pass, then film the project again to get a final result. This is an experimental approach, but the results are quite promising.”

From Workman’s perspective, virtual production is clearly going to be part of the future of production, especially for the benefits of collaboration. “Having everyone actually seeing the virtual environment is a breath of fresh air,” he observes. “It brings all the creative decision-making back into one space and one moment, whereas production is typically distributed around the world and over weeks or months in a typical greenscreen virtual environment pipeline.”

There are, however, some cautions that remain. “LED walls have their limits,” Workman states.

“They won’t work for every shoot, but I think directors and writers are interested in using them, and they will find ways of writing and shooting that make great use of them in production.”

Workman hopes there will also be continued developments in matching real-time tools to real-world tools. “Right now, matching real-world cameras and lenses in Unreal Engine is quite difficult. It’s been solved by companies like Mo-Sys and Ncam, but it requires very expensive hardware and software. The hope is that camera and lens manufacturers can make lens distortion and other important variables internally mapped and then made accessible at runtime, gen-locked to the camera’s shutter. Once that process is made simpler and cheaper, doing mixed reality on greenscreen and LED walls will be much more accessible.”

The cinematographer observes, too, that there is a strong crossover between virtual production and what VFX practitioners already do on live-action sets. “Completely CG/engine virtual production is very similar to a traditional VFX or 3D animation pipeline, and is where most VFX artists will probably enter real-time production. Once they are comfortable in a real-time engine, they can move on to being an on-set real-time tech artist or artist and help run the virtual environment that is on the LED walls or live-composited on a greenscreen shoot.”

To many newcomers to real-time and virtual production – and to many already in the industry – Workman has been a significant source of energy. And while he certainly acknowledges being in the business of selling products, via his company Cinematography Database and its main product Cine Tracer, Workman says the community aspect of virtual production right now is very important. “As a developer and artist, I thrive on sharing my work daily. This provides me the feedback that I need to continue to learn and grow. I livestreamed the first year of Cine Tracer development every day on Twitch. I didn’t make much money from the stream, but I met a community of Unreal Engine developers that literally taught me Unreal Engine as I was streaming it.

“My current move into Unreal Engine-based virtual production is in a similar state to Cine Tracer when I started out,” Workman adds. “So I broadcast my whole process, failures and successes. I’ve built a community around this growing field and it accelerates my learning, and it also allows me to work with companies who are interested in virtual production to integrate their existing products or work together to create new ones.”
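On the lens-matching difficulty Workman raises above, the usual answer is a distortion model. Below is a sketch of Brown–Conrady radial distortion, a common way to make a CG camera bend straight lines the way a real lens does; the coefficients here are invented, where real ones would come from a lens calibration solve.

```python
# Brown-Conrady radial distortion (radial terms only) applied to
# normalized image coordinates -- a generic model, not any specific
# vendor's lens pipeline. k1/k2 below are made-up example values.
def distort(x, y, k1=-0.12, k2=0.03):
    """Map an undistorted sample point to its distorted position."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# A CG render sample near the frame corner lands where the real lens
# would put it, so plate and render line up.
print(distort(0.8, 0.45))
```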

TOP: Workman has been acquiring a wealth of virtual production gear in the past few years.
MIDDLE: A game-engine-rendered scene is captured.
BOTTOM: The resulting frame.



TV/STREAMING

WATCHMEN’S LEGACY: NEXT-LEVEL CINEMATIC TV VFX By TREVOR HOGG

TOP: YouTube footage of pig races was an important reference for Mackevision when putting together the court scene. (Image courtesy of Mackevision and HBO)
OPPOSITE TOP: It was important for Framestore to make the experience of the energy being sucked out of Doctor Manhattan as visceral and painful as possible. (Image courtesy of Framestore and HBO)
OPPOSITE BOTTOM: Creator Damon Lindelof on set with Regina King while shooting the HBO limited series adaptation of Watchmen. (Image courtesy of HBO)


As with the original 1986-1987 comic book maxi-series conceived by Alan Moore and Dave Gibbons, the HBO production of Watchmen was treated as a limited series by creator Damon Lindelof. The alternative-history drama begins with the Tulsa Race Massacre of 1921 and then focuses on a white supremacist group known as the Seventh Kavalry violently opposing the Tulsa Police Department in the present. The story’s ambitions are epic and at times surreal, whether it be frozen squids raining down from the sky, clones being catapulted, a mirror-reflecting mask or a super-powerful blue being.

The workload was shared among 30 vendors under the direction of Visual Effects Supervisor Erik Henry and Visual Effects Producer Matt Robken, who previously collaborated on Tom Clancy’s Jack Ryan. Among those recruited by Henry and Robken were Hybride, Rodeo FX, Raynault VFX, Mackevision, BUF, The Mill, Important Looking Pirates, MARZ, One Of Us and Framestore, with artists situated in Paris, Stuttgart, London, Montreal, Toronto and Stockholm. A tight production schedule led to the decision to have a roster of visual effects vendors that would rival a Marvel Studios movie. “It was going to be tough to deliver the last three episodes, so we made sure that no one had too many shots at one time to avoid having a logjam at the end,” states Henry. “The scripts came close to the time we had to shoot, which meant not a lot of prep time. Way before the scripts came to the crew, we would be given a verbal description of what the big scenes were, and in that way were able to do previs in some instances and certainly build models in advance. All that prep work helped us get to the finish line.”

An important element of the visual effects process was having a quick editorial turnover. “We talked to Damon, the directors and editors, and requested that they lock the visual effects sequences early,” remarks Robken. “You can throw as much acceleration money at it as you can, but there is only so much time in a day. Everyone was great about it.”

Critical to the project were the designs of Doctor Manhattan (Yahya Abdul-Mateen II), the Millennium Clock and the reflective mask worn by Detective Wade Tillman/Looking Glass (Tim Blake Nelson). “I discussed with Damon whether Doctor Manhattan should glow all of the time and decided it should only happen when he was using his powers,” states Henry. “We also wanted to make sure that the performance of the actor came through. The Millennium Clock took a lot of time and had to be like a Trojan horse. We had to have it look like it could tell time in some strange atomic-clock way but [in reality] be a machine that could trap Doctor Manhattan. As for the silver shiny mask that Looking Glass wore, I came up with the idea that cameras would be needed on the head of the actor because there was no real way to get production to shoot everything twice.”

Approximately 2,600 visual effects shots were produced, with per-episode totals across the nine episodes ranging from 150 to 500. “HBO came in with a great attitude and knew what it was going to take to put these scripts onscreen,” states Robken. “Our budget was always hefty and appropriate. There was some contingency money that we used.”




“We had a big visual effects team [with a staff membership of 16] on Watchmen, which helped us get through so many vendors, and developed a good pipeline and workflow,” Robken continues. “We had an on-set supervisor, data wrangler and a visual effects editorial team for odds and evens [blocks of episodes] until later in the season, when we ramped down to one.”

Assets and shots needed to be shared. “There were legacy effects like the squid, where it was a one-off scene that we gave to Hybride, and later in the season they showed up again but were mixed into sets or situations that other vendors had already done. There were quite a number of shared shots, and that was an added challenge later in the season.”

The effects work varied from episode to episode, which meant that Hybride did not have recurring assets. “The most technically advanced sequence that we worked on was the squid rain,” states Mathieu Boucher, Executive Producer and Director of Production.


“Small squids are falling from the sky and start to melt upon getting into the atmosphere,” Boucher explains. “To have a creature that is super soft and rubbery hit a surface and after a few seconds start to melt is a worst-case scenario in CG, because you have soft-body deformations and fluid dynamics. There are millions of squids interacting with an environment. We created libraries of splashes and melting squids, pre-cached those and reassembled them in a final simulation. One of the most useful references was a hailstorm that we found, for how millions of falling objects bounce and create multiple layers. You see a wavy pattern in the background, which helps with scale. The fluid created by frogs reproducing was a point of reference for close-up shots. You have to be creative and find stuff that connects with what you’re trying to achieve and grounds it in a realistic setting.”
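The pre-cached-library trick Boucher describes can be sketched as simple instancing bookkeeping: every impact point gets one of a handful of baked splash/melt caches plus a random time offset, so nothing reads as repeated. All names and counts below are illustrative, not Hybride’s setup.

```python
# Millions of squid impacts, each referencing one of a few dozen
# pre-simulated cache clips with a random start offset, instead of
# simulating every squid individually.
import numpy as np

rng = np.random.default_rng(7)
n_impacts, n_cached_clips, clip_frames = 1_000_000, 40, 96

impacts = np.empty(
    n_impacts, dtype=[("pos", "3f4"), ("clip", "i4"), ("offset", "i4")]
)
impacts["pos"] = rng.uniform(-500, 500, size=(n_impacts, 3)).astype("f4")
impacts["clip"] = rng.integers(0, n_cached_clips, n_impacts)
impacts["offset"] = rng.integers(0, clip_frames, n_impacts)

def frame_of(instance, shot_frame):
    """Which frame of its cached clip an instance shows at a shot frame."""
    return (shot_frame + instance["offset"]) % clip_frames

print(frame_of(impacts[0], shot_frame=1001))
```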

“The biggest difficulty with visual effects that aren’t ordinary is to make them photoreal, because sometimes the mind doesn’t even know what that looks like.”
—Matt Robken, Visual Effects Producer

Episode 6, which was done in black and white, features a major contribution from Rodeo FX. “The biggest shot we had was when Hooded Justice (Jovan Adepo) is fighting inside a grocery store. There’s lettuce and cereal exploding, he jumps through a window, time completely freezes, and we do a full 180 degrees around him,” remarks Thomas Montminy Brodeur, Visual Effects Supervisor. “It was one continuous shot that had four different cameras, three witness cameras, a digital double, and the whole environment outside had to be recreated so it matched with how the camera turns around. That took six months to create. We decided to film without a cape, cereal and lettuce to control the timing. Parallax was needed for the lettuce, cereal and glass while the camera is traveling to avoid them looking like 2D elements. A LiDAR scan was taken of the grocery store.”

The handheld camera complicated the stitching of the plates. “One plate will go fast and the other slow, so you need to create a stitch that will be an average between those two to maintain the rhythm of the episode. The best thing for us was when a transition could occur when the actor was out of frame.”
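That “average between those two” retime can be pictured as easing the playback speed from one plate’s rate to the other across the stitch. A toy version, with invented speeds and frame counts:

```python
# Blend two plate speeds across a stitch so the shot keeps one rhythm.
# The eased speed curve tells you which source frame to sample for
# each output frame. All numbers are illustrative.
import numpy as np

speed_a, speed_b, n = 1.3, 0.8, 25        # fast plate, slow plate, stitch length
t = np.linspace(0.0, 1.0, n)
w = 3 * t**2 - 2 * t**3                   # smoothstep blend weight
speed = (1 - w) * speed_a + w * speed_b   # eased playback speed
frames = np.cumsum(speed)                 # source frame per output frame
print(frames.round(1))
```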

OPPOSITE TOP AND BOTTOM LEFT: Markers guide the eyeline for Sister Night (Regina King) for the computer graphics later produced by BUF. (Images courtesy of BUF and HBO)
OPPOSITE TOP AND BOTTOM RIGHT: BUF designed the holograms to track museum patrons as they walk by them. (Images courtesy of BUF and HBO)
TOP AND BOTTOM LEFT: BUF transforms a greenscreen stage into a nighttime environment. (Images courtesy of BUF and HBO)
TOP AND BOTTOM RIGHT: The holographic tree created by BUF combined realistic and graphic elements. (Images courtesy of BUF and HBO)




Raynault produced a 1,498-frame shot that goes from a massacre at a carnival in Hoboken, New Jersey to a giant squid being teleported into Manhattan by Adrian Veidt. “I went to New York City with a drone pilot to gather reference of the Hudson River,” states Mathieu Raynault, Founder. “In the end, the only plate we used was the one at the beginning. A small town in Georgia was used for Hoboken. We took some liberty to change the sky and bring in some light so it wasn’t pitch black. The river was the natural dividing line. Hoboken required set extensions and creating corpses on the ground and cars. There are still burgers cooking on the barbecue. Before you get to the Ferris wheel, the shot is entirely CG. Then you see the piers and start crossing the river. The audience doesn’t know where they are at that point.”

Several blocks in New York City were photographed to provide reference material for the modelers. “The squid was a sculpture because it’s not moving. We had to find the right shape so it feels dead on the building with those tentacles. A lot of liberties were taken, with one tentacle being a kilometer long.”


Watchmen was the first superhero project for Mackevision, which joined after principal photography was completed. “The pigs were our toughest sequence, because bringing full CG creatures to life is already a challenge in itself,” states Emanuel Fuchs, Visual Effects Supervisor. “We found a perfect reference on YouTube where a German farmer has pig races and, surprisingly, his pigs were the same sizes as the ones we were looking for – he opened the wood board and the pigs started to run. That was all that we needed for our shot when the door of the courtroom was opened and the pigs started to run. We stabilized the footage because it was handheld and all over the place, tracked it into our shots, gave that as a reference to the animators and showed it to Erik. We talked about, do they start immediately with the run cycle or canter first, go faster and become slower again? We had a project with pigs before, so we had our muscle system set up. We did some tweaks because before we only had one character, and with this we were up to 30 characters.”

BUF was responsible for designing, creating and executing the holograms in the museum situated in the Greenwood Center for Cultural Heritage, which is visited by Angela Abar/Sister Night (Regina King). “We had major constraints because the plates were shot without any actors on the podiums,” states Olivier Cauwet, Visual Effects Supervisor. “They asked us to add some holograms many weeks after that, so there was no possibility to reshoot some actors at the same angles. One day was devoted to shooting holograms with 12 cameras on a greenscreen set. For some shots we didn’t know the perspective, focus and lens. It’s a 3D model with projection from 12 cameras.”

OPPOSITE TOP AND BOTTOM LEFT: The hologram shoot conducted by BUF occurred after principal photography had been completed. (Images courtesy of BUF and HBO)
OPPOSITE TOP AND BOTTOM RIGHT: BUF created a flashlight beam that flickers like a projector for a black and white sequence. (Images courtesy of BUF and HBO)
TOP AND BOTTOM LEFT: The hardest effect for Rodeo FX was constructing a seamless, continuous shot that concludes with Hooded Justice (Jovan Adepo) leaping through a storefront window. (Images courtesy of Rodeo FX and HBO)
TOP AND BOTTOM RIGHT: BUF treated the imagery as if it was shot on old and degraded black and white film. (Images courtesy of BUF and HBO)
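For the 12-camera projection Cauwet describes above, the core per-particle decision is which greenscreen camera’s image to sample. The schematic below uses a made-up camera ring and a “closest to the optical axis” heuristic as a stand-in for BUF’s actual reprojection logic.

```python
# Pick, for a hologram particle, the camera whose optical axis the
# particle sits nearest to -- a simplified placeholder for a full
# multi-camera projection-mapping setup.
import numpy as np

def best_camera(point, cam_positions, cam_forwards):
    """Index of the camera with the point closest to its optical axis."""
    to_point = point - cam_positions
    to_point /= np.linalg.norm(to_point, axis=1, keepdims=True)
    alignment = np.einsum("ij,ij->i", to_point, cam_forwards)
    return int(np.argmax(alignment))

# Twelve cameras on a ring around the origin, all aimed inward.
angles = np.linspace(0, 2 * np.pi, 12, endpoint=False)
cams = np.stack([3 * np.cos(angles), np.zeros(12), 3 * np.sin(angles)], axis=1)
forwards = -cams / np.linalg.norm(cams, axis=1, keepdims=True)

print(best_camera(np.array([1.0, 0.5, 0.2]), cams, forwards))
```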



TV/STREAMING



TOP TO BOTTOM: Important Looking Pirates tracked the faces in the plate elements, lined them up with the CG heads of Mr. Phillips (Tom Mison) and Ms. Crookshanks (Sara Vickers), and relit the faces to make them a closer match to the main plate. (Images courtesy of ILP and HBO)


The holograms have the ability to interact with patrons. “When a person enters the museum and comes close to a hologram, the idea was that it would light up,” remarks Justine Paynat-Sautivet, Visual Effects Producer. “If a person is turning around at a podium, the hologram would turn and talk to them.”

The Mill looked after a variety of different shots, such as Doctor Manhattan terraforming Europa. “It was an ambitious sequence, as we had to tell the story of Doctor Manhattan building this biodome on the surface of Europa in about 60 seconds,” notes Tom Luff, Visual Effects Supervisor. “There was a lot of detail that we wanted to get in there to show the scale of what he was doing, from the large to the small. We wanted to show these iceberg forms cracking and shearing off and gas being released from inside of the planet, as well as cutting to shots of seeds unfurling and plants starting to come out of the ground. There was a storyboard animatic of the kinds of things that they wanted in the sequence, but it was left to us to figure out how to navigate telling the story. There was a clear brief from Damon and Erik that they didn’t want us to animate or have anything that moved and looked like time-lapse photography. These events needed to appear sped up, not captured over a long duration. We had to remove the jittery animation where the camera has moved slightly into position each time a photo has been taken, and keep a constant lighting condition rather than having a sun moving over the horizon.”

For Important Looking Pirates, the face replacements for the cloned servants of Adrian Veidt (Jeremy Irons) on Europa were the trickiest assets to execute. “Using plate elements or combining them with our CG heads was the most effective,” states Bobo Skipper, Visual Effects Supervisor. “We used a variety of techniques on a shot-by-shot basis in order to integrate the elements with the plate. We of course did a lot of projecting, intricate grading and masking in comp. In some cases, we tracked the faces in the plate elements [that were shot on greenscreen] and lined them up with our CG head so we could combine the plate with the CG and essentially re-light the faces to make them a closer match to the main plate. For wider shots our CG assets worked quite well; for close-ups we relied more on plates; and in some cases a mix of the two was the way to go.” ILP shared 57 of its 138 shots with other vendors. “Given our successful work with the face replacements, we agreed with the client that we would focus our efforts on that job for several shots,” remarks Kajsa Kurtén, Visual Effects Production Manager, “while the rest of the shot work and final comp was handled by another vendor.”

A complicated effect handled by MARZ was the reflective face mask worn by Detective Wade Tillman/Looking Glass, which appears in over 300 shots. “Erik’s main concern was how we went about capturing all of the reflection data,” remarks Nathaniel Larouche, Visual Effects Supervisor. “There was a 360 Rylo camera on the front and back of Tim’s head, and both captured 6K spherical video. The camera had an internal accelerometer and gyroscope and recorded the rotation information into the MP4 file.”
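To see why that recorded rotation matters: with the head’s orientation known per frame, the spherical capture can be treated as a world-stable environment map, so reflections on the mask stay locked to the set as the actor’s head turns. A conceptual sketch with hypothetical names, assuming an equirectangular (latlong) frame layout – not MARZ’s implementation:

```python
import numpy as np

def latlong_to_dir(u, v):
    """Equirectangular (u, v) in [0,1] -> unit direction vector."""
    theta = (u - 0.5) * 2.0 * np.pi            # longitude
    phi = (0.5 - v) * np.pi                    # latitude
    return np.array([np.cos(phi) * np.sin(theta),
                     np.sin(phi),
                     np.cos(phi) * np.cos(theta)])

def dir_to_latlong(d):
    """Unit direction vector -> equirectangular (u, v)."""
    u = 0.5 + np.arctan2(d[0], d[2]) / (2.0 * np.pi)
    v = 0.5 - np.arcsin(np.clip(d[1], -1.0, 1.0)) / np.pi
    return u, v

def sample_world_stable(env, R_head, u, v):
    """Sample the head-mounted spherical frame `env` (HxWx3) as if it were
    world-stable, by undoing the per-frame head rotation R_head (3x3) that
    the camera's gyro metadata provides."""
    d_world = latlong_to_dir(u, v)
    d_cam = R_head.T @ d_world                 # world direction -> camera frame
    uu, vv = dir_to_latlong(d_cam)
    h, w = env.shape[:2]
    return env[int(vv * (h - 1)), int(uu * (w - 1))]
```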

“We did a scan of his face clean, with a few different FACS [poses], and wearing the mask, as it’s interesting how it presses his nose down,” Larouche adds. “We came up with a process of being able to bring back and mimic some of the folds in the footage.”

Doctor Manhattan gets destroyed by Framestore in the final episode. “Erik described it as, ‘There are these evil people who switch on a beam that hits Doctor Manhattan like a hammer and sucks out his energy. This is visceral and really painful,’” states Ahmed Gharraph, Visual Effects Supervisor. “He didn’t want his physical body to disintegrate. It was just pure energy. We spent a week or two doing a bunch of concept art.” A still frame was taken from the plate photography and painted over. “There was a cage built and a spotlight shining down on the actor. We had to give the beam of light some volume but didn’t want to overwhelm the image, as it was important to see Doctor Manhattan acting alongside his co-star. It’s a heartfelt moment between those two characters. There was only one shot where we used a digital double. We tried different types of simulations and ended up with a nice mixture of particles, liquids and volumetrics. It was a Houdini workflow. Once it goes into Nuke you have infinite control over the passes and compositing the layers on top of each other. There was a lot of back and forth in terms of adjusting viscosity to make sure it looks like the stuff is actually peeling off of his skin.”

Of the over 100 shots created by One Of Us, the scene that caused them the most sleepless nights was the de-materialization of Doctor Manhattan by the portal gun. “There were conceptual constraints,” notes Dominic Parker, Co-director at One Of Us. “The effect had to be a variant on the teleportation we had seen earlier in the series, but where the previous one had been a planar surface, this had to be shot out of a gun. And it had to fit a specific performance. We took this shot to the wire. If the wire had been a couple of weeks later, then we’d have probably taken it to the wire.” A 3D scan of Doctor Manhattan was provided to One Of Us. “From a design perspective, the unusual thing about Watchmen was that it was not always looking for a cool, slick aesthetic. The whole look of the production was a little off-kilter and difficult to define.”

The effort of the entire production team garnered critical and public acclaim, with Watchmen being lauded with a Peabody Award. “The biggest difficulty with visual effects that aren’t ordinary is to make them photoreal, because sometimes the mind doesn’t even know what that looks like,” observes Robken. “It’s daunting in that way, but at the same time the variety of challenges thrown at us kept it fun and interesting, and let us explore.” Henry laughs, “I will confess that when I first read that squids would be falling from the sky, I had a WTF moment! The reflective mask and the whole finale with the powers of Doctor Manhattan being taken away were gratifying. You see Doctor Manhattan force away what is happening to him. Sister Night asks, ‘Where are you?’ He says, ‘I’m in every moment.’ The way he says it and the effects around him are moving in slow motion – that’s the power of the show. It can be spectacle on an amazing level, and yet every time I think about that scene, I get teary-eyed. I’m glad to be part of those moments.”

TOP THREE: Plate, grayscale render and final composite of the Nite Owl crash by Important Looking Pirates. (Images courtesy of ILP and HBO) BOTTOM TWO: The most technically advanced sequence that Hybride worked on was the squid rain. (Images courtesy of Hybride and HBO)



VR/AR/MR TRENDS

EXPLORING THE VIRTUAL COMMUNITIES OF SOCIAL VR By CHRIS McGOWAN

TOP: Hanging out in VR in VRChat. (Image courtesy of VRChat) BOTTOM LEFT: Darshan Shankar, Founder and CEO, Bigscreen VR. BOTTOM RIGHT: Bob Buchi, President of Worldwide Home Entertainment, Paramount Pictures.


When many people think of virtual reality, they tend to imagine immersive narratives, interactive experiences and games designed for that medium. Others ponder VR’s potential for education, training and therapy. Another of VR’s most compelling promises is Social VR, in which participants hang out, chat, interact and attend events together in virtual reality. They can even watch movies and concerts. Many are already immersing themselves in 360-degree virtual spaces with friends and acquaintances, and many more will do so in the near future as VR quality improves and headsets become less cumbersome.

Bigscreen VR and AltspaceVR are two platforms available now that offer a variety of social VR possibilities for entertainment and productivity. In Bigscreen VR, one can hang out, play and work, or go to the movies. Bigscreen’s main focus at the moment is to provide a different type of entertainment experience: the virtual cinema. Once logged into Bigscreen, users can sit in a dynamically-lit movie theater environment in virtual reality and watch movies together. Bigscreen VR Founder and CEO Darshan Shankar explains, “Bigscreen provides the immersion of a movie theater, since we also render a virtual world around the movie. This lets people feel like they are transported into a movie theater to watch a film on a massive projection screen, when in reality they’re sitting on their couch wearing a headset. Additionally, Bigscreen provides a social layer, enabling people to watch films together with friends and other movie fans around the world. Voice chat and avatars enable fun conversations, just like the social experience of watching a movie in a theater or living room.”

Bigscreen offers studio films that cost $4 to $5 per admission. It has studio partnerships with Paramount Pictures and Funimation (Sony Pictures’ anime division) and offers hundreds of titles. It has licensed films from Paramount – including the Star Trek, Terminator and Transformers franchises, Interstellar, Top Gun, Ghost in the Shell, World War Z and Ferris Bueller’s Day Off – with the pact covering multiple countries. “We are always looking for innovative ways to engage and entertain fans, and Bigscreen’s virtual reality platform offers viewers a new way to experience films in their homes,” comments Bob Buchi, President of Worldwide Home Entertainment for Paramount Pictures. He says the studio planned to make another 100+ movies available over the course of this year, on top of an initial 35+ titles.

In addition to regular 2D screenings, Bigscreen presents 3D editions of select movies, which benefit from being shown in virtual reality. “When a person wears a VR headset and watches a 3D movie in Bigscreen, our software renders a crisp, high-quality picture in each eye.” Those pictures are “perfectly aligned together, creating a 3D experience without common issues of 3D glasses such as cross-talk and ghosting. It’s subtle, doesn’t cause headaches and adds to the immersion of a film,” says Shankar. Bigscreen may also become the best home option for viewing movies in 3D, because Netflix, Amazon Prime and most major streamers only offer the 2D versions of feature films.
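The substance of that claim is rendering a separate, correctly offset image per eye. A toy sketch of the underlying camera math – generic stereo geometry, not Bigscreen’s code, with a 64mm interpupillary distance assumed as a typical value:

```python
import numpy as np

IPD = 0.064  # interpupillary distance in meters -- an assumed typical value

def look_at(eye, target, up=np.array([0.0, 1.0, 0.0])):
    """Right-handed view matrix looking from `eye` toward `target`."""
    f = target - eye; f = f / np.linalg.norm(f)
    r = np.cross(f, up); r = r / np.linalg.norm(r)
    u = np.cross(r, f)
    M = np.eye(4)
    M[0, :3], M[1, :3], M[2, :3] = r, u, -f
    M[:3, 3] = -M[:3, :3] @ eye                # translate world into view space
    return M

def eye_views(head_pos, screen_center):
    """Render the same virtual theater twice, with the camera offset half the
    IPD per eye; the two aligned images are what the headset fuses into 3D."""
    f = screen_center - head_pos; f = f / np.linalg.norm(f)
    right = np.cross(f, np.array([0.0, 1.0, 0.0]))
    right = right / np.linalg.norm(right)
    return (look_at(head_pos - right * IPD / 2, screen_center),
            look_at(head_pos + right * IPD / 2, screen_center))
```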

“User feedback to the 3D screenings has been particularly strong, and we feel that Bigscreen offers an exceptional at-home 3D movie-watching experience,” Buchi comments. Bigscreen is now working on additional “VR Cinema” content, including a drive-in movie theater environment, and has held themed movie events like horror-movie weeks. There are also special events: in June, Paramount Pictures, director Ava DuVernay and Bigscreen VR partnered to bring Selma to viewers on the Bigscreen VR platform, with a live screening and the movie available for free that month. Bigscreen also has more than 50 free, ad-supported TV channels to watch inside the app, including CNN, Bloomberg, NBC and Comedy Central. It plans to add episodic TV, live sports and more studio deals to the platform.

TOP: Play Dungeons and Dragons in virtual reality in AltspaceVR. (Image courtesy of AltspaceVR) BOTTOM: Interstellar is shown in the virtual cinema by Bigscreen VR. (Image courtesy of Paramount Pictures and Bigscreen VR)


TOP: The Balloon Island world in AltspaceVR. (Image courtesy of AltspaceVR) MIDDLE: Explore the Mythical Library world in AltspaceVR. (Image courtesy of AltspaceVR) BOTTOM: Pokémon Detective Pikachu makes an appearance in Bigscreen TV’s virtual cinema. (Image courtesy of Paramount Pictures and Bigscreen VR)


Bigscreen was founded in San Francisco by Darshan Shankar, and the beta launch occurred in March 2016. “I started Bigscreen to explore the possibilities of spatial computing and how VR/AR headsets could change the way we work, play and hang out, as I believe VR headsets could one day replace the physical screens in our lives and change how we interact with computers in our daily lives.” The pandemic accelerated interest in the Bigscreen platform, with “an influx of many filmmakers, directors, producers and movie studios reaching out to work with the company.” Shankar says Bigscreen broke its revenue and usage records “multiple times over as people flocked to buy VR headsets and jump into Bigscreen as a fun way to hang out.” Bigscreen VR is compatible with major VR headsets, with PlayStation VR support coming in early 2021. Bigscreen had over 1.5 million users at the start of the year, according to Shankar.

AltspaceVR is another major social VR presence and presents itself as a “new communications platform.” It hosts diverse activities, including live comedy, music events, tabletop games and both public and private events. AltspaceVR is based in Redwood City, California and launched in May 2015. The company uses a “freemium” business model, in which the basic software is given away for free and the business then charges consumers for more advanced features. It had financial problems and briefly closed in 2017, but was then purchased by Microsoft and given a new life.

“AltspaceVR is the premier place to attend meet-ups, community gatherings, and live events like talk shows, open mic nights, interest-based clubs, and more. We have a busy events calendar that features a wide variety of content from people all over the world,” comments Karolina Manko, AltspaceVR Program Manager and Marketing Strategist. Users can create their own public virtual events or opt to play Dungeons and Dragons (the first major tabletop game in VR, according to the firm) and other games, surf the web or watch videos. Continues Manko, “AltspaceVR gives people a place to connect synchronously so they can enjoy socializing in a way that’s more personal and familiar than traditional social media. In AltspaceVR you can have real-time conversations, high five or hug friends, stand around in a circle just chatting, and more. The true magic of social VR is that it gives you a sense of presence and makes you feel like you really are standing next to someone who might be on the other side of the globe.” Entertainment is a big part of the mix, and “that’s evidenced by the frequency of parties, screenings, performances and talk shows that regularly populate our events calendar.”

“Our ‘FrontRow’ technology uses room mirroring to enable scaling events quickly and easily,” she explains. “This feature allows an event host to go on-air across several different instances, or rooms. This allows audience sizes to be optimized for performance on both mobile VR headsets and PC-powered VR headsets alike, ensuring accessibility and a uniform experience regardless of hardware. With FrontRow, events can easily scale into the hundreds.”
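A toy model of the room-mirroring idea Manko describes – hypothetical names and an assumed per-room cap, not AltspaceVR’s implementation – in which overflow audience members spill into fresh mirrored instances that all receive the same host stream:

```python
from dataclasses import dataclass, field

ROOM_CAP = 40  # assumed per-instance audience cap, tuned per headset tier

@dataclass
class Room:
    audience: list = field(default_factory=list)

class FrontRowEvent:
    """Toy room mirroring: one host 'goes on-air' to every instance, so
    total attendance scales past what a single room could hold."""
    def __init__(self):
        self.rooms = [Room()]

    def join(self, user):
        if len(self.rooms[-1].audience) >= ROOM_CAP:
            self.rooms.append(Room())          # spin up a mirrored instance
        self.rooms[-1].audience.append(user)

    def broadcast(self, packet):
        # every instance receives the same host audio/avatar stream
        return {id(room): packet for room in self.rooms}

event = FrontRowEvent()
for n in range(130):
    event.join(f"user{n}")
print(len(event.rooms))  # -> 4 mirrored rooms for 130 attendees
```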

HP held an international product launch for its new Reverb G2 VR headset in AltspaceVR as part of the Augmented World Expo (AWE) in May. On the entertainment side, comedian and vocal artist Reggie Watts (who also leads the house band for The Late Late Show with James Corden) entertained AltSpacers with his eight-week show series Reinterpreted Reality, which ran in June. Watts had appeared in AltspaceVR as far back as 2015, when he teamed with Justin Roiland (Rick and Morty, Solar Opposites) for a VR comedy set at Jashfest, the Palm Springs comedy festival. The platform has also hosted other celebrities, such as science educator Bill Nye and actor/comedian Drew Carey. “AltspaceVR also offers a place to do business, build community, network, learn new skills, and more,” says Manko. She adds that during the first few months of the pandemic, AltspaceVR saw “a large increase in outreach, as people and businesses [continued] to explore VR as an alternative to conferences, classes, happy hours, and more.” AltspaceVR is compatible with major VR headsets.

In 2014, before the advent of Bigscreen or AltspaceVR, Graham Gaylor and Jesse Joudrey launched VRChat. Some think of VRChat as similar to a VR version of Second Life (Linden Lab’s online virtual world that launched in 2003 and mostly features user-generated content). VRChat enables the user to create worlds and custom avatars. Inside the VRChat world, one can engage in many activities – art, games, events, classes – and even perform for large crowds. According to the company, “Since its founding … the platform has grown to millions of users and features a passionate community of creators. Using Unity and our SDK, creators can build avatars or worlds and upload them to share with others. This open-ended creativity and the dynamics of sharing in an active social environment has led to a never-ending fountain of entertaining, surprising and wonderful experiences.” VRChat claims that its community has created over 50,000 unique worlds to date.

Other leading social VR platforms include Rec Room, Sansar, Vive Sync, Wave (a social VR music platform), Neos VR, OrbusVR, Anyland and vTime (a “cross-reality” social network for VR and AR).

TOP: Terminator Genisys in virtual reality is available at Bigscreen VR. (Image courtesy of Paramount Pictures and Bigscreen VR) MIDDLE AND BOTTOM: Hanging out with other avatars in VR. (Image courtesy of AltspaceVR)



TV/STREAMING

SOLAR OPPOSITES: ONE SCI-FI ANIMATED SITCOM AND TWO WORLDS By CHRIS McGOWAN

All images courtesy of Hulu. TOP: The pupa is a cross between a child and a pet with the power to evolve into a super-computer that can rebuild Earth into the image of doomed planet Schlorp. BOTTOM LEFT: David Marshall, Technical Director. BOTTOM RIGHT: Yaoyao Ma Van As, Art Director.


The artists and animators of Solar Opposites – Hulu’s hit animated comedy series about a warped alien family that has crash-landed in suburban America – had the unusual challenge of creating two distinct realities for each episode. One is the world of a sci-fi sitcom that balances lighthearted zaniness with boundary-pushing crudity and violence. The other is “the wall,” the dog-eat-dog setting of a dystopian story within the main story.

Solar Opposites, which bowed in May, was created by Justin Roiland (animator/actor/co-creator of Rick and Morty) and writer Mike McMahan (creator of Star Trek: Lower Decks for CBS All Access). It concerns five aliens who have fled their destroyed home world of Schlorp and find themselves stuck on Earth. Korvo (voiced by Roiland) and Terry (Thomas Middleditch of Silicon Valley fame) are the adults, acting sometimes like brothers and other times like a same-sex couple. Korvo hates Earth (i.e., American) culture and Terry likes it. They have two replicant offspring: innocent Jesse (Mary Mack), who loves humans, junk food and pop, and irascible Yumyulack (Sean Giambrone), who uses a shrink-ray gun to zap humans who annoy him and keeps them in a multi-level vivarium set into his bedroom wall. The latter, stuffed with a host of desperate, miniaturized people, has become a post-apocalyptic microcosm of society, a brutal hierarchy lorded over by the Duke (Alfred Molina). Meanwhile, the fifth member of the team – the pupa – is a cute child/pet that will one day end life as we know it by remaking the Earth in the image of Schlorp.

The aliens seek to fit into their new home – especially Jesse and Yumyulack in high school – yet lack all empathy for the earthlings around them. Inevitably, someone in the family unleashes something from alien biology or sci-fi tech that kills lots of people in gruesome fashion. Yet the show maintains a “fun above all” spirit. Indeed, Solar Opposites – which has an animation style similar to that of Rick and Morty – is lighthearted despite the carnage, and the bright and colorful look contrasts with the often-harrowing action.

“We like to play with the juxtaposition of those two things,” comments Technical Director David Marshall. “The show can be gory at times, but it is still extremely silly. I think blending the two gives you a broader range of styles than you’d have otherwise.” “Sometimes we use bright colors to enhance the drama,” explains Art Director Yaoyao Ma Van As. “Other times they were there to show contrast. For example, I wanted the scenes in Episode 7, where the rebels are being drowned by the Duke, to have a kind of serene beauty to really contrast with all of the death and horror.”

Prior to working on Solar Opposites, Marshall worked as a technical director and animation supervisor for Rick and Morty. Ma Van As was also involved with that show as a background painter for Season 2. “Once the season wrapped, I had the opportunity to help Justin with his pitch for a new show. I painted some concepts and colored some characters for what would come to be Solar Opposites,” she recalls.

Solar Opposites was initially put into development by 20th Century Fox Television, which then shelved the project. In 2018, Hulu took an interest and gave Roiland and McMahan an initial order of two seasons with eight episodes each. At that point, Ma Van As came aboard as Art Director. In June of this year, the show was renewed for a third season with 12 episodes.

The show’s main designs were generated in-house. “Everything is animated and comped using Harmony by Toon Boom,” explains Marshall. “We also used Adobe After Effects for some scenes. Animation was done by our amazing Canadian partners, Bardel Entertainment, with technical or creative retakes and revisions all being done in-house [Bardel also handles Rick and Morty]. I’m always impressed by the quality of work they produce on some extremely daunting projects.”

TOP: Solar Opposites’ Earthbound alien family: the pupa (in bed), Yumyulack, Korvo, Jesse and Terry. BOTTOM: A cinematic style of storytelling was used to set the wall storyline apart from the “real world” of Solar Opposites.


TOP: Bright, bold colors are used to heighten the drama and contrast beauty with horror. BOTTOM: Terry has inadvertently touched the self-destruct button.


“The post-production began in earnest in the winter of 2019 and continued until April 2020,” recalls Marshall. “We were in the middle of production of Solar Opposites and Rick and Morty at the time of the lockdown. The studio had to create an entire work-from-home pipeline in a weekend and refine it as we went along. The lockdown definitely presented us with a host of new challenges.”

Even without the pandemic, “the first season of any show is always challenging,” says Marshall. “There’s always something you didn’t think of in design or storyboard that all of a sudden has to be figured out ASAP. What might look great as a still image when designing could come back looking odd when you see it in motion. In the end, you come up with a bunch of different solutions and then try them all. The one that looks best and won’t slow down production ends up being what you go with.”

In creating the show’s look, Ma Van As remarks, “We tried to stick to Justin’s character style, but he gave us a lot of creative freedom for the artwork across the board. There were times where Justin and McMahan had specific ideas in mind for certain characters. Justin gave us quick, funny doodles to illustrate what he was thinking with the Gooblers, for example. Justin and McMahan had specific ideas of what the wall setting should be, and how big the people within the wall should be, so that was very helpful in nailing down the look and feel of the environment. For the most part, they trusted the artists to help build this world.”

Marshall adds, “On the animation side of things, there’s always a huge amount of on-the-fly decision-making and last-minute changes as far as tone or style, and even more so on the first season of a show. I worked closely with Mike [McMahan] when designing effects and lighting, and he was always excited to hear our ideas. He always made time to look at something you were working on or toss ideas back and forth on the way something ought to look.”

Ma Van As notes that the team worked on a very compressed schedule for the first season. “We had to be very efficient and communicate constantly to eliminate as many mistakes as early as possible. We assumed that since the show was set on Earth, it would be relatively smooth sailing. Once we got into the wall, it became instantly more challenging, because we wanted to really create this world that the show creators had in mind without cutting corners.”

The wall is an element that lends a certain gravitas to the series. “As far as the art style goes,” says Ma Van As, “we tried to make different areas of the wall feel different. There are the lower levels, which feel grungy and desolate. As you walk higher and into the richer areas, it gets more colorful and cleaner. When inside the wall, we really wanted to feel an expansive, yet trapped quality [like an ant farm] to contrast with the outside world, so we were very conscious of making sure the art reflected that in any way we could.

“For the design and color portion of the show,” she continues, “we wanted to really push the scale and difference between the real world and the wall storylines. We wanted to utilize mood and lighting to help us push the drama. We wanted to make sure the designs within the wall guided the story of the materials the characters had access to, the scale and cleanliness of the environment.”

Marshall comments, “For sure, it was a conscious decision to treat what’s happening in the wall a little differently. Everything is a little more cinematic, a little more serious. You don’t want it to look so far off that you come back from an act break and think you’re watching a completely different show, but using that cinematic style of storytelling for those sequences lends both drama to what’s happening and an extra comedic aspect to how seriously we’re treating these tiny people stuck in a wall.”

Serialization also affected artistic choices for the show, as the team knew they had 16 episodes to work with. “The biggest thing is to keep things consistent,” Marshall advises. “Sometimes you make a quick decision on how something should look, or you really go overboard on the FX side of things for a special one-off shot, and then you realize, ‘Oh, now we always have to do that.’ So you have to be careful when making those kinds of decisions that could come back to bite you three seasons from now.”

On working with Roiland, Ma Van As notes, “Justin gives the artists a lot of trust and leeway, while at the same time being very precise about the things he really needs to see for his vision.” Adds Marshall, “Justin has always been great to work with. It’s always refreshing to work with someone who has a clear vision of what they want, and he has a grasp of the animation process that not every creator shares.”

TOP: Korvo, Yumyulack, Jesse, the pupa and Terry. MIDDLE: A gathering of adults from Schlorp. BOTTOM: A revolution is breaking out inside the wall, which gets a full treatment in Episode 7.

FALL 2020 VFXVOICE.COM • 85

8/27/20 3:31 PM


TV/STREAMING

TOP: Bright, bold colors are used to heighten the drama and contrast beauty with horror. BOTTOM: Terry has inadvertently touched the self-destruct button.

84 • VFXVOICE.COM FALL 2020

PG 82-85 SOLAR OPPOSITES.indd 84-85

“The post-production began in earnest in the winter of 2019 and continued until April 2020,” recalls Marshall. “We were in the middle of production of Solar Opposites and Rick and Morty at the time of the lockdown. The studio had to create an entire work-from-home pipeline in a weekend and refine it as we went along. The lockdown definitely presented us with a host of new challenges.” Even without the pandemic, “the first season of any show is always challenging,” says Marshall. “There’s always something you didn’t think of in design or storyboard that all of a sudden has to be figured out ASAP. What might look great as a still image when designing could come back looking odd when you see it in motion. In the end, you come up with a bunch of different solutions and then try them all. The one that looks best and won’t slow down production ends up being what you go with.” In creating the show’s look, Ma Van As remarks, “We tried to stick to Justin’s character style, but he gave us a lot of creative freedom for the artwork across the board. There were times where Justin and McMahan had specific ideas in mind for certain characters. Justin gave us quick, funny doodles to illustrate what he was thinking with the Gooblers, for example. Justin and McMahan had specific ideas of what the wall setting should be, how big the people within the wall should be, so that was very helpful to help us nail down the look and feel of the environment. For the most part, they

trusted the artists to help build this world.” Marshall adds, “On the animation side of things, there’s always a huge amount of on-the-fly decision-making and last-minute changes as far as tone or style, and even more so on the first season of a show. I worked closely with Mike [McMahan] when designing effects and lighting, and he was always excited to hear our ideas. He always made time to look at something you were working on or toss ideas back and forth on the way something ought to look.” Ma Van As notes that on the first season, the team worked on a very compressed schedule. “We had to be very efficient and communicate constantly to eliminate as many mistakes as early as possible. We assumed that since the show was set on Earth, it would be relatively smooth sailing. Once we got into the wall, it became instantly more challenging, because we wanted to really create this world that the show creators had in mind without cutting corners.” The wall is an element that lends a certain gravitas to the series. “As far as the art style goes,” says Ma Van As, “we tried to make different areas of the wall feel different. There are the lower levels, which feel grungy and desolate. As you walk higher and into the richer areas, it gets more colorful and cleaner. When inside the wall, we really wanted to feel an expansive, yet trapped quality [like an ant farm] to contrast with the outside world, so we were very conscious of making sure the art reflected that in any way we could. “For the design and color portion of the show,” she continues, “we wanted to really push the scale and difference between the real world and the wall storylines. We wanted to utilize mood and lighting to help us push the drama. We wanted to make sure the designs within the wall guided the story of the materials the characters had access to, the scale and cleanliness of the environment.” Marshall comments, “For sure, it was a conscious decision to treat what’s happening in the wall a little differently. Everything is a little more cinematic, a little more serious. You don’t want it to look so far off that you come back from an act break and think you’re watching a completely different show, but using that cinematic style of storytelling for those sequences lends both drama to what’s happening and an extra comedic aspect to how seriously we’re treating these tiny people stuck in a wall.” Serialization also affected artistic choices for the show, as the team knew they had 16 episodes to work with. “The biggest thing is to keep things consistent,” Marshall advises. “Sometimes you make a quick decision on how something should look or you really go overboard on the FX side of things for a special one-off shot, and then you realize, ‘Oh, now we always have to do that.’ So you have to be careful when making those kinds of decisions that could come back to bite you three seasons from now.” On working with Roiland, Ma Van As notes, “Justin gives the artists a lot of trust and leeway, while at the same time being very precise about the things he really needs to see for his vision.” Adds Marshall, “Justin has always been great to work with. It’s always refreshing to work with someone who has a clear vision of what they want and he has a grasp of the animation process that not every creator shares.”

TOP: Korvo, Yumyulack, Jesse, the pupa and Terry. MIDDLE: A gathering of adults from Schlorp. BOTTOM: A revolution is breaking out inside the wall, which gets a full treatment in Episode 7.



GAMES

VIDEO GAMES SURGE IN LOCKDOWN BOOM By CHRIS McGOWAN

TOP AND BOTTOM: Nintendo’s Animal Crossing: New Horizons, playable on the Switch, is full of lovable characters and sold 13.4 million units in the six weeks following its March release. (Image courtesy of Nintendo)


During the COVID lockdown, many people have passed a good portion of their free time playing video games, resulting in a spike for one of the rare industries that thrived when the pandemic took hold in the spring. Many already growing segments – such as mobile games, multiplayer games, the free-to-play model and the “battle royale” category – received an even bigger boost. The handheld Nintendo Switch enjoyed record sales. And employees across the video game industry worked remotely from home to keep development on track. “During a lockdown,” says Michael Pachter, Managing Director of Equity Research for Wedbush Securities, “people are home all the time, so they find themselves with less commute time, less preparation (grooming) and less time spent going to movies, the gym, the beach, restaurants, etc. That frees up at least two hours per day per person, and yes, they are entertaining themselves. The pickup in each entertainment medium is probably correlated to time spent there already, so TV probably [picked up] the biggest portion and games the second biggest.” April’s video game revenues illustrated that Americans were indeed busy at home with games on their PCs, cell phones and consoles. On the console front, U.S. consumers spent $420 million on hardware like Sony’s PlayStation 4, Microsoft’s Xbox One and the Nintendo Switch, a staggering increase of 163% over the same month a year earlier, according to The NPD Group, a market research company. The Switch was the leader, and its year-to-date dollar sales for the first four months of 2020 were the highest of any platform in U.S. history, according to NPD, breaking a record set by the Nintendo Wii in April 2009. “I’ve been working with video game sales data for over 15 years now and April is by far the wildest month I think I’ve ever seen,” NPD analyst Mat Piscatella commented on Twitter on May 22. The research firm also reported that total video game sales in the U.S. (hardware, software, accessories and game cards) reached nearly $1.5 billion in April, up 73% as compared to the same month the previous year. The gains for May, when lockdowns were shorter in many states, were not as high but still impressive: total U.S. sales reached $977 million, which was 52% higher than the same month in 2019. Square Enix’s Final Fantasy VII Remake (a PS4 exclusive until March 2021), Activision’s Call of Duty: Modern Warfare and Nintendo’s Animal Crossing: New Horizons (available only for Switch) were the top three titles in April, according to The NPD Group. 2K Sports’ NBA 2K20, Rockstar Games’ Grand Theft Auto V, Capcom’s Resident Evil 3 and Sony Interactive Entertainment’s MLB: The Show 20 were among other games at the top of the sales charts. “Games that allow players to interact with other players and help replace missing social interactions are absolutely booming. Live service games are all pushing for more content to keep up with the increased demand of their players,” comments Keith Guerrette, Founder and Studio Director at Beyond-FX. Guerrette is a visual effects artist who worked with developer Santa Monica Studio on Sony Interactive Entertainment’s God of War franchise and with developer Naughty Dog on Sony’s Uncharted series and The Last of Us franchise.

“We’re definitely seeing a surge in incoming work. We just had to turn down a project today because every one of our artists is booked. We’re looking at hiring to set ourselves up for more capacity to meet the demand. COVID turned on the floodgates to remote work because nearly every games company just went full remote. There are a lot of great tools to make this viable. Companies like Undertone FX were already set up to thrive in remote, distributed teams, and now COVID is normalizing this.” —David Johnson, Founder, CEO and Creative Director, Undertone FX

For children, multiplayer video games are a great way to stay in touch with their friends when time, distance or a quarantine keeps them apart. Kids’ consumer power could be felt in the big sales for Animal Crossing: New Horizons, which bowed on March 20, in the early days of the pandemic, and sold 13.4 million units in its first six weeks of release, according to Nintendo. Kids also powered the popularity of Roblox Corporation’s Roblox, a free-to-play game creation system and MMO (Massively Multiplayer Online game) that has games in many genres and claims to have had 150 million monthly active users as of July. Roblox has grossed over $1 billion in revenue since its 2006 launch, according to research firm Sensor Tower. Roblox is partly a building game, like the extremely popular Minecraft. That “sandbox game,” from Microsoft/Mojang Studios, has sold over 200 million copies across all platforms, according to Mojang, which may take it past Tetris as the best-selling video game of all time, although sales estimates vary wildly for the latter. Minecraft had 126 million monthly active users as of May 2020, according to Microsoft, and had a spike of 40% in multiplayer sessions in April. Electronic Arts’ Sims franchise, another longstanding giant, is a series of “life simulation” video games and has sold nearly 200 million total copies of its different versions, according to EA. The “free-to-play” category includes popular multiplayer “battle royale” titles like Epic Games’ Fortnite, Electronic Arts’

TOP AND BOTTOM: Activision’s Call of Duty: Modern Warfare featured super-realistic battle scenes and is one of the best-selling video games of 2020. (Image courtesy of Activision Blizzard)




TOP: There’s always plenty of action in the free-to-play battle royale game Fortnite. (Image courtesy of Epic Games) MIDDLE: Travis Scott attracted millions of players to his April concert appearance in Fortnite. (Image courtesy of Epic Games) BOTTOM: Create your own world in the immensely popular Minecraft. (Image courtesy of Mojang Studios/Microsoft)


Apex Legends and Activision Blizzard’s Call of Duty: Warzone. The latter is a free version of the popular Call of Duty series. It debuted on March 11 and had more than 60 million downloads in less than two months, according to Activision Blizzard. “Free-to-play models continue to permeate the industry. Battle Royale genre games are the hot segment that everyone seems to be chasing,” notes David Johnson, Founder, CEO and Creative Director of Undertone FX, a studio specializing in real-time visual effects for video games and VR/AR. Prior to that, Johnson was the Lead Visual Effects Artist at Activision Blizzard’s Infinity Ward Studio and worked on the Call of Duty franchise, among other projects. Other popular free-to-play games include Nexon’s Dungeon Fighter Online, Tencent’s Honor of Kings, Niantic’s Pokémon GO, Smilegate’s Crossfire and Riot Games’ League of Legends. Free-to-play titles generate considerable income through sales of extras. Fortnite, which is a hybrid of a building game and a shooter, earned $1.8 billion in global digital sales in 2019, according to SuperData, a Nielsen-owned digital gaming research firm. It comes in three game modes: Fortnite Battle Royale, Fortnite: Save the World and Fortnite Creative. “Free-to-play games grow as their engagement grows. This means more players, more time spent, or more things to buy. All of the above is happening for free-to-play games, so, yes, they will keep growing,” says Pachter. “Fortnite is popular because it’s a good game and it’s approachable. It tells us that consumers like free-to-play and are willing to spend money on cosmetic items. Everyone else is emulating Fortnite’s success.” Fortnite players use real-world money to purchase the in-game currency of V-Bucks, which allows them to buy items like gliders, pickaxes, outfits and “emotes” (dances). In-game events are part of the mix. An in-game Travis Scott virtual concert in April had 12.3 million concurrent views from players, breaking the record of 10.7 million virtual attendees set by Marshmello the previous year. A Star Wars event took place in 2019: the Millennium Falcon soared across the sky and director J.J. Abrams presented an exclusive clip from Star Wars: The Rise of Skywalker. “In-game events drive engagement higher. More engagement equals more spending,” comments Pachter. As of May 4, Fortnite had 350 million registered players worldwide and, in April alone, its players racked up a combined 3.2 billion hours in-game, according to Epic Games. Many gamers use subscription services. In late April, Microsoft revealed that Xbox Game Pass had surpassed 10 million members from 41 countries around the world, and that Xbox Live had almost 90 million active users. Game Pass members are also interacting more within the gaming community. In another example of the lockdown effect, Microsoft reported on May 30 that Game Pass subscribers had added over 23 million friends since March, a 70% increase in the “friendship rate.” Apple Arcade, PlayStation Now, EA Access, Google Stadia, Nintendo Switch Online and Ubisoft’s Uplay+ are other examples of video game subscription services, which seem to be multiplying at the rate of premium TV services.

“Free-to-play games grow as their engagement grows. This means more players, more time spent, or more things to buy. All of the above is happening for free-to-play games, so, yes, they will keep growing. Fortnite is popular because it’s a good game and it’s approachable. It tells us that consumers like free-to-play and are willing to spend money on cosmetic items. Everyone else is emulating Fortnite’s success.” —Michael Pachter, Managing Director of Equity Research, Wedbush Securities

According to SuperData, the video game industry is building on the already impressive results of 2019, in which video games had revenue of $120.1 billion worldwide, with a breakdown of $64.4 billion for mobile games, $29.6 billion for PC games, and $15.4 billion for consoles. On the console front, the industry eagerly awaits the debuts of the next-generation PlayStation 5 and Xbox Series X consoles, both expected soon, even as the PS4 continues to do battle with the Xbox One and Switch (the three platforms’ estimated lifetime sales as of May 6 were 109.7 million units, 47.2 million and 55.7 million, respectively, according to U.K. research firm VGChartz). Extended reality (XR), which encompasses VR, AR and MR, is also growing steadily. SuperData reported $6.3 billion in XR sales in 2019, a 26% increase in revenue, thanks in part to new headsets like the standalone Oculus Quest. About VR, Guerrette says, “There are absolutely breathtaking productions that are seeing profits and rewards from consumers, which makes me believe that the market is sticking, and will continue to grow quickly as technology only allows us to create better and better experiences.” The pandemic has delayed some software titles, but Pachter observes “no meaningful impact on games development or manufacture.” VFX and animation studios have been particularly well-suited to operating during a lockdown. “We’re definitely seeing a surge in incoming work,” says Johnson. “We just had to turn down a project today because every one of our artists is booked. We’re looking at hiring to set ourselves up for more capacity to meet the demand.” He adds, “COVID turned on the floodgates to remote work because nearly every games company just went full remote. There are a lot of great tools to make this viable. Companies like Undertone FX were already set up to thrive in remote, distributed teams, and now COVID is normalizing this.” “In many ways,” Guerrette comments, “this experience has pushed the video game industry [onto] a path that it was already moving towards – distributed development to allow talented team members to function as a part of the team from wherever they call home.”

TOP: Cloud Strife and Sephiroth are characters in Square Enix’s Final Fantasy VII Remake. (Image courtesy of Square Enix) MIDDLE: More wild driving and wild action in Grand Theft Auto V. (Image courtesy of Rockstar Games) BOTTOM: The Sims 4 has carried on the best-selling life-simulation game franchise. (Image courtesy of Electronic Arts)



[ VES SECTION SPOTLIGHT: INDIA ]

VFX in the Heart of Bollywood By NAOMI GOLDMAN

TOP AND MIDDLE: VES India celebrates its launch in December 2017 at Red Chillies’ VFX Studio in Mumbai. BOTTOM: VES India hosted a highly successful “The Making Of” behind-the-scenes event with AMD. Pictured: Tim McGovern, VFX Supervisor at DNEG and Chair of the VES Outreach to Developing Regions Committee.


The VES India Section is 50+ members strong and growing in one of the most dynamic and populous VFX markets in the world. VES India was formally established after years of hard work by passionate VFX artists and innovators in the industry to rally the community, working in concert with the VES Outreach to Developing Regions Committee to successfully launch in 2017. The film and TV industry is spread across the country. And while all regions and states have boutique studios catering to animation and visual effects, the majority of the work for local and international projects originates from four main hubs: Mumbai, Bangalore, Hyderabad and Chennai. Due to India’s diversity and regional boundaries, the VES Section is uniquely positioned to bring studios and practitioners together to network, socialize and build community – critical to meeting client deadlines and fostering ongoing professional development. “I see VES India scaling new heights with the best of the domain experts leading the way and fostering a tight-knit community,” says Abhishek Krishnan, Chair of the India Section Board of Managers and Marketing Manager at AMD. “We are confident in our plans to spread awareness about our huge talent pool, not just in India but internationally as well.” The young Section’s members come from various backgrounds, and many are considered leaders in their disciplines. Most have strong VFX and animation expertise spanning 15+ years in the industry, with a mix of experience in VFX for films, OTT content and animation. Members also work in gaming and technology and represent the creative, production and management fields. Professionals come from leading studios including Red Chillies Entertainment, NY VFXWAALA, DNEG, MPC, The Mill, 88 Pictures, Prana Studios and Phantomfx, among others. “I have been involved with the VES since 2012, when I became a member in Toronto,” says Pankaj Verma, member of the VES India Section Board of Managers. “Upon my return home to India, I realized that VES could be the medium to get connected with like-minded professionals and meet the big need to develop a sense of community. It was a very prideful moment for me to be a part of building another VES Section.” The Indian film industry serves all genres of moviemaking, including feature, episodic, advertising and streaming content in both live action and animation. The Indian VFX market is divided into two streams: one caters to local/regional OTT content demand, and the other serves outsourced work from all over the world. To meet that demand, the Indian VFX industry has a strong creative and technical workforce – a big pool of artists who have been trained at various institutions abroad and in India. “This, with the adoption of English as a business language,” states Verma, “makes India the best choice for the international market. This is very evident as major production houses/VFX studios, including DNEG, MPC, Mill Films and Framestore, are opening their arms to us. And this is creating a unique blend of work culture by mixing foreign and Indian culture.” VES India’s formation was announced in December 2017 at

Red Chillies’ VFX studio in Mumbai, where an inaugural party was hosted to celebrate the Section and welcome the new board and its members. Since then, the Section has embarked on a robust program of events to provide valuable benefits to members and cultivate new prospects. Before the advent of COVID-19, film screenings organized by the VES headquarters were held almost every week for VES members and families in India. Most screenings take place in Mumbai, and once it is safe to resume in-person events, the team will explore bringing them to other cities and expanding activities across the whole region. Additionally, the Section had been hosting a steady stream of regional meet and greets led by members locally in every major city, and has hosted successful online and offline meets in Chennai, Bangalore and Mumbai. Going forward, the Section will continue to host virtual events, with an eye towards increasing the schedule to ease connectivity between members and prospects. “It’s a great experience hosting these meetups where existing and potential members have a chance to socialize,” says Balakrishnan Rajarathinam, member of the VES India Section Board of Managers and VFX Head of Production, Basilic Fly Studio. “New members really seem to enjoy the camaraderie and the intensive knowledge sharing in a chilled environment. And it’s gratifying that most of the participants who attend our events feel a strong sense of community.” VES India has also co-sponsored events with other organizations to help increase awareness and learning about VFX/animation films. This was typified by the behind-the-scenes event “The Making Of” held with AMD, which was largely done to share information about both regional and international VFX/animation work being done in India by regional studios. The Section leadership has a forward-looking vision and clear goals for enriching its programmatic offerings. “We are planning to conduct master classes in our Section as platforms for pioneers in our industry to interact with VFX students and aspirants,” says Rajarathinam. “Not only students, but young professionals and budding artists as well, to get the benefit of interacting with VFX legends. Our goal is to engage the VFX community through frequent meetups and events. We aim to do these events across multiple Indian cities by engaging VFX studios in respective regions.” “Moving forward, we are planning to host informative/educational events to help people learn, share and collaborate via webinars and seminars,” says Verma. “We are finding new ways to reach candidates to recruit, and targeting small boutique studios to explore the possibilities of engaging eligible professionals to apply for VES membership.” In reflecting on the trajectory of the Section, the leadership team shared their thoughts on the significance of belonging to the Society and its impact on the visual effects community. “Being a part of the VES is so rewarding,” says Krishnan. “The Society is the epitome of talented and hardworking individuals. It’s great to be recognized for these efforts and be a part of this global platform, which also allows you to contribute and make it better

every day. We will be working together to bring the community closer and make it stronger in the coming years.” “We have the chance to be VFX ambassadors,” says Verma. “People around the world can look to the VES India Section not just for knowledge and idea sharing, but to collaborate and learn about India and the Indian VFX industry.” “As members of the VES, we’re able to take a lot of initiative for the industry’s well-being and evolution,” says Rajarathinam. “It also gives us satisfaction to represent our industry’s aspirations within our community and to be leaders across India. Being a part of the Society is a privilege and a great experience. And as our plans for more intensive activities are executed, we foresee the Section making a significant impact on the regional VFX community and the industry as a whole.”

TOP: VES India hosted a highly successful “The Making Of” behind-the-scenes event with AMD. MIDDLE: VES India members and prospects gather for festive meetups throughout the year. This one is in Bangalore. BOTTOM: VES India meetup in Mumbai.



[ THE VES HANDBOOK ]

Virtual Production
Abstracted from The VES Handbook of Visual Effects – 3rd Edition (Chapter 2, page 57). Written by Addison Bath. Edited for this publication by Jeffrey A. Okun, VES

Much of The Mandalorian was filmed indoors using a new technology called Stagecraft, which utilizes massive, rear-projected LED screens to create realistic background environments for scenes. (Image copyright © 2019 Disney+ and Industrial Light & Magic)

Live Action with CG Elements
Merging CG into a live-action shoot allows the director and crew to evaluate composition, lighting, timing, and more, before the final plate is captured. Bringing CG elements into the real world requires a great deal of planning. Avatar (2009) was the first film to use a system that allowed the crew to see a live composite of the CG elements through the eyepiece of the camera and on set monitors. To do this, the picture camera must be accurately tracked and recreated in the virtual world. The most common camera tracking techniques are optical markers, encoders, IMUs, computer vision, or some combination of these. The other key to camera recreation is using encoders to read lens information for the virtual camera. The real-time render is then composited with the live plate. Aligning the virtual world to the real world must be accurate for the technique to be effective. Survey, lidar, and other tracked physical objects are used to achieve this lineup. Gravity (2013) advanced virtual production by creating the light box – walls of LED lights surrounding a 10ft x 10ft acting space. The LED lights provided correct lighting for the live-action subject and allowed the actors to see the CG images to be added later. The screens were driven in real time by images pre-rendered from the actor’s perspective, enabling a better performance because the actors were inside the CG world and able to act and react to it. The Mandalorian (2019) pushed the LED wall technique further, using it extensively throughout production.

Live Action with CG Characters
Merging CG characters into live action in real time offers many advantages: It
allows camera moves to be accurately motivated by CG characters; proper eyelines can be verified; and actors can perform against their CG counterparts. Performances from the CG characters can be prerecorded or achieved live, but they must be rendered in real time. On Real Steel (2011), performance capture was used to record the choreography for all the fights between the boxing robots in pre-production. Once shooting began, motion capture volumes were built at all fight locations to track the picture cameras; this included both indoor and outdoor sets, enabled by active LED markers used for tracking. The live composite was recorded along with the tracking information and provided to the VFX facilities as key information needed to produce the final shots. In Game of Thrones (2011–2019), multiple techniques were used to bring the CG dragons to life. Motion-controlled flamethrowers were driven with animation from the dragon’s head. Thor: Ragnarok (2017) also applied this technique by using real-time performance from an actor to drive the Hulk, using Simulcam to view the Hulk in the context of the real world. This was key to the production, as the CG character was much larger than the actor driving the performance. The actor and camera were fitted with markers, and movable towers of motion capture cameras were constructed to allow maximum flexibility during the shoot. Using this process allowed both performance and composition of physical and digital characters to be explored in real time. The power of handing back creative freedom and control to the key decision-makers is what virtual production is all about. It is a growing area, both in adoption and in advances to the technology.
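To make the camera-recreation loop described above concrete, here is a minimal sketch in Python of a Simulcam-style preview: each frame, the tracked camera pose and encoded lens value drive a recreated virtual camera, a CG element is rendered from that camera, and the result is composited over the live plate with a standard premultiplied “over” operation. Every function below (read_tracked_camera, render_cg_element, capture_live_plate) is a hypothetical stand-in for the tracking, rendering and capture hardware, not the API of any production system, and single float “pixels” stand in for full images.

```python
# Minimal sketch of a Simulcam-style live-preview loop (all stubs hypothetical).
from dataclasses import dataclass


@dataclass
class CameraState:
    position: tuple          # world-space position from the tracking system
    rotation: tuple          # orientation (e.g., Euler angles)
    focal_length_mm: float   # read from the lens encoders


def read_tracked_camera(frame: int) -> CameraState:
    # Stand-in for optical-marker / encoder / IMU fusion on the picture camera.
    return CameraState((0.0, 1.7, frame * 0.01), (0.0, 0.0, 0.0), 35.0)


def render_cg_element(cam: CameraState) -> tuple:
    # Stand-in for the real-time renderer: returns one CG pixel value and
    # its alpha, as seen from the recreated virtual camera.
    return 0.8, 0.5


def capture_live_plate(frame: int) -> float:
    # Stand-in for the live video feed from the picture camera.
    return 0.3


def over(cg_rgb: float, cg_alpha: float, plate_rgb: float) -> float:
    # Standard premultiplied "over" composite: CG on top of the plate.
    return cg_rgb + (1.0 - cg_alpha) * plate_rgb


for frame in range(3):
    cam = read_tracked_camera(frame)            # track and recreate the camera
    cg_rgb, cg_alpha = render_cg_element(cam)   # real-time render
    plate = capture_live_plate(frame)           # live plate
    preview = over(cg_rgb, cg_alpha, plate)     # on-set monitor composite
    print(f"frame {frame}: camera z={cam.position[2]:.2f}, preview={preview:.2f}")
```

In a real system the render and composite run per pixel at the camera’s frame rate, and the survey/lidar alignment the excerpt mentions is what keeps the virtual camera’s frustum locked to the physical one.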



[ VES NEWS ]

VES Inducts 2020 Honorees By NAOMI GOLDMAN

The Visual Effects Society Board of Directors is pleased to announce the 2020 inductees into the VES Hall of Fame and the newest Lifetime and Honorary members, as well as this year’s recipient of the VES Founders Award. “Our VES honorees represent a group of exceptional artists, innovators and professionals who have had a profound impact on the field of visual effects,” said Mike Chambers, VES Board Chair. “We are proud to recognize those who helped shape our shared legacy and continue to inspire future generations of VFX practitioners.”

Founders Award Recipient: Ray Scalice. For more than 40 years, Scalice has served in executive management positions with Lucasfilm Ltd., Industrial Light & Magic, The Walt Disney Company and Pacific Title Digital. With more than 80 film credits, Scalice holds the position of General Manager/Executive Producer for Pixel Magic. A founding member of the VES, he has served many years on the Awards Committee, as Co-chair of the Business, Labor & Law Committee, and as a member of the Board of Directors.

Lifetime Member: Debbie Denise. Executive Producer Denise has been involved with the visual effects and animation industry for almost 30 years. She produced the visual effects for Death Becomes Her and Forrest Gump at Industrial Light & Magic, and shepherded groundbreaking films as Executive Vice President of Production at Sony Pictures Imageworks, including The Amazing Spider-Man, Alice In Wonderland, Watchmen (2009), the Academy Award-nominated Stuart Little, the Harry Potter and Men In Black franchises and Hotel Transylvania.

Lifetime Member: Thomas Haegele. Professor Haegele established Polygon, one of the first German production houses for professional computer animation. He is the Co-founder of Filmakademie Baden-Wuerttemberg, and was a professor for Animation and Digital Imaging for more than 25 years. Haegele served as the Director of the Institute of Animation, Visual Effects and Digital Postproduction, as well as the Deputy Managing Director of the Filmakademie, and is also the founder and Conference Chair of FMX.

LEFT TO RIGHT: Ray Scalice, Debbie Denise, Thomas Haegele, Richard Hollander, VES, Eugene “Gene” P. Rizzardi, Jr., Ron Cobb, Don Iwerks, Greg Jein


Lifetime Member: Richard Hollander, VES. As President of the Film Division and Senior Visual Effects Supervisor at Rhythm & Hues Studios, Hollander oversaw VFX production on Superman Returns, Garfield, 300: Rise of an Empire, The Cat in the Hat and Chronicles of Riddick. He serves on the Academy of Motion Picture Arts and Sciences’ Visual Effects Branch’s Sci-Tech Council and Executive Committee, is a founding board member of the VES and a VES Fellow, and is the recipient of a Scientific and Technical Achievement Award from the Academy of Motion Picture Arts and Sciences.

Lifetime Member: Eugene “Gene” P. Rizzardi, Jr. Now retired, Rizzardi is an acclaimed model maker, model shop supervisor and special effects artist. He garnered an Emmy Award for Outstanding Special Visual Effects for The Hugga Bunch. Further credits include Apollo 13, Titanic, Alien: Resurrection, Godzilla (1998) and Dinner for Schmucks. Rizzardi served for many years on the VES Board of Directors and is a member of the Visual Effects Branch of the Academy of Motion Picture Arts and Sciences and the Association of Professional Model Makers.

Honorary Member: Ron Cobb. Cobb is an acclaimed cartoonist, artist, writer, film designer and film director. He was the Production Designer on Conan the Barbarian, The Last Starfighter and Leviathan, and contributed conceptual designs to Star Wars, Alien, Close Encounters of the Third Kind, The Abyss, Total Recall (1990) and Back to the Future. His illustrations have been published in the books RCD-25, Mah Fellow Americans, The Cobb Book, Cobb Again and Colorvision.

Honorary Member: Don Iwerks. Iwerks is a former Disney executive, Disney Legend and Co-founder of Iwerks Entertainment, and a renowned developer of special venues throughout the world. Don is the recipient of the Gordon E. Sawyer Award, an Oscar® from the Academy of Motion Picture Arts and Sciences that honors “an individual in the motion picture industry whose technological contributions have brought credit to the industry.”

Honorary Member: Greg Jein. Jein is a model designer and artist whose work includes studio models, props and other artwork, including landscape miniatures, that appeared throughout the Star Trek franchise. Jein was twice nominated for an Academy Award for Visual Effects for his work on Close Encounters of the Third Kind and 1941, and is also known for his work on Avatar, Oblivion and Interstellar.

VES Hall of Fame Inductees

Irwin Allen (1916–1991). Allen was an American film and television producer and director known for his work in science fiction, then later as the “Master of Disaster” for his work in the disaster film genre. His most successful productions were The Poseidon Adventure and The Towering Inferno. He won an Academy Award for his documentary The Sea Around Us and was the creator of Lost in Space, Voyage to the Bottom of the Sea and The Time Tunnel.

Mary Blair (1911–1978). Blair was an American artist, animator, designer and Disney Legend, prominent in producing art and animation for The Walt Disney Company, drawing concept art for Alice in Wonderland, Peter Pan and Cinderella, and creating character designs for attractions including Disneyland’s It’s a Small World.

LEFT TO RIGHT: Irwin Allen, Mary Blair, Claire Parker, Gene Warren, Jr., Gene Warren, Sr.

Claire Parker (1906–1981). Parker was an American engineer and animator. Her best-known contribution to the history of cinema is the Pinscreen, a vertically-mounted grid of 240,000 sliding metal rods that are first manually pushed into position to create lit and shaded areas, then filmed frame by frame. It was one of the first devices ever to produce animation by reconfiguring a set of individual picture elements, later called pixels.

Gene Warren, Jr. (1941–2019). Warren, Jr. was a lauded special effects designer at Fantasy II Film Effects. He received an Academy Award and BAFTA for his work on Terminator 2: Judgment Day and an Emmy for The Winds of War. He was also known for his work providing spectacular in-camera illusions for Francis Ford Coppola’s Bram Stoker’s Dracula, and shared a VES Award nomination with his son, Gene Warren III, for miniature work on the action thriller The Expendables.

Gene Warren, Sr. (1916–1997). Warren, Sr. was an award-winning special effects director who started his career as an animator and puppeteer. His work was seen in dozens of films from the 1950s through the ’70s, including Tom Thumb, The Seven Faces of Dr. Lao, Spartacus, The Andromeda Strain and The Time Machine, which won him the Academy Award for Special Effects.

The honorees and inductees will be recognized at an event this fall. Names of this year’s VES Fellows were not announced in time for publication in this issue.



[ FINAL FRAME ]

Bitzer and Griffith – An Early Dynamic Duo

The latest advancements in film production are creating even newer alliances in what is already a highly collaborative art form – such as the one between the visual effects artist and the cinematographer. A recent example is Visual Effects Supervisor Rob Legato, ASC and Director of Photography Caleb Deschanel, ASC, who worked hand in hand on virtually every shot of The Lion King. (See story in this issue.) Such close collaboration is the result of a trend towards virtual production techniques and tools. One of the earliest film production ‘dynamic duos’ was director D.W. Griffith and cinematographer Billy Bitzer. The pair began collaborating in 1908 and worked together until 1929. Bitzer worked so closely with Griffith that he could channel the director’s vision through pioneering camera innovations. Bitzer’s achievements include being the first to film entirely under artificial light rather than outdoors, and he was also the first to
use split-screen photography and backlight. He was arguably the inventor of the close-up as well as the long shot, and an innovator in matte photography. Bitzer’s leading-edge techniques included the fade-out and iris shot to end a scene, and soft-focus photography with the aid of a diffuser. Some of the duo’s best-known works are Broken Blossoms, Way Down East, The Birth of a Nation and Intolerance. Griffith, of course, is remembered for developing the language of film, using particular kinds of camera shots and lighting to give a narrative more mood and tension, a language he and Bitzer built together.



