VFX Voice - Summer 2018 Issue


VFXVOICE.COM

SUMMER 2018

THE MAKING OF

INCREDIBLES 2: NEW CHALLENGES, NEW ACHIEVEMENTS

AVATAR – FLIGHT OF PASSAGE: THEME PARK THRILLER • TV VFX EXPLOSION: WESTWORLD, STRANGER THINGS, LOST IN SPACE • PROFILES: DOUGLAS TRUMBULL, SYD MEAD & RACHEL DAY • “WHAT’S IN YOUR VFX KIT?”




[ EXECUTIVE NOTE ]

Welcome to the Summer issue of VFX Voice! In this issue we shine a light on the dynamic explosion of VFX in television driving Westworld 2, Stranger Things 2 and Lost in Space, and bring you an exclusive industry roundtable on the demands and rewards of VFX for the small screen. We tell the visual effects story behind Disney’s thrilling “Avatar: Flight of Passage” theme park ride and go inside Black Panther, Avengers: Infinity War and the much-anticipated Incredibles 2. And we profile industry legends Syd Mead and Doug Trumbull and acclaimed videogame artist Rachel Day.

Check out “What’s in Your VFX Kit?,” a breakdown of the gear VFX pros take on the job; spotlights on Method Studios, Important Looking Pirates and the VES Montreal Section; and a new guest column, VFX Careers, starting with insights on VFX recruitment from the human resources perspective.

There is a steady stream of VFX news, and VFX Voice is on the beat. We continue to bring you exclusive stories between issues that are only available online at VFXVoice.com. You can also get VFX Voice and VES updates by following us on Twitter at @VFXSociety. Your enthusiasm and support have made VFX Voice a must-read publication around the globe, and we’re proud to be the definitive authority on all things VFX.

Mike Chambers, Chair, VES Board of Directors

Eric Roth, VES Executive Director




[ CONTENTS ]

FEATURES
8 THEME PARKS: AVATAR: FLIGHT OF PASSAGE – How the sci-fi fantasy gem became a 3D thrill ride.
14 TV: STRANGER THINGS 2 – Netflix series deploys an array of effects in service of story.
20 TV: LOST IN SPACE – The future finally catches up to the view from the ’60s.
26 PROFILE: DOUGLAS TRUMBULL, VES – On the frontiers of film’s future with an industry giant.
32 TV: WESTWORLD 2 – HBO mystery-thriller’s effects are everywhere; many unseen.
38 COVER: INCREDIBLES 2 – Producers, animators synchronize to create a Pixar milestone.
44 PROFILE: SYD MEAD – Visionary designer of the future we’ve come to believe is real.
50 INDUSTRY ROUNDTABLE – Industry experts weigh in on the TV VFX explosion.
56 COMPANY SPOTLIGHT: METHOD STUDIOS – Seeking a global presence while keeping a boutique feel.
62 FILM: AVENGERS: INFINITY WAR – Marvel all-stars join forces to turn up the VFX volume.
68 FILM: AVENGERS: INFINITY WAR – Weta Digital pumped up the superheroes for the finale.
70 FILM: BLACK PANTHER – How the battle of Wakanda was brought to the big screen.
74 PROFILE: RACHEL DAY – VFX artist at the helm of videogame phenomenon Overwatch.
78 COMPANY SPOTLIGHT: ILP – Swedish VFX buccaneers are riding an international wave.
84 VFX TRENDS: VFX KITS – VFX supervisors reveal the essential gear they rely on.

DEPARTMENTS
2 EXECUTIVE NOTE
90 GUEST COLUMN: VFX CAREERS
92 VES SECTION SPOTLIGHT: MONTREAL
94 VES HANDBOOK
96 FINAL FRAME: LOST IN SPACE

ON THE COVER: Helen Parr aka Elastigirl (voiced by Holly Hunter) in Incredibles 2. (Image copyright © 2018 Disney/Pixar. All Rights Reserved.)




SUMMER 2018 • VOL. 2, NO. 3

VFXVOICE

Visit us online at vfxvoice.com

PUBLISHER
Jim McCullaugh, publisher@vfxvoice.com

EDITOR
Ed Ochs, editor@vfxvoice.com

CREATIVE
Alpanian Design Group, alan@alpanian.com

ADVERTISING
advertising@vfxvoice.com
Maria Lopez, mmlopezmarketing@earthlink.net

MEDIA
media@vfxvoice.com

CIRCULATION
circulation@vfxvoice.com

CONTRIBUTING WRITERS
T. Willie Clark, Ian Failes, Naomi Goldman, Trevor Hogg, Kevin H. Martin, Chris McGowan, Paula Parisi

ADVISORY COMMITTEE
Rob Bredow; Mike Chambers; Neil Corbould; Debbie Denise; Paul Franklin; David Johnson; Jim Morris, VES; Dennis Muren, ASC, VES; Sam Nicholson, ASC; Eric Roth

VISUAL EFFECTS SOCIETY
Eric Roth, Executive Director

VES BOARD OF DIRECTORS

OFFICERS
Mike Chambers, Chair; Jeffrey A. Okun, VES, 1st Vice Chair; Kim Lavery, VES, 2nd Vice Chair; Dan Schrecker, Treasurer; Rita Cahill, Secretary

DIRECTORS
Jeff Barnes; Andrew Bly; Brooke Breton; Kathryn Brillhart; Emma Clifton Perry; Bob Coleman; Dayne Cowan; Kim Davidson; Debbie Denise; Richard Edlund, VES; Pam Hogarth; Joel Hynek; Jeff Kleiser; Neil Lim-Sang; Brooke Lyndon-Stanford; Tim McGovern; Kevin Rafferty; Scott Ross; Barry Sandrew; Tim Sassoon; David Tanaka; Bill Taylor, VES; Richard Winn Taylor II, VES; Susan Zwerman

ALTERNATES
Fon Davis, Charlie Iturriaga, Christian Kubsch, Andres Martinez, Daniel Rosen, Katie Stetson, Bill Villarreal

Tom Atkin, Founder
Allen Battino, VES Logo Design

Visual Effects Society
5805 Sepulveda Blvd., Suite 620
Sherman Oaks, CA 91411
Phone: (818) 981-7861
visualeffectssociety.com

VES STAFF
Nancy Ward, Program & Development Director; Chris McKittrick, Director of Operations; Ben Schneider, Director of Membership Services; Jeff Casper, Manager of Media & Graphics; Colleen Kelly, Office Manager; Callie C. Miller, Global Coordinator; Jennifer Cabrera, Administrative Assistant; P.J. Schumacher, Controller; Naomi Goldman, Public Relations

Follow us on social media.

VFX Voice is published quarterly by the Visual Effects Society. Subscriptions: U.S. $50; Canada/Mexico $60; all other foreign countries $70 a year. See vfxvoice.com.

Advertising: Rate card upon request from publisher@vfxvoice.com or advertising@vfxvoice.com

Comments: Write us at comments@vfxvoice.com

Postmaster: Send address changes to VES, 5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411.

Copyright © 2018 The Visual Effects Society. Printed in the U.S.A.




THEME PARKS

AVATAR: FLIGHT OF PASSAGE – A CINEMATIC, MULTI-SENSORY 3D EXPERIENCE THAT SOARS
By CHRIS McGOWAN

All images copyright © 2017 Walt Disney Productions. All Rights Reserved.
TOP: “Avatar: Flight of Passage” offers guests the chance to connect with an avatar and soar on a banshee over Pandora.


When does a film become a ride, and when does a ride become a film? “Avatar: Flight of Passage” blurs the line between the two in the most immersive such experience to date. In the attraction, part of Pandora – The World of Avatar at Disney’s Animal Kingdom in Lake Buena Vista, Florida, participants take a simulated flight on a banshee’s back through the lush landscapes of Pandora, inspired by the 2009 James Cameron movie Avatar. The ride mixes motion-simulator technology with 3D video, haptic vibration and multi-sensory effects. It won the 2018 VES Award for Outstanding Visual Effects in a Special Venue Project and has been exceedingly popular since its May 2017 debut – so much so that initial wait times for the ride sometimes extended to six hours.

This marriage of a ride simulator and 3D VFX incorporates a narrative: the experience takes place 100 years after the events of the Avatar film. Humans and the native Na’vi are attempting to restore the planet’s ecological balance following the devastation wrought by the RDA mining company, and to bring banshee (or “Ikran”) populations back to natural levels through the Pandora Conservation Initiative (PCI). Guests can link with avatars and fly the pterodactyl-like banshees across the stunning Valley of Mo’ara.

Before the experience begins, you follow a waterfall-lined hiking trail to the attraction’s entrance and enter a cave full of ancient Na’vi paintings. Eventually, you end up at a PCI research lab, where an avatar floats in an amnio tank, not yet activated. Video instructions explain how riders will link with their own avatars, and then the adventure begins.

Once seated on a link chair and outfitted with 3D glasses, you swoop through the rain forest, fly by floating mountains, glide past waterfalls, pass above an animal stampede, and face off with a Great Leonopteryx. Along with the thrills and 3D visuals, the ride delivers sensory stimuli that shift with each setting: you smell the forest canopy, and feel wind and mist at different points. But it’s not all soaring and careening – there are also calm moments, such as when you come to rest in a dark cave full of bioluminescence and hear and feel your banshee’s labored breathing (felt from the seat between your knees).

Such multi-sensory entertainment has long been imagined, but never achieved at this level. Understandably, “Avatar: Flight of Passage” was a long time in the making. Work began in 2012, according to Joe Rohde, Portfolio Creative Executive for Walt Disney Imagineering. He comments that WDI “has over 100 different disciplines and just about every single one was used to bring the experience to fruition.”

WDI and LEI (Lightstorm Entertainment Inc., co-founded by James Cameron) teamed up and brought in Weta Digital to help with the visuals – an enormous undertaking involving the main attraction footage as well as animation to bring the queue’s preshow to life. “Everything you see in the attraction’s main show was original to this production,” comments WDI VFX Executive Producer Amy Jupiter. “Hundreds of artists” contributed to the VFX production, with constant interaction between the Imagineers and the VFX people, adds Rohde: “It was great. We asked a lot of them, much of it quite distinct from supporting a feature film.”

Under Rohde’s direction, Jupiter and LEI VFX Supervisor/Animation Director Richard Baneham led a team to design and produce the media templates for both the main show and queue experiences.

TOP AND BOTTOM: Scenes from “Avatar: Flight of Passage.”




TOP: A scene from “Avatar: Flight of Passage.”


“As these templates were being created, Weta was brought on to execute and contribute to the design of the main show and queue films,” says Jupiter. “We brought in Weta fairly early into our process, given their ongoing and intimate relationship with LEI. Their early involvement and collaboration was key in the development process, as we needed many tools for the large frame and high temporal resolution of the film. Not only did it allow them a chance to be more creatively involved, but also a chance to develop new software and a new omni-directional CG lens.”

In addition to Baneham, other key members of the “Avatar: Flight of Passage” production included WDI Show Programmer David Lester, Weta Digital VFX Supervisor Thrain Shadbolt, Weta Digital Compositing Supervisor Sam Cole, and Weta Digital CG Supervisors Pavani Rao Boddapati and Daniele Tosti.

Regarding the CGI footage, Jupiter adds, “Everything you see in the attraction’s main show was original to this production. We were fortunate to be able to use the model and texture assets from the first film during our templating/visualization process. We were also able to use animation cycles for our BG characters.”

She adds, “The Disney artists worked in MotionBuilder to build our visualization templates and program and animate the main show ride. Weta used their own proprietary rendering software for the templating process and final renders. We used Nuke Studio 3D for compositing and a complex set of tools to be able to color correct in real time in the theatre.”
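Jupiter doesn’t name the grading toolset, but “color correct in real time in the theatre” is the kind of job commonly expressed as an ASC CDL-style grade – a per-channel slope, offset and power plus a saturation term – precisely because it is cheap enough to apply live. The sketch below is a generic illustration of that transform, not WDI’s actual pipeline:

    import numpy as np

    def apply_cdl(rgb, slope, offset, power, saturation=1.0):
        # ASC CDL: out = (in * slope + offset) ** power, followed by
        # a saturation adjustment using Rec. 709 luma weights.
        graded = np.clip(rgb * slope + offset, 0.0, None) ** power
        luma = graded @ np.array([0.2126, 0.7152, 0.0722])
        return luma[..., None] + saturation * (graded - luma[..., None])

    # Example: gently warm the image and lift saturation on a test buffer.
    frame = np.random.rand(1080, 1920, 3).astype(np.float32)
    graded = apply_cdl(frame,
                       slope=np.array([1.05, 1.00, 0.95]),
                       offset=np.array([0.00, 0.00, 0.00]),
                       power=np.array([1.00, 1.00, 1.00]),
                       saturation=1.1)

Because the whole correction is a handful of per-pixel multiplies and adds, it can run at playback speed, which is what makes adjusting color live in the theater practical.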

Weta had previously worked on the “King Kong 360 3D” ride at Universal Studios, which won a VES Award for Outstanding Visual Effects in a Special Venue Project in 2010. In that project, “we learned a lot about large-scale projection, projector cross-talk and managing stereo imagery from multiple rider perspectives,” notes Shadbolt.

Weta started its Avatar work in early 2016. “However, discussions and planning had been ongoing for some time prior. It was a long creative process – not only revisiting the vast world of Pandora but also solving a myriad of technical challenges along the way.” He adds, “We worked very closely with WDI and Lightstorm throughout the project. Lightstorm would conceptualize and test a previs-style ‘template’ version of the ride, which we would take as a starting point and detail out into final rendered form. WDI would test the resulting images with the motion base on the ride itself, and based on this feedback we would go through the creative loop again, making changes.”

Weta was responsible for creating all of the final rendered imagery that audiences view on the ride. “Although the ride was made up of distinctly different environments, the entire ride is essentially one continuous shot that runs for 4.5 minutes and spans a distance of 11.2km. The shots were made more challenging by the fact that the required resolution was 10K at 60fps, and they had to be rendered and delivered in omni-directional stereo,” says Shadbolt.
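Shadbolt’s delivery numbers are easier to appreciate with a little back-of-the-envelope arithmetic. The sketch below has to assume a frame size and bit depth – the article specifies neither, so the square 10,240-pixel frame and 16-bit RGB figures are illustrative assumptions only:

    # Back-of-the-envelope scale of the ride deliverable.
    duration_s = 4.5 * 60                  # one continuous 4.5-minute shot
    fps = 60
    frames_per_eye = int(duration_s * fps)           # 16,200 frames
    eyes = 2                               # omni-directional stereo pair
    width = height = 10_240                # assumed square "10K" dome frame
    bytes_per_px = 3 * 2                   # assumed 16-bit half-float RGB

    total_frames = frames_per_eye * eyes             # 32,400 frames
    total_px = total_frames * width * height
    print(f"{total_frames:,} frames, {total_px / 1e12:.1f} trillion pixels")
    print(f"~{total_px * bytes_per_px / 1e12:.0f} TB uncompressed")

    # Average apparent flight speed over the 11.2km path:
    print(f"{11.2 / (duration_s / 3600):.0f} km/h")

Even with generous rounding, the takeaway holds: tens of thousands of frames, trillions of pixels, and an average apparent flight speed of roughly 150 km/h.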

“Our main tool for putting the ride together was Maya,” notes Shadbolt, “but we rendered it in our proprietary software, Manuka. Due to the sheer scale of the imagery, we had to implement changes to our pipeline, including a custom multi-pass rendering scheme and new workflows for our pre-lighting tool, Gazebo, so animation could get a good predictive model for how it would all play out without having to go to final render.”

The ride was an entirely new experience because it needed to be rendered at a much higher resolution and frame rate than the original film, according to Shadbolt. “The majority [of digital assets] had to be rebuilt to bring them up to date for our current pipeline so they could be rendered in Manuka. Many of the environments in the ride hadn’t appeared in the original film, so we embarked on creating a ton of new assets. We were able to use our instancing tools to help with the sheer amount of polygons, and also took advantage of newer in-house software such as Lumberjack to create the large amounts of vegetation required. At the end of it all, the ride contained over 15 million individual assets and over 1,400 key-framed creatures!”

Weta’s biggest challenge was delivering images at 10K, 60fps stereo. “We had to optimize all parts of our pipeline and make enhancements to Manuka to make this possible. We also had to do a lot of work on our compositing tool set to support this, including expansion of our custom deep compositing workflow and a new node within Nuke called ‘Imagine Render’ that would let us match the fish-eye lens that we were rendering with in Manuka.”

Creating the VFX for a ride like Avatar presented singular challenges. Rohde comments, “For one thing, most VFX are designed to mimic the optics and behavior of a camera, with motion blur, lens flares, etc. We were trying to mimic the appearance of a real world to a human eye – an eye that could look into the shadows behind a perfectly exposed sunlit leaf, for example. Another example [is that] the ride takes us through many environments in a short time. So, just like with a real physical ride, those environments had to be miniaturized by comparison to the film.”

TOP AND BOTTOM: Scenes from “Avatar: Flight of Passage.”




TOP LEFT AND RIGHT: “Avatar: Flight of Passage” begins in the queue as guests get a peek inside a high-tech research lab to view an avatar still in its growth state inside an amnio tank. The room features charts and screens that show how humans will “connect” with a fully developed avatar for their upcoming flight on a banshee. The Na’vi avatar floating in the tank is digital animation. (Photo: Kent Phillips)


Jupiter adds, “VFX design in this format is unique in that it is truly an experience design. Because of the immersive scale, every element has to be composed to support and control the audience’s focus. They are able to look everywhere, but we want them to look where we want them to – not only for storytelling purposes, but to visually support the illusion of flying on the back of a banshee. Every element, whether it be the ride animation, the breathing effects, the wind, the water or the film’s character animation, needs to be carefully composed to support what the rider believes they are experiencing. If one element is out of sync, or even out of balance, the illusion is broken and the rider would feel that something is off.”

3D glasses were an essential part of creating the Avatar reality. “We needed to develop glasses that functioned with a 160-degree field-of-view screen. Typically, theatrical or cinema glasses only support a 90-degree field of view. We also wanted as little visual intrusion from the glasses [as possible], as we wanted folks to have as natural a stereo experience as possible. We opted for as translucent a frame as we could get, along with as much clear filter as we could afford. Essentially, we wanted the glasses to disappear,” comments Jupiter.

What was seen through the glasses had to sync with seat movement: the movement of the attraction vehicle and the apparent movement on the screen have to be perfectly coordinated so there is no disconnect, according to Rohde. Jupiter recalls, “We built a full-scale mock-up of the center section of our ride and screen to understand the characteristics of the coordination between the ride and the film. Both the template artists and the ride animators often worked together in real time with the same software, sharing the same files, thus making the coordination seamless.” Programming of the motion base was handled on site by WDI. The initial motion was tuned to the template from LEI, which could be updated on an almost daily basis.
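Neither Rohde nor Jupiter describes the motion data format, but a template-driven workflow like this implies keyframed motion curves sampled against the same clock that drives the 60fps imagery. A minimal sketch of that idea, with an entirely hypothetical keyframe layout of (time, pitch, roll, heave):

    import bisect

    # Hypothetical keyframes exported from the ride template:
    # (time_s, pitch_deg, roll_deg, heave_m)
    keys = [(0.0, 0.0, 0.0, 0.00),
            (0.5, 2.0, -1.0, 0.03),
            (1.0, 4.5, 0.5, 0.05)]

    def sample(keys, t):
        # Linearly interpolate the motion pose at media time t (seconds).
        times = [k[0] for k in keys]
        i = bisect.bisect_right(times, t)
        if i == 0:
            return keys[0][1:]
        if i == len(keys):
            return keys[-1][1:]
        (t0, *a), (t1, *b) = keys[i - 1], keys[i]
        w = (t - t0) / (t1 - t0)
        return tuple(x + w * (y - x) for x, y in zip(a, b))

    # Drive the base once per projected frame so the motion and the
    # 60fps imagery stay locked to the same clock.
    fps = 60
    for frame in range(270 * fps):       # the 4.5-minute ride
        pose = sample(keys, frame / fps)

Sampling the curves at the media frame rate, rather than on an independent timer, is what keeps picture and motion from drifting apart – and it means a freshly updated template can be swapped in daily without reprogramming the base.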

“It becomes easy to forget that it is not real,” says Rohde. “We do not attempt any gag-like or gimmicky 3D effects that would call attention to the artifice of the projected world. The additional physical effects must be used judiciously or they become too ‘present’ and remind the rider that the experience is artificial. One of the most effective, and for some reason emotional, is the breathing of the banshee during the stop in the cave. Lots of people really respond to this with empathy for their poor banshee.”

Weta contributed to the immersion factor with its stereo rendering. Comments Shadbolt, “Taking the initial stereo settings defined by LEI using a traditional stereo camera pair, we would render in Manuka into ‘omni-directional’ stereo using a special ray-traced lens. The advantage of this was good stereo depth all the way across the dome without fall-off at the edges, which increased the depth and realism.”
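Weta’s Manuka lens is proprietary, but omni-directional stereo itself is a published parameterization: rather than rendering from two fixed camera positions, each eye’s ray origins travel around a circle of interpupillary radius, offset tangentially to the viewing direction, so the stereo baseline holds in every direction instead of collapsing at the edges of the dome. A minimal sketch of the standard ODS ray mapping for a lat-long image (illustrative, not Weta’s implementation):

    import numpy as np

    def ods_ray(u, v, eye, ipd=0.064):
        # Omni-directional stereo ray for a lat-long pixel.
        # u, v in [0, 1): u maps to longitude, v to latitude.
        # eye is -1 (left) or +1 (right). Ray origins sit on a circle
        # of radius ipd/2, tangent to the view direction, so the stereo
        # baseline stays constant in every viewing direction.
        theta = 2.0 * np.pi * u - np.pi          # longitude
        phi = np.pi * v - np.pi / 2.0            # latitude
        origin = eye * (ipd / 2.0) * np.array(
            [np.cos(theta), 0.0, np.sin(theta)])
        direction = np.array([np.sin(theta) * np.cos(phi),
                              np.sin(phi),
                              -np.cos(theta) * np.cos(phi)])
        return origin, direction

    # Each pixel of the dome image gets its own stereo-correct ray:
    origin, direction = ods_ray(u=0.25, v=0.5, eye=+1)

Because every column of the image carries its own baseline, depth reads correctly however a rider turns their head across the 160-degree screen, which is the “no fall-off at the edges” effect Shadbolt describes.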

The main goal was maximum immersion. “This required a combination of physical and optical effects. It is a gestalt experience,” says Rohde. “The fact that your body is embraced by the ride mechanism adds a huge kinesthetic/proprioceptor boost. In addition, we withheld any sense of a musical score until the middle of the second act, sneaking it in slowly so that the sense of realism was not compromised.”

Rohde believes that “Avatar: Flight of Passage” is a success because, “number one, the ride is sincere and precise about the emotions we want to trigger, and we do not shy away from some complex and lofty emotions. For example, the attack of the Leonopteryx could just be scary, but we made it also majestic and beautiful, so you are both fascinated and surprised. I believe this attraction benefits from extraordinary coordination of parts at a very fine level of granularity, always with an eye to the holistic ‘feeling’ that would be created by these ensembles.”

TOP: A scene from “Avatar: Flight of Passage.”



TV

THE SUPERNATURAL VFX OF STRANGER THINGS 2
By IAN FAILES

All images © 2017 Netflix
TOP: The Duffer Brothers review a scene on set of Stranger Things 2. (Photo: Jackson Davis)
BOTTOM: Filming takes place in the underground tunnels set. (Photo: Jackson Davis)

A television series about supernatural events in a small American town, featuring otherworldly creatures and depictions of an alternate dimension, might seem like a show that requires a heavy dose of visual effects. But Stranger Things, the hit Netflix series created by the Duffer Brothers – while certainly having its fair share of effects shots – is a show in which visual effects are very much used in service of the story. For the show’s second season, Stranger Things 2, the role of shepherding scenes featuring Demogorgons, encounters with the underworld and a raft of other supernatural moments fell to the husband-and-wife team of Visual Effects Supervisor Paul Graff and Visual Effects Producer Christina Graff. They share their ‘stranger’ experiences with VFX Voice.

‘WE GOT REALLY LUCKY’

Prior to working on Stranger Things 2, Paul and Christina Graff had supervised or contributed to numerous films and television shows, mainly via their studio, Crazy Horse Effects. Although the new assignment was initially daunting – partly from the added pressure of the already enormous Stranger Things fan base built by the first season – the Graffs immediately gravitated to the sensibilities of the Duffer brothers. “We got really lucky,” says Paul. “We got sucked into the most creative powerhouse without even knowing what happened to us. We’re extremely fond of the Duffers, and I think they’re some of the most talented young directors that I can remember.”

A large-scale design effort for Stranger Things 2 was led by Production Designer Chris Trujillo and his art department, with the Graffs and their visual effects team designing and creating key elements of environments and other ideas. “It was a fantastic collaborative experience,” notes Christina. “I think they gave us a lot of leeway to furnish some fun ideas and contribute both to the look and the story. It was a bit of a synergy between our two departments, the art department and visual effects. We held hands throughout the entire season.”

Things moved quickly on Stranger Things 2’s nine episodes. The Graffs were based in Los Angeles for pre-production, then moved to Atlanta for shooting, with post-production back in LA. Ultimately, though, pre-production, production and post always overlapped in some way, as Christina explains: “We were in pre-production, production, and post-production simultaneously while shooting chapters three through nine. So for us, post-production started two months after the show began shooting. We had four months of post after we wrapped production.”

DESIGNING DEMOGORGONS

Among the CG creature assignments for Stranger Things 2 was the strange amphibian-like lizard that Dustin (Gaten Matarazzo) finds in a trash can and calls d’Artagnan, or Dart. It eventually transpires that Dart is a young Demogorgon from the Upside Down – an alternate dimension – and grows up quickly as the events of the show progress. Aaron Sims Creative, which had worked on season one, first designed a number of the stages of Dart, with Hydraulx then taking on the CG character. “Hydraulx rigged and textured Dart, put in a muscle structure and the internal organs,” says Paul. “The first two stages are translucent, so depending on the light angle, you can see internal organs shine through. People were ecstatic about it. They really loved Dart.”

Later in the series, Dart – now the size of a large dog – joins a group of fellow Demogorgons. One unfortunately kills Bob (Sean Astin), which required the actor to mime interaction with what would later be inserted as a digital creature. Having previously worked on The Lord of the Rings films, Astin shared with Paul Graff his view that having someone to interact with is the most important thing in pulling off a convincing shot. “Kate Trefry, one of the writers, was there when we had this conversation,” recalls Paul. “And she was like, ‘Oh, I want to do it!’ So she volunteered and then stood in as the Demogorgon when Sean gets eaten. She had bloody hands and we got the craziest shots of her in slow motion as they struggled. Later in post we had to erase her and put a Demogorgon in her place.”

MEETING THE SHADOW MONSTER

A recurring visual effect throughout the series is Will Byers’ (Noah Schnapp) connection with what is known as the Shadow Monster or Shadow Man, the result of his time in the Upside Down in season one. Paul and Christina worked with Production Designer Chris Trujillo to give the Shadow Monster characteristics befitting the Duffers’ legacy.

TOP: A previs still from Will Byers’ (Noah Schnapp) encounter with the Shadow Monster, also known as the Shadow Man, and the final visual effects shot as he jumps between dimensions.
BOTTOM: The original plate and final shot featuring a CG Dart, which was built by Hydraulx.



Matte painter and Concept Designer Steven Messing created the Shadow Monster as a dark, inky being with multiple appendages, referencing storm clouds and tornadoes. In one scene, Will envisions the Shadow Monster while switching between dimensions, all while the camera circles around him. “When we shot that, I knew I wanted to have a constant lighting situation for Will and that we’d need to have a greenscreen behind him while the camera was moving,” says Paul. “So we built a daisy-chain train of three or four dollies on a circular track that went around Noah, carrying the camera, a wind machine and a greenscreen.”

“It took 10 minutes to shoot, but it was really one of the most difficult visual effects shots of the season,” adds Paul, noting that Hydraulx completed the work. “The Shadow Monster is huge and he had these tornado-like qualities, which come right into the camera – we’re basically inside the vortex. So there were a lot of simulations and streams of particles, including the little fingers that go into Will’s eyes and ears and nose, which all started as a concept painting by Mike Maher.”
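Pulling Will off that traveling greenscreen is classic keying work. The article doesn’t detail the composite, but the heart of any green-difference key fits in a few lines – a simplified illustration only; production keyers layer on edge, spill and light-wrap controls:

    import numpy as np

    def green_difference_matte(rgb, gain=1.0):
        # Alpha is high where green dominates red/blue (the screen),
        # low over the foreground subject.
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        screen = np.clip((g - np.maximum(r, b)) * gain, 0.0, 1.0)
        return 1.0 - screen  # foreground alpha

    def despill(rgb):
        # Clamp green to the average of red and blue to kill green spill.
        out = rgb.copy()
        limit = (out[..., 0] + out[..., 2]) / 2.0
        out[..., 1] = np.minimum(out[..., 1], limit)
        return out

    # plate: float RGB image in [0, 1], shape (H, W, 3)
    plate = np.random.rand(8, 8, 3).astype(np.float32)
    alpha = green_difference_matte(plate, gain=2.0)
    fg = despill(plate) * alpha[..., None]   # premultiplied foreground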

TOP ROW: Bob (Sean Astin) meets an unfortunate death at the hands of a Demogorgon. For realistic interaction, Astin performed the scene against a stand-in.

BOTTOM ROW: Eleven (Millie Bobby Brown) and Jim Hopper (David Harbour) confront the rift, a sequence crafted by Atomic Fiction.

THE DUFFERS REALLY, REALLY CARE ABOUT SPORES

During the series, Hawkins Police Chief Jim Hopper (David Harbour) discovers the existence of a series of underground tunnels being ‘infected’ by the Upside Down. These were practical sets built in modular sections, designed to feel fleshy and have the quality of internal organs. Visual effects were then used to enhance tunnel scenes by adding moving vines, tunnel extensions and spores – lots of spores.

“The Duffers really liked the spores in season one, and they were particular about the spores,” states Paul. “We had different vendors, and in order to establish trust we told the vendor, ‘Look, you gotta get the spores right.’ We had a lot of reviews about spores, and lighting spores, and interactions with light, and depth of field, and all kinds of issues that had to be closely matched into the shots. I think we got it to a level that I’m particularly proud of.”
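Matching depth of field means CG spores must defocus exactly as the plate lens would have defocused them. Under a thin-lens model the blur-circle size follows directly from focus distance, focal length and aperture; the sketch below uses example numbers that are assumptions, not the show’s actual lens data:

    def circle_of_confusion(focus_m, subject_m, focal_mm, f_stop):
        # Thin-lens circle of confusion (mm) for a defocused subject,
        # used when matching CG particle blur to plate depth of field.
        f = focal_mm / 1000.0               # focal length in meters
        aperture = f / f_stop               # entrance pupil diameter
        coc_m = aperture * f * abs(subject_m - focus_m) / (
            subject_m * (focus_m - f))
        return coc_m * 1000.0               # back to millimeters

    # e.g. a 35mm lens at T2.8, focused at 2m, spore drifting at 1.2m:
    print(f"{circle_of_confusion(2.0, 1.2, 35.0, 2.8):.3f} mm blur circle")

Run the same numbers for each spore’s distance and the CG defocus falls in line with the photographed plate, which is the kind of matching the reviews Paul describes were checking.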


NEW KINDS OF POWERS

In search of her mother, Eleven (Millie Bobby Brown) leaves Hawkins and finds Kali (Linnea Berthelsen), also known as Eight. She’s a fellow test subject who is able to conjure illusions in the minds of others. Some of these are destructive in nature, such as a fake wall that suddenly appears to hide their escape from pursuing police, while others are beautiful, like the bioluminescent butterfly Kali summons.

“For the butterfly,” says Paul, “on set we had a metal screw for the eyelines that was dangling down on a fishing line. And then we asked the Duffers, ‘Do you want it to be bioluminescent?’ and they said, ‘No.’ Christina asked Atomic Fiction to try a bioluminescent version to tease the Duffers with. So when Atomic Fiction turned one around, the Duffers changed their minds. The other thing to note in that scene is that it’s just Kali and Eleven sitting in a corner of the room in front of a bluescreen, and all of the background is a whole Chicago background done as a Steven Messing animated matte painting.”

INTO THE RIFT

In the final episode of season two, Hopper and Eleven descend in a cage elevator below Hawkins to confront ‘the rift’, the link to the Upside Down. The rift cavern needed to be around 200 feet tall with a 70-foot diameter, so it was filmed on a soundstage against bluescreen. Here, the two characters battle an incarnation of the Shadow Monster and several Demogorgons before Eleven uses her powers to close the rift.

“Those shots began as some ideas from the Duffers, our department, the art department and Mike Maher’s concepts,” outlines Christina. “In-house we built out a rough model, worked out the scale, and Michael created the storyboards – which became our shooting bible. We then turned everything over to Atomic Fiction to begin developing the CG environment and all the volumetrics and atmospherics. There was very little time for completion, and they knocked it out of the park.”

The rift cavern sequence was, Paul reveals, an incredibly challenging one to pull off – not simply because of the large amount of visual effects (which, of course, included more spores), but also owing to its place in the grueling schedule. “We had only one and a half shooting days for these 115-odd shots inside this non-existent set, with just a shark cage, three panels of bluescreen and a bunch of lights,” says Paul. “But it went really well in the end.”

TOP, MIDDLE AND BOTTOM: Original concept design for the underground tunnel system by Mike Maher, the on-set plate and the final shot, which includes the addition of heavily art-directed CG spores.

SUMMER 2018 VFXVOICE.COM • 17


TV

sets built in modular sections, designed to feel fleshy and have the quality of internal organs. Visual effects were then used to enhance tunnel scenes by adding moving vines, tunnel extensions and spores – lots of spores. “The Duffers really liked the spores in season one, and they were particular about the spores,” states Paul. “We had different vendors and in order to establish a trust we told the vendor, ‘Look, you gotta get the spores right.’ We had a lot of reviews about spores, and lighting spores, and interactions with light, and depth of field, and all kinds of issues that had to be closely matched into the shots. I think we got it to a level that I’m particularly proud of.” NEW KINDS OF POWERS

“Hydraulx rigged and textured Dart, put in a muscle structure and the internal organs. The first two stages are translucent, so depending on the light angle, you can see internal organs shine through. People were ecstatic about it. They really loved Dart.” —Paul Graff, Visual Effects Supervisor

Chris Trujillo to give the Shadow Monster the characteristics which would be befitting for the Duffers’ legacy. Matte painter and Concept Designer Steven Messing created the Shadow Monster as a dark, inky being with multiple appendages, referencing storm clouds and tornadoes. In one scene, Will envisions the Shadow Monster while switching between dimensions, all while the camera circles around him. “When we shot that, I knew I wanted to have a constant lighting situation for Will and that we’d need to have a greenscreen behind him while the camera was moving,” says Paul. “So we built a daisy chain train of three or four dollies on a circular track that went around Noah that had the camera, a wind machine and a greenscreen on them. “It took 10 minutes to shoot but it was really one of the most difficult visual effects shots of the season,” adds Paul, noting that Hydraulx completed the work. “The Shadow Monster is huge and he had these tornado-like qualities, which come right into the camera – we’re basically inside the vortex. So there were a lot of simulations and streams of particles, including the little fingers that go into Will’s eyes and ears and nose, which all started as a concept painting by Mike Maher.”

TOP ROW: Bob (Sean Astin) meets an unfortunate death at the hands of a Demogorgon. For realistic interaction, Astin performed the scene against a stand-in.

THE DUFFERS REALLY, REALLY CARE ABOUT SPORES

BOTTOM ROW: Eleven (Millie Bobby Brown) and Jim Hopper (David Harbour) confront the rift, a sequence crafted by Atomic Fiction.

During the series, Hawkins Police Chief Jim Hopper (David Harbour) discovers the existence of a series of underground tunnels being ‘infected’ by the Upside Down. These were practical

16 • VFXVOICE.COM SUMMER 2018

In search of her mother, Eleven (Millie Bobby Brown) leaves Hawkins and finds Kali (Linnea Berthelsen), also known as Eight. She’s a fellow test subject who is able to conjure illusions in the minds of others. Some of these are destructive in nature such as a fake wall that suddenly appears to hide their escape from pursuing police, while others are beautiful, like the bioluminescent butterfly Kali summons. “For the butterfly,” says Paul, “on set we had a metal screw for the eyelines that was dangling down on a fishing line. And then we asked the Duffers, ‘Do you want it to be bioluminescent?’ and they said, ‘No.’ Christina asked Atomic Fiction to try a bioluminescent version to tease the Duffers with. So when Atomic Fiction turned one around, the Duffers changed their minds. The other thing to note in that scene is that it’s just Kali and Eleven sitting in a corner of the room in front of a bluescreen, and all of the background is a whole Chicago background done as a Steven Messing animated matte painting.” INTO THE RIFT

In the final episode of season two, Hopper and Eleven descend in a cage elevator below Hawkins to confront ‘the rift’, the link to the Upside Down. The rift cavern needed to be around 200 feet tall and 70 feet in diameter, so it was filmed on a soundstage against bluescreen. Here, the two characters battle an incarnation of the Shadow Monster and several Demogorgons before Eleven uses her powers to close the rift. “Those shots began as some ideas from the Duffers, our department, the art department and Mike Maher’s concepts,” outlines Christina. “In-house we built out a rough model, worked out the scale, and Michael created the storyboards – which became our shooting bible. We then turned everything over to Atomic Fiction to begin developing the CG environment and all the volumetrics and atmospherics. There was very little time for completion and they knocked it out of the park.”

The rift cavern sequence was, Paul reveals, an incredibly challenging one to pull off – not simply because of the large amount of visual effects (which, of course, included more spores), but also owing to its place in the grueling schedule. “We had only one and a half shooting days for these 115-odd shots inside this non-existent set with just a shark cage, three panels of bluescreen and a bunch of lights,” says Paul. “But it went really well in the end.”

TOP, MIDDLE AND BOTTOM: Original concept design for the underground tunnel system by Mike Maher, the on-set plate and the final shot, which includes the addition of heavily art-directed CG spores.




“We were in pre-production, production, and post production simultaneously while shooting chapters three through nine. So for us, post production started two months after the show began shooting. We had four months of post after we wrapped production.” —Christina Graff, Visual Effects Producer

TOP ROW: Eleven (Millie Bobby Brown) fights off the Shadow Monster at the rift. The scene was filmed on a bluescreen stage, with the environment and effects handled by Atomic Fiction.

THE STRANGER THINGS JOURNEY

For Christina and Paul Graff, being part of Stranger Things 2 proved to be a monumental experience. The pair had not managed a show of that scale before – somewhere in the realm of 2,000 VFX shots – but they found it to be one of the most collaborative experiences they’ve had. “It’s been quite a crazy creative journey,” says Christina. “Things were definitely always in flux creatively as scripts were being turned over to us two weeks prior to shooting. Many points we discussed in our first Duffer meeting never actually became part of the story, and other things had to be invented on the fly with the new scripts. We were always thinking on our feet, which is normal, to come up with quick solutions on how to shoot something, be flexible, and handle a mother lode of visual effects shots. It went smoothly since we all worked as a team and knew we could count on each other.”

“The spirit of the show was very collaborative,” adds Paul. “It comes from the two brothers, who are in perfect sync with each other, and it just bleeds from there through the whole team, where everybody was trying to help each other out. It was a cool journey and very satisfying to be able to make such a large creative contribution.”


BOTTOM ROW: The rift sequence also involved some slight alterations made to Eleven’s face and eyes. For that, tracking markers were placed on Millie Bobby Brown, and Atomic Fiction then augmented the actor’s features.




TV

LOST IN SPACE 2018 UNDERGOES A RADICAL TRANSFORMATION By CHRIS McGOWAN

All images courtesy of Netflix. TOP: Judy Robinson (Taylor Russell) realizes she’s not in Kansas anymore. (Image © 2018 Netflix and Legendary Television)


Until this year, the cult-classic ‘60s TV series Lost in Space was marooned in a shrinking universe, still beloved by fans but sorely needing the narrative and effects upgrades accorded to other venerable properties such as Star Trek, Superman and Batman. There was a poorly received 1998 movie with William Hurt, Mimi Rogers and Gary Oldman, but nothing that restored the luster of the franchise. Now, in 2018, Netflix has rebooted Lost in Space with fresh storytelling and high-budget VFX from top-flight effects/animation studios on four continents.

Lost in Space (1965-1968) related the adventures of a family of space colonists who veered off course and had to survive in hostile and peculiar alien environments. It was wholesome family entertainment that didn’t take itself too seriously and sometimes indulged in silliness. The Robinsons encountered cosmic storms, aliens, telepaths, Vikings, space cowboys, pirates, wizards and even giant talking vegetables. Many episodes focused on young Will Robinson (Bill Mumy), bumbling saboteur Dr. Smith (Jonathan Harris) and the sardonic Robot (voiced by Dick Tufeld).

Lost in Space could be sleek or low-budget: it had a John Williams theme, an expensive Jupiter 2 spaceship interior, an iconic robot, and impressive accessories – laser guns, jet packs, chariots and space pods – yet producer Irwin Allen also cut costs with recycled props and monsters, some coming from his show Voyage to the Bottom of the Sea. Although sci-fi purists looked down on the show, the unique narrative mix was an initial hit, with better first-year ratings than Star Trek and significant merchandise sales. Lost in Space retained a loyal fan base in syndication in the following decades. Netflix’s Lost in Space is still a family show, but the gravitas has been significantly enhanced, and the visual effects are formidable.

Legendary Television filmed season one between January and July 2017 in Vancouver. As in the original, the Robinsons are lost colonists who must survive on an unknown Earth-like planet and discover that they are not alone in their new home. Terron Pratt, Visual Effects Producer for Lost in Space, has won two VES Awards for his work on Black Sails, while Visual Effects Supervisor Jabbar Raisani is a Game of Thrones Visual Effects Emmy winner and director of the sci-fi film Alien Outpost (2015). Their presence in the production signifies that Lost in Space is taking its VFX quite seriously. “I’d say we are radically different in the look and tone of the show,” comments Pratt.

TOP: The Robot, upgraded and more menacing than in the original series, confronts young Will Robinson (Maxwell Jenkins). (Image © 2018 Netflix and Legendary Television) BOTTOM LEFT: A BTS shot of John Robinson and Don West for a scene in which they float in space next to airlock wreckage, trying to get back to the Jupiter 2 (filmed in Vancouver, Canada for episode 10). (Image © 2018 Netflix and Legendary Television) BOTTOM RIGHT: This robot has a mysterious back story and a promising television future. (Image © 2018 Netflix and Legendary Television)




TOP: Toby Stephens (who plays John Robinson) previously used grappling hooks to raid merchant ships in Black Sails and here uses one to save the Jupiter 2. (Image © 2018 Netflix and Legendary Television) BOTTOM: On-location shooting. (Image © 2018 Netflix and Legendary Television) OPPOSITE TOP: The Robot and Dr. Smith (Parker Posey) are puzzles for the Robinson family to figure out. (Image © 2018 Netflix and Legendary Television) OPPOSITE BOTTOM: Maureen Robinson (Molly Parker) and John Robinson (Toby Stephens) struggle to bring their craft under control. (Image © 2018 Netflix and Legendary Television)


“It has extremely high production values, epic scope and scale, with massive landscapes and very elaborate action CG sequences, so I’d say in that regard it’s very different from the original.” While the first Lost in Space resided squarely in practical effects, the new edition is heavily digital. “It’s all across the board. We’re doing a wide gamut of effects,” says Pratt. “There are completely CG shots, where every single aspect is created in the computer, all the way to some very simple composites. In terms of the most complex work, there are full CG ships, creatures, people and environments. It’s extensive.”

“A big part of the challenge for a show like this is the scope of production and knowing when to go digital and when to go practical,” says Raisani. In terms of the Jupiter 2 spaceship, “the ship interior was fully constructed, while the most that was made on the exterior was a ramp.” He continues, “Our ship is 115 feet wide, and it is hard to understand that scope and that scale until you go out on location and figure out what that would take. Even to build a part of that ship is a massive undertaking. Those are long conversations we had over and over about what made the most sense and where that part of the budget should go, and ultimately we decided that that money was best spent in visual effects. Since we were going to have CG flying ships anyway, we decided we should take over all the ships in the show.”

The new Lost in Space robot is menacing and radically different from the original, with a more humanoid-shaped body and a mysterious back story; it was created with full practical shots, combination practical-and-digital shots, and full CG shots. (The iconic 1965 model was designed by the late Robert Kinoshita, who had previously created Tobor for Tobor the Great and Robby the Robot for Forbidden Planet.) One holdover from the original model is the use of lights in the robot’s face to suggest emotion.

Of the 10 episodes, eight have extremely intensive visual effects sequences, according to Pratt and Raisani, and the other two aren’t what one normally thinks of as “bottle episodes.” One involved “a massive sequence with a ship escaping calamity. It’s small in terms of shot count, but in terms of complexity it’s still rather high for that one episode. Overall, shot-count wise, the show is huge. We are heading toward 2,700 shots in the 10 episodes,” notes Pratt.

The massive VFX effort required a huge team working together, and Lost in Space is a thoroughly global effort, with post production based in Santa Monica. “Everyone knew the property and everybody knew it was being made by Netflix, and everyone wanted to get on it. Vendors were fighting to get on the show,” says Pratt. “We’re above 20 vendors. We’re mostly at that size of production because of our timeline. Ideally, we would limit to a small number of vendors, but in order to get the shot count pushed through in under a year, we branched out to everywhere. We have vendors in Sweden, Ireland, Germany, South Africa, Canada, India, the U.S. East and West Coasts, and Texas.

“We’re kind of everywhere,” Pratt adds. “Our core team is Important Looking Pirates [ILP] in Stockholm, Image Engine in Vancouver, Cinesite in Montreal, Rhythm and Hues in L.A., and El Ranchito in Spain [Madrid], and then we work out of L.A. [Santa Monica].

The post-production office is here and we’re the hub and epicenter of all the effects, everything that’s happening.”

Each vendor faced its own exotic obstacles. “The most challenging sequence Cinesite worked on has to be the storm sequence, which sees the Robinsons battling the diamond rain in the woods before trying to outrun its path of destruction in the open,” says Cinesite VFX Supervisor Aymeric Perceval. “It was a dramatic scene with hundreds and hundreds of sharp diamonds raining down on the Robinsons from above.” Beyond the storms, Cinesite also handled an alien ship among its more than 170 shots. Meanwhile, ILP’s challenges included the Jupiter 2, a spacewalk, a crash site and an alien glacier environment. “It was a super exciting show for us to work on,” comments ILP Executive Producer Måns Björklund.

“We cast the vendors based on their skill set, and also the amount of time necessary to free them up for the next large body of work later down the road,” says Raisani. “For example, ILP’s main episodes are one, three and 10, but they’ve got a little bit of work scattered throughout.”

“We are radically different in the look and tone of the show. It has extremely high production values, epic scope and scale, with massive landscapes and very elaborate action CG sequences, so I’d say in that regard it’s very different from the original.” —Terron Pratt, Visual Effects Producer




“Our [Jupiter 2] spaceship is 115 feet wide, and it is hard to understand that scope and that scale until you go out on location and figure out what that would take. Even to build a part of that ship is a massive undertaking. … Ultimately we decided that that money was best spent in visual effects. Since we were going to have CG flying ships anyway, we decided we should take over all the ships in the show.” —Jabbar Raisani, Visual Effects Supervisor

TOP: Will Robinson (Maxwell Jenkins) and his robot pal explore a breathtaking world. (Image © 2018 Netflix and Legendary Television) BOTTOM: The icy terrain scenes required extensive greenscreen shots. (Image © 2018 Netflix and Legendary Television)


“As we’re breaking down the scripts and as we start to shoot and award the material, we look at where we want to preserve some of the larger vendors, to make sure we can allocate work appropriately.”

For European or Asian vendors, time-zone differences can add an extra challenge. Pratt says, “We’re just kind of rolling constantly. When we’re getting up in the morning, the Swedish team is wrapping their day, so there’s a little bit of overlap. We’ve done calls at 8 a.m. with Sweden and then 9 p.m. with India on the same day.”

“Working with different time zones actually works quite well,” says ILP Visual Effects Supervisor Niklas Jacobson, based in Stockholm. “We submit all our work and do our cineSync sessions and Skype calls at the end of our day, [which is] the beginning of the day in L.A. Our client has all the material they need in their morning, and they can review it and pass written notes back to us at the end of their day. Then we’ve got all the latest feedback when we jump-start our days in the office.”

Just as many vendors sought to work on Lost in Space, it was easy to recruit Raisani for the new series. “I’m a huge sci-fi fan and also a fantasy fan. Before Thrones I’d read all the books before I ever knew the show was going to be a thing. [I was intrigued] as soon as I heard they were going to do a version of Lost in Space. I was familiar with the show. I told most people that this show was a kind of television version of all the t-shirts I owned. So it was a good fit for me.

“When we had our initial meetings, they told me the type of show they wanted to make, and I said, ‘If you guys are really serious about doing that level of quality and production on a science fiction show, then I’m in.’ To get an opportunity to do something of this scope and scale where there are robots and spaceships – I’m in.”

Pratt adds, “I heard about Lost in Space when I was mid-season in season four [of Black Sails]. Right around the end of the year [2016], I got a call that there was a possibility I could come on and do this project. As I was wrapping up Black Sails, I started conversations with Jabbar and Legendary [Television]. And the timing worked out for me. By the time we finish season one, we’ll have been on the show 17 months. It’s been a long journey for the two of us because we came on early.”

More than 50 years after the debut of the original Lost in Space, the new version has taken full advantage of state-of-the-art VFX. “Our executive producers and writers have leaned into that as well,” says Pratt. “We’ve been able to select the best teams we can find because of the attention the show has gotten. People are excited to be involved in the project. That’s also helped us out, allowed us to elevate it even further.”

While Lost in Space has undergone a radical transformation, Pratt notes, “There are definitely nods of the cap to the original here and there. You don’t want to alienate those who loved and enjoyed the old one, but you definitely progress for a modern audience to make sure you’re telling a story that is using all the latest, greatest technology and putting something out there that is above and beyond what’s currently on the market. That’s what we’re going for: high-end production values, and to make sure to not leave behind those people who really enjoyed the original.”
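Jacobson’s follow-the-sun hand-off is easy to sanity-check with a couple of clock conversions. A minimal sketch using Python’s standard zoneinfo module – the dates and call times here are hypothetical, chosen only to mirror the quotes above:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library as of Python 3.9

# Hypothetical daily hand-off: ILP submits work at the end of its Stockholm day,
# which lands at the start of the production hub's day in Los Angeles.
submit = datetime(2018, 3, 1, 17, 0, tzinfo=ZoneInfo("Europe/Stockholm"))
print(submit.astimezone(ZoneInfo("America/Los_Angeles")).strftime("%H:%M"))  # 08:00

# The same L.A. day can still end with a morning call to India.
evening_call = datetime(2018, 3, 1, 21, 0, tzinfo=ZoneInfo("America/Los_Angeles"))
print(evening_call.astimezone(ZoneInfo("Asia/Kolkata")).strftime("%H:%M"))  # 10:30, next day
```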




PROFILE

DOUGLAS TRUMBULL, VES: ADVANCING NEW TECHNOLOGIES FOR THE FUTURE OF FILM By TREVOR HOGG

Images courtesy of Douglas Trumbull. TOP: Douglas Trumbull, VES


As a child, Douglas Trumbull, VES constructed mechanical and electrical devices such as crystal-set radios and loved watching alien invasion movies. Unknown to him at the time, he would follow in the footsteps of his father, Donald Trumbull, who was an early pioneer of motion picture special effects. “By the time I was born he was in the aerospace industry and never mentioned much about The Wizard of Oz except that he had something to do with the lion’s tail, the apple tree and rigging the flying monkeys,” Trumbull says.

Initially, his career ambition was to become an architect; however, Trumbull’s portfolio, filled with illustrations of spaceships and alien planets, caught the attention of Graphic Films, which made technical films for NASA and the U.S. Air Force. During Trumbull’s early tenure at Graphic Films, three different projects were being made for the 1964 New York World’s Fair, the most interesting being To the Moon and Beyond, “a 15-minute journey from the microcosm to the macrocosm,” describes Trumbull. Among the audience members at the World’s Fair were Arthur C. Clarke and Stanley Kubrick, who were collaborating on 2001: A Space Odyssey. “They asked if Graphic would be willing to do some preliminary design development. I was working on lunar bases, pods and spacecraft designs.” The production shifted to England, so Trumbull cold-called the filmmaker, resulting in him and his wife moving to London.

Adds Trumbull, “The big problem with the HAL readouts was that they required so many 16mm rear-projected movies shot in 35mm on an animation stand. You needed 10 times as much readout on 16 screens simultaneously as the length of the shot.

“Bruce Logan [ASC], Con Pederson and I built our own animation stand with a zoom lens. The motor running the camera drove the motion of the artwork with a little shaft that came down off of it. I would say, ‘Take this cell, put it on for 10 frames, put a blue gel on there and flicker it.’”

The inception of what became the Slitscan Machine used for the Stargate scene came from a colleague on To the Moon and Beyond. “John Whitney was the pioneer of leaving the camera shutter open while you move things around under controlled situations so you can create a controlled blur, and repeat the moves. Jim Dickson and I adapted the camera with a shutter outside of it and a bellows gizmo that I built. Wally Veevers assisted a lot with the mechanical engineering of what we called the Slitscan Machine.”

An electron microscope needed to be simulated for The Andromeda Strain. “I devised this whole methodology for linking up a 35mm Mitchell camera to a real microscope,” recalls Trumbull. “I found a Zeiss stereo microscope with a zoom lens and came up with the idea of the microorganism being a tetrahedron-shaped molecule illuminated by a strobe light and shot with filters so that it glows.

It was going to be a small two-and-a-half-inch hexagon of plexiglass mounted on a metal rod connected to a motor. The motor was going to be in this yoke and be upside down, inside out, backwards.”

“Jamie Shourt worked out this program that put this thing in one position, fired the strobe, closed the camera shutter, waited, put the thing in another position, opened the camera shutter, fired the strobe, and added all six sides of the tetrahedron onto one frame of film with separate exposures. The idea was these things would start folding on their edges and multiply the number of flash exposures on each frame. He wrote this program that would go on into infinity. We just had to stop after too many hours and no sleep.”

For his directorial debut, Silent Running, the three-time Oscar-nominated visual effects supervisor experimented with computerized motion-control photography. “My father-in-law told me about these things called computer-controlled stepper motors where the shaft is broken into 200 segments. If you put a square wave pulse in there it will move one segment. If you can make a series of pulses, then the motor would run under automated control. I bought a stepper motor and a driver board. I figured out a way to record square wave pulses on my stereo tape recorder, play them into the motor, and repeat exactly the playout.”

The front projection system was scaled down in size by using a slide projection lamp, a 35mm Arriflex camera, a beam splitter mirror on a whirl head, and 4 x 5 plates. “As long as you planned ahead and had projection plates matching the scenes that were going to be shot, then you could shoot all day long with it. Sometimes we would shoot 15 process setups in one day. Everything shot in the domes, out of the windows and behind the spaceship was front projection. There were no post-production optics. It was all shot in camera.”

Close Encounters of the Third Kind made great use of a cloud tank. “Steven Spielberg wanted the UFOs to come out of the clouds,” recalls Trumbull. “We bought a big aquarium tank and rented a manipulator arm from Atomics International. I put the manipulator arm in the tank so I could be outside behind the camera painting clouds.” Music plays a pivotal role in the finale where humanity makes contact with the alien visitors. “Spielberg wanted music to be a universal language of communication, which I thought was a beautiful idea. I knew a woman who taught the Kodály method of visual hand signals that relate to musical notes; she came in and trained François Truffaut on how to do these hand signals to the notes that John Williams wrote way ahead of production, which would become the theme of the movie. Those five notes are built into the score and are obliquely related to ‘When You Wish Upon a Star.’”

Benefiting from the research and development conducted for Close Encounters of the Third Kind was Blade Runner. “We had learned about taking a photograph to a place that would acid etch that image into a thin sheet of brass so it had incredible detail,” remarks Trumbull. “You could light it from behind, put in smoke and it looked great. The acid-etched brass was used a lot in Close Encounters and in Blade Runner to build this infinite forced-perspective miniature of Los Angeles [which was based on an oil refinery].
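Shourt’s additive-exposure program amounts to a geometric loop: each ‘fold’ of the form multiplies the number of strobe flashes layered onto a single frame of film. A rough sketch of that growth – assuming each fold doubles the count, which the quote implies but does not state exactly:

```python
# Each film frame accumulates separate strobe exposures; every "fold" of the
# tetrahedron multiplies (here, assumed to double) the flashes on that frame.
BASE_EXPOSURES = 6  # one flash per face of the plexiglass form

def exposures_on_frame(folds: int) -> int:
    """Total strobe flashes composited onto a single frame after n folds."""
    return BASE_EXPOSURES * 2 ** folds

for folds in range(8):
    print(folds, exposures_on_frame(folds))  # 6, 12, 24, ... 768 and on "into infinity"
```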
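The stepper-motor arithmetic Trumbull describes above is equally simple: 200 steps per revolution means each square-wave pulse turns the shaft 1.8 degrees, and replaying a recorded pulse train repeats a camera move exactly. A minimal illustration – hypothetical move values, not Trumbull’s actual tapes:

```python
STEPS_PER_REV = 200                   # "the shaft is broken into 200 segments"
DEG_PER_STEP = 360.0 / STEPS_PER_REV  # one square-wave pulse = 1.8 degrees

def pulses_for(degrees: float) -> int:
    """Pulses needed to rotate the shaft through a given angle."""
    return round(degrees / DEG_PER_STEP)

# A "recorded" move is just a list of per-frame pulse counts; playing the same
# list back into the driver repeats the move exactly -- the essence of motion control.
move = [pulses_for(d) for d in (1.8, 3.6, 7.2, 7.2, 3.6, 1.8)]  # ease in, ease out
assert move == [1, 2, 4, 4, 2, 1]  # 14 pulses = 25.2 degrees in total
```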

TOP: A shot taken at the 1964 New York World’s Fair of the pavilion screening of To the Moon and Beyond in Cinerama. MIDDLE: Trumbull details the Moon Bus featured in 2001: A Space Odyssey. BOTTOM: Trumbull is part of a group discussion with Stanley Kubrick on the 2001: A Space Odyssey Centrifuge set.




“Spielberg wanted music to be a universal language of communication which I thought was a beautiful idea. I knew a woman who taught the Kodály method of visual hand signals that relate to musical notes; she came in and trained François Truffaut on how to do these hand signals to the notes that John Williams wrote way ahead of production which would become the theme of the movie. Those five notes are built into the score and are obliquely related to ‘When You Wish Upon a Star.’” —Douglas Trumbull, VES

TOP: Trumbull’s initial responsibility on 2001: A Space Odyssey was to provide animation for the various computer screens such as those found in the Orion spacecraft. MIDDLE: The iconic blimp from Blade Runner gets constructed. BOTTOM: The completed effect with the blimp showcasing an advertisement with a Geisha girl as it flies over the Bradbury building.


It was an expansion of the lighting, smoke and fiberoptics used on the mothership. We added projections of these explosions off the top of the towers that had a little light on them for interactivity.”

“The key to the whole thing was futurist designer Syd Mead, who conceptualized the city. I designed the Tyrell pyramid.” Advertisements are integrated into the urban landscape. “I came up with the idea of projecting images onto the blimp and sides of buildings with the 35mm projector onto these kind of LEGO textured things that look like light bulbs, which predated what I thought would become true. Even in Los Angeles at the time they were experimenting with these signs on the freeway that would alert you to the traffic up ahead.”

Paramount approached Trumbull to fix the visual effects for Star Trek: The Motion Picture. “Airplanes have lights that are built into the tail fins that shine up on the tail, and there are lights in the belly that shine on the logo. I said, ‘That’s what we have to do to make the Enterprise look great even if it’s in total darkness.’ I had them take it apart, rebuild lights into it, and then refinish the entire thing.”

Trumbull directed ‘Spock’s Spacewalk’. “I hired Bob McCall, who was one of the most talented and skilled astronomical space artists of all time; he did these illustrations of seemingly irrelevant elements of planets, stars and streaks that became our guidelines for how we would do the sequence. Then we took this little miniature of Spock in his space pack and threw him through it, similar to Keir Dullea going through the Stargate in 2001.” The most technologically complicated sequence was the wormhole. “The camera moved like it did for the Stargate sequence in 2001, and we modulated the laser beam so it changed shape and became complicated.”

“I started a research and development company owned by Paramount called Future General Corporation and pitched advanced ideas for the future of the entertainment industry,” states Trumbull. “Richard Yuricich [ASC] and I experimented with all of the film processes that were ever developed throughout the entire history of movies, from 35mm to 70mm, D150, Todd-AO, Cinerama, CinemaScope and IMAX.”

The two business partners also shot and projected at different frame rates: 24, 30, 36, 40, 48, 60, 66 and 72 fps. They discovered that viewers were more physiologically stimulated by the higher frame rates. “It led us to the Showscan process of 60 frames-per-second 70mm, which was fabulous and still looked cinematic. I was under orders by Paramount management to develop a feature film that was to be shot in the process, and that became Brainstorm [1983].”

Studio support for the project waned even after it shifted to MGM, as it was seen as unrealistic to expect thousands of theaters to convert to the Showscan projection process.

TOP LEFT: The opening Blade Runner shot of Los Angeles 2019 was inspired by oil refineries and created by using a forced-perspective miniature made out of acid-etched brass. TOP RIGHT: The Tyrell Pyramid was the first miniature built for the production of Blade Runner. BOTTOM: A camera with a fish-eye lens shot footage during the principal photography of Brainstorm to be used for the memory balls.




“It’s a tragic situation where the profits generated by these movies are kept by the studio, director and actors, but almost never the visual effects people. Visual effects companies have to re-imagine themselves as producers, because they are the ones who are making the difference between success and failure.” —Douglas Trumbull, VES

TOP: Steven Spielberg had an idea that the visiting aliens would replicate objects from everyday life to make humans feel comfortable, as reflected in the mothership. BOTTOM: Trumbull discusses the composition of a shot with Steven Spielberg while on set for Close Encounters of the Third Kind.

“I had to make the movie conventionally, with it being a combination of 35mm and mono sound with 70mm and stereo sound.” During the course of the production, lead actress Natalie Wood died under suspicious circumstances. “We had to finish Brainstorm on an insurance policy, so a lot of stuff that I had planned never came to fruition.”

Memory balls going off into infinity made an appearance in Brainstorm, inspired by a pencil drawing by M.C. Escher featuring the artist looking at himself in a sphere that reflects the surroundings. “We had built a computerized multi-plane camera system that was used for Blade Runner and other movies. There are scenes in Brainstorm that have over 700 exposures on each frame of film. I had this whole thing in mind during the principal photography, so we had another camera with a fish-eye lens shooting pieces of those shots. Interestingly, Brainstorm is now viewed as a prescient look at the potential of virtual reality.”

After deciding to leave Hollywood for the Berkshires, Trumbull created a 747-sized, 40-passenger flight-simulator ride called Tour of the Universe that played at the CN Tower in Toronto utilizing Showscan; he was then recruited by Steven Spielberg to revamp the Back to the Future theme-park ride. “I reverse engineered the story around the efficiencies of the process so it hit the sweet spot between throwing up and being thrilled. It was a huge success, running for 16 years at 115% of park capacity at Universal Parks in Osaka, Los Angeles and Orlando.”

Trumbull established a new business venture called Ridefilm. “I doubled the frame rate to 48 frames a second, made the screen hemispherical, but with VistaVision instead of IMAX film, and came up with a new motion-base design that is completely barf-proof.” The company subsequently merged with IMAX.


“We took IMAX public back in 1994, raised $350 million and brought it into the mainstream movie industry. We got AMC, Regal and other places to put IMAX theaters into multiplexes; that became what we now know as premium large-format exhibition of movies.”

Audiences are looking for a premium, spectacular, immersive experience. “We’re in the world where people think virtual reality is going to deliver that to them,” observes Trumbull. “From a perspective of storytelling, character development, plot, twists and suspension, all of the artform of cinema is almost impossible to execute in VR or AR because you’re giving too much control to the viewer. It becomes a completely different artform.

“I’m on the advisory board of Magic Leap, which promises to be on the cutting edge of mixed reality. I have also been developing the Magi process, which I think is much better and easier than VR. It’s large-group, fully-immersive, hemispherical-screen entertainment in 4K 3D at 120 fps.”

The 3D theatrical experience is being undermined by low-brightness 2K 3D at 24 fps. “If you start out with 14 foot-lamberts and put a polarizing filter in front of the projector, the light is cut in half. When you put the glasses on, the light is cut again. You end up with 3 to 6 foot-lamberts of brightness. 3D has gotten a bad name partly for that reason. Secondly, the studios don’t like shooting in 3D because it’s a pain in the neck and is complicated, so they shoot in 2D and dimensionalize in post production. That looks bad.”

The visual effects industry deserves more credit for its contributions to the cinematic experience. “In the making of a movie you have a screenplay, actors, director, producer, cinematographer, set dressers, makeup and wardrobe,” remarks Trumbull. “Because of the talent pool available in the visual effects industry, the visual effects are doing the lion’s share of the entire movie. They’re digitally building sets, locations, characters, wardrobe, makeup, and can add a tear to someone when they don’t feel a tear coming. People tend to be stuck in this mode of thinking that visual effects are just added to a movie. The entire artform of movies is an illusion, and visual effects has become the heart of the industry. It’s like a magic show.

“Sadly, visual effects companies go bankrupt every few months. It’s a tragic situation where the profits generated by these movies are kept by the studio, director and actors, but almost never the visual effects people. Visual effects companies have to re-imagine themselves as producers, because they are the ones who are making the difference between success and failure.”
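Trumbull’s brightness figures reduce to simple halving. A quick check, using his 14 foot-lambert starting point – actual losses vary by projector and glasses:

```python
# Light loss through a polarized 3D projection chain, per Trumbull's figures.
screen_2d = 14.0                     # foot-lamberts off the screen in 2D
after_polarizer = screen_2d / 2      # polarizing filter at the projector halves it
after_glasses = after_polarizer / 2  # polarized glasses halve it again

print(f"{after_glasses} fL")         # 3.5 fL -- inside the quoted 3-to-6 fL range
```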

TOP: Shooting the model of the Starfleet space station orbiting Earth for Star Trek: The Motion Picture. MIDDLE: Motion-control cameras were used to capture the ‘Spacedock’ sequence that reveals the USS Enterprise in Star Trek: The Motion Picture. BOTTOM: The DeLorean simulator used for the Back to the Future ride that ran for 16 years at Universal parks in Osaka, Los Angeles and Orlando.



TV

WESTWORLD 2: STARTLING EFFECTS CONTINUE TO EXPAND THE SMALL SCREEN By CHRIS McGOWAN

All images copyright © 2018 HBO TOP: Jay Worth, VFX Supervisor

32 • VFXVOICE.COM SUMMER 2018

Michael Crichton, a master of the science fiction thriller, often explored cutting-edge technology and the question “What could possibly go wrong?” in his novels and the movies based on them. In HBO’s Westworld TV series, based on Crichton’s 1973 movie of the same name, the android hosts of a future amusement park malfunction in ways both mundane and menacing. By the end of the first season, Dolores, Maeve and other hosts had become self-aware and turned on their human keepers. “The park has gone a little haywire,” as VFX Supervisor Jay Worth puts it. The finale, “The Bicameral Mind,” won the 2017 Emmy for Outstanding Special Visual Effects for Worth, Elizabeth Castro (VFX Coordinator), Joe Wehmeyer (On-Set VFX Supervisor), Eric Levin-Hatz (VFX Compositor), Bobo Skipper (ILP VFX Supervisor), Gustav Ahrén (Modeling Lead), Paul Ghezzo (CG Supervisor, CoSA VFX), Mitchell S. Drain (VFX Supervisor, Shade VFX) and Michael Lantieri (Special Effects Coordinator). Worth’s ongoing team also includes Bruce Branit (On-Set VFX Supervisor), Jacqueline VandenBussche (Lead VFX Coordinator), Melanie Tucker (VFX Coordinator), Jill Paget (VFX Editor) and Sarah Bush (VFX Assistant Editor). Jonathan Nolan and Lisa Joy co-created the Westworld TV series and serve as executive producers along with Bad Robot’s J.J. Abrams and Bryan Burk. A Super Bowl commercial in February – HBO’s first in 20 years – made it clear that much havoc was yet to come in season two of Westworld. “Jonah [Co-creator/writer/director Jonathan Nolan] had this great idea, a vision of these bulls running through our lab set, wild beasts in a scientific area,” Worth recalls. One bull’s hide is partly torn off, revealing a black exoskeleton and wiring. “Dolores’s robot body in the season finale was one of the sequences that we focused on the most, and we brought that back in a lot of different ways, such as with the bull,” says Worth. “The team at Double Negative worked around the clock taking our design elements from ILP. Double Negative built out the interior of this robot bull, and it just looks amazing.” With technology having gone off the tracks, this season has offered additional VFX thrills that build on what has gone before. Together, the first two seasons have delivered many startling visual effects. In season one, VFX rejuvenated Dr. Robert Ford (Anthony Hopkins) by 40 years and peered into the head of “Robot Boy” and the inner workings of the heroine Dolores, among other feats. Season two built out Westworld, introduced Shogun World and explored the bigger picture of the park’s history. It also presented CG tigers and the peculiar drone hosts. In both years, Westworld portrayed a wide range of possible android behavior, from the more mechanical to the apparently human. Worth and his team have overseen the VFX efforts, with help across seasons one and two from such vendors as CoSA VFX, ILP (Important Looking Pirates), Rhythm & Hues Studios, Double Negative, Fractured FX, Pixomondo, Shade VFX, Chickenbone Effects, Deep Water FX, Skulley Effects, bot vfx, Yafka Studio, Revolt 33 Studios and Crafty Apes VFX. “Our team of vendors is pretty amazing. In season two we extended it out to crank through this volume of shots because our

schedule was a little more compressed than it was last year,” says Worth. The second season has also explored the ongoing puzzles of the park’s location, what lies outside it, and its underlying corporate purpose. Worth enjoyed trying “to figure out where our people go, with our flashbacks to the present day or near future, what our present reality looks like and where that might be. “We didn’t base it on a specific city, but it feels like a present-reality, near-future, interesting place. Once again [it was] more in the world-building category but, as always, we were just trying to make it look as real as possible.” Comments Worth, “We went back to things we loved. I felt like in season one we had enough time and we had enough freedom to really craft things the way we wanted to, to tell the story that Jonah and Lisa wanted to tell. It didn’t feel like we were ever hamstrung or having to avoid or to cut around, or didn’t quite fully realize something we talked about. [So it was great] to bring back some of those things.” Worth joined the VFX big leagues with a little help from his lifelong pal, VFX Supervisor Kevin Blank. “In 2005, I was walking home from work in Chicago when I called him and he said, ‘Do you want to work on Alias?’ Six weeks later I was in a cornfield working on the final season of Alias as a VFX Coordinator. I fell in love with the production side of VFX instantly,” says Worth. Part of his education came from working with J.J. Abrams as a VFX Supervisor on Fringe. He had met Abrams only once in passing and had no prior relationship with him up to that point. “However, that started a two-year email conversation back and forth on Season 1 and 2 of Fringe, where J.J. reviewed and approved almost all of the shots. We didn’t have a single phone call or in-person meeting – it was all over email. I would send emails almost every night and we would go back and forth with thoughts and ideas. That was my VFX school really – figuring out how to describe things, to distill ideas, to explain and execute abstract concepts.” Worth has worked with Abrams and Bad Robot since then, as well as with Jonathan Nolan, beginning with Person of Interest (2011-2016). Some of Worth’s other projects include 11.22.63, Almost Human, Revolution and Cloverfield. He had received five Emmy nominations prior to winning for Westworld in 2017. Worth spent a great deal of time brainstorming future iterations of androids with Nolan. “We dug into everything in terms of design on Westworld – but how the hosts are built was a big part of the show and the visual language of the show. Additionally, we had

“What we come back to is what’s going to help tell the story the best.” —Jay Worth, VFX Supervisor

TOP AND BOTTOM: Westworld’s runaway steam train barrels toward the Mesa. It was shot with a real train in Fillmore, California, then the entire environment and some of the train mechanics were replaced with visual effects.

“We didn’t base it on a specific city, but it feels like a present-reality, near-future, interesting place. Once again [it was] more in the world-building category but, as always, we were just trying to make it look as real as possible.” —Jay Worth, VFX Supervisor

SUMMER 2018 VFXVOICE.COM • 33


TV

“[In season two] we lean away from making the hosts more robotic because we want to show the humanity of these characters. We show them as more mechanical when they’re old or malfunctioning. But when they’re struggling through their emotional state, and cycling through different things, we do almost nothing.” —Jay Worth, VFX Supervisor

TOP AND BOTTOM: The robot bull’s exoskeleton is on display as it rampages through the labs, with effects by Double Negative from design elements by ILP.

34 • VFXVOICE.COM SUMMER 2018

many discussions about the different generations of hosts. That was probably the concept that occupied most of our discussions – how could this happen, what efficiencies could we create in the idea behind the process, and how would that impact the visual design? “We had many discussions about skeletal procedures, organ printing and stuffing, muscle weaving, skin dip tanks – all of it was designed throughout and planned. We really worked at designing every part of the process so that if you only see one part, it feels like it is part of a whole. “What we come back to is what’s going to help tell the story the best,” says Worth. “In season one there were a lot of hosts that malfunctioned. And we had Old Bill, who was not necessarily malfunctioning but was an older model.” Old Bill was given an animatronic look, “but he was a very specific host. Then we had some hosts like the sheriff that malfunctioned, a little more sporadic in his movement, and then we had hosts that just struggled, like Abernathy [Dolores’s father] or Bernard. When they were just glitching, as we called it, we did almost nothing because their performances were so amazing.” In season two “we lean away from making the hosts more robotic because we want to show the humanity of these characters. We show them as more mechanical when they’re old or malfunctioning. But when they’re struggling through their emotional state, and cycling through different things, we do almost nothing.” Season two introduced a creepy new all-white robot. “The drone hosts are down in another facility; they function as sort of worker bees and it was fun to build out that idea. That was one of those situations where you read the script for the first time and you weren’t quite sure what you were going to make. It’s this host that doesn’t have any skin and doesn’t have any pigment and it’s all muscle. “Justin Raleigh and his team at Fractured FX really did an amazing job. They were able to find a very slim person to put on the [drone host] suit; it looks like normal proportions when he puts on the suit. From the FX standpoint, all we did was cleanup on the suit itself – where there’s a crease, or where it doesn’t look like it’s their skin, cleanup on makeup effects.” There were new animal characters as well. “We did a tiger sequence for episode three, which obviously had its challenges, doing full CG tigers on a TV schedule,” says Worth. “But the team at Rhythm & Hues busted it out and killed it as always. It was a lot of fun to work with them and bring this to life.” Different progressions of Westworld’s Interactive Map are also seen in season two. “We tried to find ways of showing this technology that’s not doing as well. What does the map look like when it’s malfunctioning? What does the map look like when there’s a battle in the world? We built up the technology in season one. It was based on a projector system but we always wanted it not looking like a projector, not looking like a TV screen and not looking like a hologram. So we had a map upon which we projected real-time imagery of this landscape and it was all based on correct lighting, so at different times of day the shadows across the map would be different, with a three-dimensional quality. This season is about

what happens when the map goes south. The systems are all shut down, and because of that the map breaks, the map comes back online, the map dies.” Worth had to figure out what that looks like “from a story perspective and from a visual effects perspective as well.” Shogun World was featured prominently in season two and “required some element shoots. When I read about the shogun battle, I knew that with that many people getting sliced with katanas we were going to need to have specific elements that were tailor-made for the show.” Fortunately, squib shoots got a green light. “This season we were able to do a very robust elements shoot that we wanted to do for season one but weren’t able to. We fired off hundreds of squibs and muzzle flashes, every one of our weapons, so it brings a reality to the texture of things and it’s not a CG element or out of the tool kit. We shot them with specific lighting in mind and specific angles in mind.” Production Designer Howard Cummings and his team built the Shogun World and Protagoras sets for season two. Protagoras was the biggest set piece of the year from the FX standpoint. “That and what we’re calling The Gate in episode 10,” adds Worth. “The finale is really the culmination of everything in terms of set extensions, building out facilities, as well as ambition.”

“Jonah [co-creator/writer/director Jonathan Nolan] had this great idea, a vision of these bulls running through our lab set, wild beasts in a scientific area. Dolores’s robot body in the season finale was one of the sequences that we focused on the most, and we brought that back in a lot of different ways, such as with the bull. The team at Double Negative worked around the clock taking our design elements from ILP. Double Negative built out the interior of this robot bull, and it just looks amazing.” —Jay Worth, VFX Supervisor TOP LEFT AND RIGHT: Hector Escaton (Rodrigo Santoro) and Maeve (Thandie Newton) take in a spectacular view of the world they are just beginning to understand. BOTTOM LEFT AND RIGHT: A VFX performer stands in for a CGI tiger created by Rhythm & Hues.

SUMMER 2018 VFXVOICE.COM • 35


TV

TOP LEFT AND RIGHT: The blue suit is used for lighting and environment reference, camera position and focal distance to help create a full-CGI tiger. BOTTOM: Fractured FX worked on the drone hosts, creepy, muscle-bound “worker bees” who have no skin, pigment or apparent eyes.

In the story, everything in the Westworld theme park has been designed, which adds to the real-world labor of creating it. “It makes all the difference. The challenge we have is that every inch of every frame has to be designed. This goes for every department. There really aren’t many places you can shoot where you just dress somebody, light them and shoot. Every single frame in our show is designed. What timeline might we be in, where are we, who is this character? It isn’t just that it is a theme park – the entire world we have created is very specific and defined. “So we are looking at every single frame,” Worth states. “Whether it be a contrail in the sky or a slight variation in a host

from the normal day-to-day that has to be unified to every possible variation in set design and continuity – it is a complete production effort to bring this [to] life. Thankfully we have an amazing group of collaborators – we all have each other’s back and work incredibly well together to pull off this vision.” The original Westworld, directed by Crichton, pioneered the use of digital effects in feature films when it used a total of two minutes of pixelization for an android POV. Twenty years later, Steven Spielberg’s Jurassic Park (based on Crichton’s novel) was another landmark with its CG dinosaurs. Asked what Crichton might have thought of the new Westworld, Worth responds, “I hope and think that he would have loved it. I think it’s taking this idea that he had of what does it mean to be human and what does it mean to be in control or not, and how to use technology to do that. Obviously, he was fascinated with the idea of telling stories that pushed the boundaries of how you can tell them visually. To have some small connection to what he was able to envision is really a lot of fun, to be part of the world that he created.”

“We did a tiger sequence for episode three, which obviously had its challenges, doing full CG tigers on a TV schedule. But the team at Rhythm & Hues busted it out and killed it as always. It was a lot of fun to work with them and bring this to life.” —Jay Worth, VFX Supervisor

TOP LEFT AND RIGHT: Shogun World offers exquisite beauty and dark horror and is partially inspired by the Edo period in Japan. BOTTOM LEFT AND RIGHT: A trail appears winding through the hills towards Shogun World, with the snow-capped mountain in the distance.

36 • VFXVOICE.COM SUMMER 2018

SUMMER 2018 VFXVOICE.COM • 37


COVER

PRODUCERS AND ANIMATORS FACE FRESH CHALLENGES ON INCREDIBLES 2 By IAN FAILES

When Brad Bird and Pixar began tackling the director’s 2004 computer-animated superhero film, The Incredibles, their biggest hurdle at the time was CG human characters. Up until then, the studio had mainly been crafting toys, animals or other-worldly creatures in CG. Of course, it was a mountain Pixar was eventually able to climb, along with the necessary hair, cloth and skin simulations that came with generating a stylized yet believable superhero family for the film. Human characters have subsequently appeared in several Pixar outings, which meant that, going into the sequel, Incredibles 2, many of the previously difficult technical challenges were no longer so elusive. However, as with any of its animated features, Incredibles 2 still posed numerous challenges for Pixar’s crew, including the film’s producers, John Walker and Nicole Paradis Grindle. They explain to VFX Voice just what it took to get this new Brad Bird film made, from time pressures to technological obstacles, to keeping an army of animation professionals on track. GETTING IT DONE

All images copyright 2018 Disney/Pixar. All rights reserved. TOP: In Incredibles 2, the Parr family make a triumphant return as superheroes in the sequel to the 2004 film.

38 • VFXVOICE.COM SUMMER 2018

The animation side of Pixar is perhaps one that most people are familiar with, but none of that would be possible without producers shepherding a project to conclusion. “We work with the director to make sure that the film gets made on budget and on time,” outlines Incredibles 2 producer John Walker, who also produced The Incredibles, and has worked with Bird since The Iron Giant, which was released in 1999. “I always joke that the director makes notes, and we make decisions, but it’s kind of the same,” says Walker. “We’re creative partners with the director, but he is the primary creative force in the film, certainly. Especially a guy like Brad, who’s a writer and director. But he uses us as a sounding board, and we’ll get into

discussions with him about things in the movie that impact the budget or impact the schedule.” Fellow producer Nicole Paradis Grindle notes that one of the challenges of getting things done on budget and schedule on Incredibles 2 was the fact that the story continued to evolve during the three years it was in production. This happens on most Pixar films, but it does impact decisions about when to take parts of the story development into full-scale animation. “You don’t want to make a lot of things that you then throw away. That’s very expensive,” says Grindle. “So as producers we’re making a lot of calls about readiness and the order of things, and then we have to be really strategic about when we’re bringing people on board, and then how we want them to work together.” OLD AND NEW CHALLENGES

Coming into this new film, Pixar had recently made major inroads with technologies that have greatly expanded the worlds and characters it creates – the implementation of a physically-based path-tracing architecture in RenderMan (called RIS), Universal Scene Description (USD) technology designed to help orchestrate the interchange of 3D graphics data, and developments in its proprietary animation software, Presto, including real-time feedback and preview renders. The result was that the filmmakers who had struggled with the technology available in 2004 now had many new tools in their arsenal and a wealth of experience behind them. “In fact,” says Walker, “I think that one of the things that excited Brad and Ralph Eggleston, the Production Designer, was the fact that the technology existed now to finally realize the designs in the way that they had hoped to realize them in 2004. There were no notions of, ‘Well, we don’t know how to do long hair, we don’t know how to do humans, we don’t know how to do muscles.’ Everybody

“You don’t want to make a lot of things that you then throw away. That’s very expensive. So, as producers we’re making a lot of calls about readiness and the order of things, and then we have to be really strategic about when we’re bringing people on board, and then how we want them to work together.” —Nicole Paradis Grindle, Producer TOP: Helen (aka Elastigirl) tries out her new Elasticycle in a scene from the film. BOTTOM: Bob Parr/Mr. Incredible (voiced by Craig T. Nelson) with his son, Jack-Jack, whose own super powers are soon revealed.

SUMMER 2018 VFXVOICE.COM • 39


COVER

“The technology existed now to finally realize the designs in the way that they had hoped to realize them in 2004. There were no notions of, ‘Well, we don’t know how to do long hair... we don’t know how to do humans, we don’t know how to do muscles.’ Everybody knows how to do it. It’s just now about doing it quickly.” —John Walker, Producer

TOP: Bob Parr (aka Mr. Incredible) is in charge of home life. BOTTOM: A lineup of characters from Incredibles 2. Concept art by Production Designer Ralph Eggleston.

40 • VFXVOICE.COM SUMMER 2018

knows how to do it. It’s just now about doing it quickly.” However, the continued ‘plus-ing’ of animation tech can lead to its own issues. “Once we figure out how to do something around here, we make our lives miserable by just doing a lot more of it,” admits Grindle, only perhaps half-jokingly. “For example, we now have these enormous crowds. That was one of the complaints Brad actually had about the first film – that the streets sometimes seemed really empty in the background when big things were happening, because we could only afford so many characters! So, now we’ve made up for that, but it’s created other problems. Because you create a lot of characters, you have to build them and simulate them, and animate them, and then you have to render them all, too.” Such is the Pixar way, of course. The advancement of the art of animation and its role in improving the storytelling is what makes the studio such a powerhouse. And while the animation technology has certainly changed since 2004 in terms of producing the final image, so too has the way Pixar’s films are produced. “We get to see what the film is going to eventually look like much sooner than before,” says Grindle. “For instance, on this film our lighting team did some lighting blocking really early, concurrent with when we started animation blocking, and that allowed us to see it – even with the most rudimentary animation blocking or some proxy effects and a spattering of crowds in the background, you got a real sense of it. Back in the day, it really was very much about each person and their smaller parts of the work – you didn’t really get to see what it looked like until you got to almost the very end. This change is what has allowed us to move so quickly on Incredibles 2, and allowed us to do things like optimize for faster renders much earlier so that people can iterate more, and let us look at shots in continuity much earlier.” “It used to be,” adds Walker, “that each little piece of the film was

reviewed all by itself. The director had to look at things chair by chair, table by table, wall by wall. They’d have to say, ‘Yes, I like that wall.’ Or, ‘Yes, I like that pencil.’ It was really an inefficient way of making the films because you’d spend time looking at a pencil that’s buried in the back of a set someplace; you never saw it in context. And now that’s completely disappeared. That’s been the biggest change from a production standpoint that we’ve seen, that you just get to look at things as they are in life and make decisions about them.”

“These films do not spring fully-formed, and they’re pretty messy, and they don’t look all that great at the beginning.” —John Walker, Producer

KEEPING THE FAITH

Another large part of the producers’ roles in helping to make a major animated feature – this time away from the technical side of filmmaking – was keeping up artist motivation and morale over a three-year period. “This is a big challenge in animation,” suggests Walker, “because everybody sees the same thing over and over and over again in

TOP: Violet uses her forcefield power to save her family while battling the Underminer villain. BOTTOM: Concept art by Bryn Imagire for the character Edna Mode.

SUMMER 2018 VFXVOICE.COM • 41


COVER

Producer Tips

“One of the complaints director Brad Bird actually had about the first film was that the streets sometimes seemed really empty in the background when big things were happening, because we could only afford so many characters! So, now we’ve made up for that, but it’s created other problems. Because you create a lot of characters, you have to build them and simulate them, and animate them, and then you have to render them all, too.” —Nicole Paradis Grindle, Producer TOP: Artist Andrew Jimenez (right) presents materials to director Brad Bird. BOTTOM: Brad Bird, left, consults with Incredibles 2 Production Designer Ralph Eggleston during an art review.

42 • VFXVOICE.COM SUMMER 2018

Incredibles 2 producers Nicole Paradis Grindle and John Walker reveal their strategies for getting the film made on time and on budget. Finding scenes to go into production first: “We’ll often make considerations about the sets that are available to us,” says Grindle. “In looking at the movie, we thought, ‘Well, we know we’re going to have scenes with the family, and we know we’re going to have scenes of the family in their house, so let’s build that house first.’ Because whatever they do in there we’re going to need them in there, and then we could look at the scenes that happened in the house and see which of those were starting to gel.” Bringing people together: “For reviews, we try to bring lots of people into the room rather than filing them off, like, ‘Well, here’s the effects review, and here’s the layout review, and here’s the story review,’” states Walker. “We try to open those things up. And something that Nicole and I do that’s maybe a little different than some of the producers at Pixar and in animation in general is to have those big rooms of people with large reviews across multiple departments. We’re trying to reproduce an artificial set where everybody’s on the set together, and I think that that’s helpful. I think that the solo-ing that happens in animation is something that producers have to actively work against.” Streaming reviews: “Another thing we did on Incredibles 2,” says Grindle, “which I think was started on a movie before ours, was that we streamed our reviews. Folks who are working at their desks can actually look at the work that’s being shown in the review, and listen to what Brad and others are saying about it in the room. So, even when you are in a relatively small room and not everyone can be there, you can still listen in. We have to remember that everyone who’s working on the film is a filmmaker, and they want to feel like a filmmaker, and if you’re working isolated in a room on this one little piece of the movie, you don’t feel like a filmmaker. It’s hard to stay motivated. So, being in on the ground floor of everything that’s going into the storytelling is very motivating.” Dealing with film length: “Nicole and I look through the film and we pick sequences that we feel could be cut if we had to, and we lift them out of the film and we refuse to put them into production. We tell Brad he’s got to get the rest of the movie done first before we’re going to make these ‘hostage’ scenes. That makes him angry, but it motivates him to fix the film in time for us, and then we release the scenes at the end of the process, and then we make the whole film. It’s an insurance policy – these are complicated films and sometimes it’s not clear at the beginning of the process that we can actually make a film as long and as complex in the time that we have allotted.”

various stages, and jokes stop being funny. So part of our job is to keep reminding people that they’re working on a good film even when there’s lots of evidence to the contrary. These films do not spring fully-formed and they’re pretty messy, and they don’t look all that great at the beginning.” Grindle says that they have talked regularly about these aspects with the crew during the making of Incredibles 2. “They believe very much in Brad Bird. He’s got a great track record, he’s a wonderful and inspiring leader, and even when, because we were moving so quickly, the reels weren’t looking as good as maybe the first movie’s reels had looked, they had to believe in him. These are a group of incredibly talented, experienced artists, so they certainly know what they’re getting into, and they throw themselves completely into the work of their own departments and make sure that’s excellent. As you start to see the results coming through, I think everybody starts to say, ‘Oh, right. We know how to do this. Brad knows what he’s doing, we all know what we’re doing.’” “The thing that motivates people the most in making films is making a good film,” concludes Walker. “We try to be kind to people and not work them too hard, and we give them snacks and have masseuses around. We try to do all that kind of stuff, but what really motivates people is working on a good film. And there is a time on all these movies when the film is not good. So a big part of our job is trying to assure people that we’re working hard, and we’re going to make it a great film, and hang in there with us.”

“We get to see what the film is going to eventually look like much sooner than before. … This change is what has allowed us to move so quickly on Incredibles 2, and allowed us to do things like optimize for faster renders much earlier so that people can iterate more, and let us look at shots in continuity much earlier.” —Nicole Paradis Grindle, Producer

TOP: Combined concept art by Ralph Eggleston for the Underminer villain.

SUMMER 2018 VFXVOICE.COM • 43


COVER

Producer Tips

“One of the complaints director Brad Bird actually had from the first film, was that the streets sometimes seemed really empty in the background when big things were happening, because we could only afford so many characters! So, now we’ve made up for that, but it’s created other problems. Because you create a lot of characters, you have to build them and simulate them, and animate them, and then you have to render them all, too.” —Nicole Paradis Grindle, Producer TOP: Artist Andrew Jimenez (right) presents materials to director Brad Bird. BOTTOM: Brad Bird, left, consults with Incredibles 2 Production Designer Ralph Eggleston during an art review.

42 • VFXVOICE.COM SUMMER 2018

Incredibles 2 producers Nicole Paradis Grindle and John Walker reveal their strategies for getting the film made on time and on budget. Finding scenes to go into production first: “We’ll often make considerations about the sets that are available to us,” says Grindle. “In looking at the movie, we thought, ‘Well, we know we’re going to have scenes with the family, and we know we’re going to have scenes of the family in their house, so let’s build that house first. Because whatever they do in there we’re going to need them in there, and then we could look at the scenes that happened in the house and see which of those were starting to gel.” Bringing people together: “For reviews, we try to bring lots of people into the room rather than filing them off, like, ‘Well, here’s the effects review, and here’s the layout review, and here’s the story review,’” states Walker. “We try to open those things up. And something that Nicole and I do that’s maybe a little different than some of the producers at Pixar and in animation in general is to have those big rooms of people with large reviews across multiple departments. We’re trying to reproduce an artificial set where everybody’s on the set together, and I think that that’s helpful. I think that the solo-ing that happens in animation is something that producers have to actively work against.” Streaming reviews: “Another thing we did on Incredibles 2,” says Grindle, “which I think was started on a movie before ours, was that we streamed our reviews. Folks who are working at their desks can actually look at the work that’s being shown in the review, and listen to what Brad and others are saying about it in the room. So, even when you are in a relatively small room and not everyone can be there, you can still listen in. We have to remember that everyone who’s working on the film is a filmmaker, and they want to feel like a filmmaker, and if you’re working isolated in a room on this one little piece of the movie, you don’t feel like a filmmaker. It’s hard to stay motivated. So, being in on the ground floor of everything that’s going into the storytelling is very motivating.” Dealing with film length: “Nicole and I look through the film and we pick sequences that we feel could be cut if we had to, and we lift them out of the film and we refuse to put them into production. We tell Brad he’s got to get the rest of the movie done first before we’re going to make these ‘hostage’ scenes. That makes him angry, but it motivates him to fix the film in time for us, and then we release the scenes at the end of process, and then we make the whole film. It’s an insurance policy – these are complicated films and sometimes it’s not clear at the beginning of the process that we can actually make a film as long and as complex in the time that we have allotted.”

various stages, and jokes stop being funny. So part of our job is to keep reminding people that they’re working on a good film even when there’s lots of evidence to the contrary. These films do not spring fully-formed and they’re pretty messy, and they don’t look all that great at the beginning.” Grindle says that they have talked regularly about these aspects with the crew during the making of Incredibles 2. “They believe very much in Brad Bird. He’s got a great track record, he’s a wonderful and inspiring leader, and even when, because we were moving so quickly, the reels weren’t looking as good as maybe the first movie’s reels have looked, they had to believe in him. These are a group of incredibly talented, experienced artists, so they certainly know what they’re getting into, and they throw themselves completely into the work of their own departments and make sure that’s excellent. As you start to see the results coming through, I think everybody starts to say, ‘Oh, right. We know how to do this. Brad knows what he’s doing, we all know what we’re doing.’” “The thing that motivates people the most in making films is making a good film,” concludes Walker. “We try to be kind to people and not work them too hard, and we give them snacks and have masseuses around. We try to do all that kind of stuff, but what really motivates people is working on a good film. And there is a time on all these movies when the film is not good. So a big part of our job is trying to assure people that we’re working hard, and we’re going to make it a great film, and hang in there with us.”

“We get to see what the film is going to eventually look like much sooner than before. … This change is what has allowed us to move so quickly on Incredibles 2, and allowed us to do things like optimize for faster renders much earlier so that people can iterate more, and letting us look at shots in continuity much earlier.” —Nicole Paradis Grindle, Producer

TOP: Combined concept art by Ralph Eggleston for the Underminer villain.



PROFILE

ON THE HIGHWAYS AND SKYWAYS OF THE FUTURE WITH SYD MEAD By CHRIS McGOWAN


The future many of us imagine has been greatly shaped by the gouache paintings of Syd Mead, who provided conceptual art for Blade Runner, Tron and other science fiction films. Blade Runner has become a cinematic template for a gritty high-tech future with dystopian and/or noir edges, while Tron was the first visual representation of cyberspace, before the term entered popular culture. Mead also stimulated our minds with spaceship designs for Aliens and 2010, and a conception of the V’ger entity for Star Trek: The Motion Picture. He designed vehicles, costumes, gear and cityscapes for Timecop, Short Circuit, Strange Days, Johnny Mnemonic, Mission to Mars, Tomorrowland and Elysium. His influential designs and illustrations have been published in books like Oblagon and viewed widely on the Internet. Mead received a VES Visionary Award from the Visual Effects Society in 2016, and his work has been highlighted in two recent books: The Movie Art of Syd Mead: Visual Futurist (Titan Books, 2017) and A Future Remembered: An Autobiography (2018, available at sydmead.com).

Sydney Jay Mead was born in 1933 in St. Paul, Minnesota, but moved within a few years to Colorado Springs. His father was a fundamentalist Baptist minister who set aside the Bible to fuel a young Syd’s imagination with tales of Buck Rogers and Flash Gordon. Following high school, Mead served for three years in the Army in Okinawa, and then enrolled in ArtCenter College of Design in Pasadena, California, where he now resides. “I studied Industrial Design, with a major in Automotive and a [minor] in Product. It gave me a knowledge of how things hinged together and how molds are made and things like that,” says Mead. His time at ArtCenter helped him create functional-seeming art – the vehicles he designed were strikingly beautiful and also seemed like they would work – in the near or distant future. “If you have the industrial design methodology in your mind, things look the way they do because of how they’re made. And if you’re working on a movie or some other story format you first have to ask, ‘What is the socioeconomic, the technological world in which this story is taking place?’”

After graduating, Mead spent two years with Ford and then went out on his own with catalog design, architectural renderings for major firms, 747-jet interiors, show cars, luxury ships and assorted product design. Philips Electronics, U.S. Steel, Sony, Minolta and Honda were among his corporate clients. He moved to Southern California in 1975. He had been making quite a good living for almost 20 years before Hollywood knocked on his door. “I got a call from John Dykstra and his partner Bob Shepherd and they said, ‘Would you like to work on a science fiction movie?’ I have no idea how they knew I was out here. Not a clue. So I said, ‘Sure’ and I met them at their Van Nuys studio, Apogee. I worked on Star Trek: The Motion Picture with Robert Wise in post.”

OPPOSITE TOP: View of the Tyrell Building in Blade Runner. (Image © 1982 Warner Bros.) OPPOSITE BOTTOM: Downtown Los Angeles with Spinner flying car visible. (Image © 1982 Warner Bros.) TOP: Blade Runner. Downtown Los Angeles with Spinner car visible. (Image © 1982 Warner Bros.) BOTTOM: Street scene. Syd Mead concept art for Blade Runner. (Image © 1982 Warner Bros.)




TOP: High-tech, low-life street scene. Syd Mead concept art for Blade Runner. (Image © 1982 Warner Bros.) BOTTOM: Syd Mead at home. The mural behind him is a blowup of art created for the 2000 Pebble Beach Concours d’Elegance in California, by invitation from the Automotive Fine Arts Society. It depicts a Concours of the future where vehicles not yet conceived will be on display along with some of the cars we hold as classics today. (Image courtesy of Syd Mead)


Mead designed the V’ger entity that was central to the plot. He sketched the final end view of the V’ger on a cocktail napkin in a hotel in Eindhoven, Holland, where Mead was visiting Philips. “I was having a triple martini. I guess it was inspiration of a sort,” he laughs. That first Star Trek movie debuted in 1979 and breathed new life into the franchise.

Soon afterwards, Ridley Scott was in Los Angeles putting together an unusual new movie, Blade Runner, based on a story by Philip K. Dick. Scott had seen Mead’s futuristic work in a book published by U.S. Steel and was intrigued by “one picture which has this night shot, a rainy, evil night, and these cars all driving on this huge expressway into the city and just this vertical wall of architecture,” recalls Mead. “Ridley called and I met him at the Playboy building on Sunset. And Ridley said, ‘This is not going to be Logan’s Run [a rather conventional sci-fi film].’ And I was the first person on the official roster as a hire for the film. Ridley wanted the streets to look packed and he wanted the visual frames to be packed. And I thought, ‘If we want to pack the streets, why don’t we just move the façade out like they do in Europe and have the columns and have an arcade underneath, because that narrows the street, and have garish lighting and the Japanese kanji.’ I had all the signage in fake kanji and later had a consultant to actually make it mean something. I did the gouache paintings [while] Ridley was busy fighting with the finance people.”

In terms of the buildings and cityscapes, Mead sought to “mash together Mayan and Byzantine and Moderne and Memphis [an Italian geometric, post-modern style] and anything I could get my hands on because it’s a jumbled up world, a dystopian world. And we just jammed it all together.” It was a look that was layered – with the new atop the old – and Mead sometimes jokingly referred to it as “retro deco” or “trash chic.” Mead designed the iconic “spinner” flying cars and a cityscape with towering buildings. “I thought, ‘If this building is 3,000 feet high, we’re going to need a larger access footprint.’ So I had these pyramids coming out maybe in a four-block plot.” The rich lived near the top and “the nice people didn’t go below the 40th floor, so the street became the basement of society.” It was a future Los Angeles that was “high-tech and full of low-life,” as has been said about much science fiction that followed in its path. It was densely populated and harshly polluted, neon-bright, and grimy where the ordinary folk lived. The film’s look and attitude influenced a generation of science fiction writers and filmmakers. Blade Runner bowed in 1982 and “changed the way the world looks and how we look at the world,” the renowned sci-fi author William Gibson told Wired. Mead recalls, “Philip K. Dick [who passed away in 1982] was very happy with the first rough cut, very pleased.” The film’s credits listed Mead as a “visual futurist.”

As Mead started work on Blade Runner, “I got a call from [producer] Donald Kushner who says, ‘I’m working on a movie over at Disney called Tron and the director is Steven Lisberger.’ I had lunch with them, and when I finished pre-production on Blade Runner I went into pre-production on Tron. You can’t imagine two more different movies.”

Tron was the first feature film to make extensive use of computer animation, as well as the first to explore cyberspace (before Gibson popularized the term). For the electronic world, Mead designed light cycles, tanks, solar sailers and aircraft carriers. About the film’s setting, he remembers, “It dawned on me that everything’s taking place on the back side of a CRT tube. So there’s no gravity. The rules of the real world don’t apply anymore. So I was in a meeting with Lisberger and he said, ‘We’ve got these light cycles and how are we going to make them turn? Because it’s going to take a lot of animation to make that curve.’ And I said, ‘No, there’s no gravity, there’s no inertia, so just make them do a 90-degree turn – bang like that.’”
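Mead’s no-gravity, no-inertia logic maps directly onto how such a turn can be represented: not as an eased, physically simulated arc that would take “a lot of animation,” but as an instantaneous 90-degree rotation of the heading vector. The following minimal Python sketch is purely illustrative – the function names are invented, and nothing here is drawn from any actual Tron production code:

# Illustrative sketch only: a light cycle's heading as a 2D unit vector.
# With no inertia, a turn is an instantaneous 90-degree rotation of the
# heading -- no easing curve, no interpolated in-between frames.

def turn(heading, direction):
    """Rotate a unit heading (dx, dy) by 90 degrees instantly.
    Counterclockwise maps (dx, dy) to (-dy, dx); clockwise to (dy, -dx)."""
    dx, dy = heading
    return (-dy, dx) if direction == "left" else (dy, -dx)

def step(position, heading):
    """Advance one grid cell along the current heading."""
    return (position[0] + heading[0], position[1] + heading[1])

# A cycle travelling east turns left twice and is instantly travelling
# west -- "bang like that," with no intermediate animation.
pos, hdg = (0, 0), (1, 0)   # start at the origin, heading east
pos = step(pos, hdg)        # (1, 0)
hdg = turn(hdg, "left")     # (0, 1), now heading north
pos = step(pos, hdg)        # (1, 1)
hdg = turn(hdg, "left")     # (-1, 0), now heading west
print(pos, hdg)             # (1, 1) (-1, 0)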

TOP: Cityscape buildings. Concept art for Blade Runner. (Image © 1982 Warner Bros.) BOTTOM: Tron light cycle. (Image © 1982 Walt Disney Corp.)




TOP: “Palm Springs 2006”. Art by Syd Mead. BOTTOM: “City on a Megabeam”. Art by Syd Mead.


Mead’s back-and-forth between the two films continued. Once Tron went into principal photography, Mead went “back to post-production on Blade Runner,” doing matte shots with Effects Supervisors David Dryer and Douglas Trumbull, VES, and then returning to Tron after that. “That was a movie made under extreme duress. A lot of the Disney animators at that time would not work on the film because Disney was using computers. How that’s changed!”

His next major film project was Peter Hyams’ 2010 (a sequel to 2001: A Space Odyssey), for which he had to design the Leonov spaceship that travels to Jupiter. Not long after, Mead was traveling and got a call from his office. “They said, ‘James Cameron is sending a script down to you.’ It arrived FedEx and I read the script for Aliens. And on the way back on the plane I started drawing the Sulaco [the film’s spaceship], and the script said this large forest of antennae comes into frame from the lower left, and this huge bulk of machinery follows it. And [I thought] let’s do a sphere because that’s a sort of universal shape, with these antennae coming out in front of it. So I get back here and have three sketches done and have a meeting with Cameron at his house that was then on Mulholland Drive. He said, ‘Well, Syd, we’ll make a model of this, and it has to move past in front of the camera and I don’t want to pull focus to keep the globe in focus. It has to be flatter.’ So he drew me a little quick sketch. Then I did the Sulaco sort of as an armed cargo ship. And I did some interior sets – the dropship bay – and I started on the loader, also.”

“I’m very fortunate because I go in at the zero point,” continues Mead. “They want an idea to go along with the script, to illustrate the script, and so I work on a movie about every two years, by referral mostly, by the director or the production design people. I go in to the project as if I’ve never worked on a movie before, ever, so I don’t have any preconceptions.”

Mead also worked on the Japanese anime titles Yamato 2520 and Turn A Gundam, and, along the way, the video games CyberRace, Cyber Speedway, Wing Commander: Prophecy and Aliens: Colonial Marines. He was also involved in some notable projects that never got made – such as The Jetsons – and two billion-dollar theme parks in Japan.

Mead’s most recent film work was for Blade Runner 2049. “Denis [Villeneuve] came over here [to Pasadena] a couple of times to explain what he was after. So I started doing the Las Vegas street scenes, the skyline and different things. The script was very secret. I had to get a secure download to read just the Las Vegas part. But it was exciting. He’s a wonderful man to work with and he wrote the foreword for the book, The Movie Art of Syd Mead.”

Mead advises young people working in visual effects or design to “notice everything. Sophistication is basically having a very good memory, whether it’s language or music or painting. You have to eventually build up your internal library of things you remember and how they should look. You can’t adjust something if you don’t know how it looks in the first place. Just remember how things look, because the tools are changing all the time, so you’ll relearn that. You have to feed in from your personal memory banks, and if your banks are better than somebody else’s you’ll probably get the job.”

Regarding the success of Blade Runner, Mead comments, “You don’t know it’s going to become a classic. You make a movie and the public decides if it likes it, and that’s how classics are made. So I was very fortunate.”

TOP LEFT: “Future Bugatti.” Art by Syd Mead. TOP RIGHT: Sulaco spaceship in Aliens. (Image © 1986 20th Century Fox) BOTTOM: Blade Runner 2049. Mead contributed concept art for the future Las Vegas. (Image © 2017 Warner Bros.)



INDUSTRY ROUNDTABLE

VFX EXPLOSION BRINGS NEW EXCITEMENT TO TV, WITH DEMANDS AND REWARDS FOR PROS By PAULA PARISI

TOP: Star Trek: Discovery (Image courtesy of CBS All Access)

Real-time compositing, full CG character animation, virtual sets – visual effects that were until recently in the vanguard of feature films – are now in demand on series television, whose practitioners are charged with delivering the same caliber of mind-blowing visuals, but faster and cheaper. VFX Voice asked top practitioners to share their perspective on accomplishing that feat.

VFX Voice: What are the primary considerations in managing VFX for an episodic TV series and maintaining a consistently high level of quality with such frequency, and how does it compare to working on feature films?

Sam Nicholson, ASC, Founder/CEO Stargate Studios, Syfy’s Nightflyers
We do both features and television, and while the quality of the work is much the same, there is a huge difference in terms of the pipeline. On a feature you might have six months to finish the effects, where on a television series you’re prepping an upcoming episode while you’re shooting one and posting another. Even something as simple as getting shots approved can be very different. In the television pipeline your visual effects supervisor is most likely on set during production and can’t really be approving shots while in the midst of principal photography. The way around that is generally to have producers in a more creative role, where they are creatively approving shots. The feature pipeline is extremely specialized: you have a motion-tracking department, you have hair, match-move guys, a roto team – the shot goes through, then it goes to compositing. In television, we do have specialists, but we have a lot more generalists that can take a shot, own it and finish it. You have multiple people working in conjunction so there are no bottlenecks to get a high volume of top-quality shots in a short period of time. Your quality control has to be very stringent. It has to be perfect the first time, because it may be going on air the day after you deliver.
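To make Nicholson’s contrast concrete, here is a toy Python sketch of the two models he describes – a shot queuing through specialized departments in sequence versus a generalist owning a shot end to end. Every name below is invented for illustration; this is not Stargate’s (or anyone’s) actual pipeline:

# Toy model, illustrative only. Department names are generic.
FEATURE_DEPTS = ["tracking", "match-move", "roto", "hair", "comp"]

def specialist_pipeline(shots):
    """Feature model: every shot is handed from department to
    department, so a slow department bottlenecks all shots behind it."""
    log = []
    for dept in FEATURE_DEPTS:
        for shot in shots:
            log.append(f"{dept}: {shot}")
    return log

def generalist_pipeline(shots, artists):
    """TV model: a generalist takes a shot, owns it and finishes it,
    so shots complete independently with no inter-department handoffs."""
    log = []
    for shot, artist in zip(shots, artists):
        log.extend(f"{artist} ({task}): {shot}" for task in FEATURE_DEPTS)
    return log

print(specialist_pipeline(["sh010", "sh020"])[:3])
print(generalist_pipeline(["sh010", "sh020"], ["artist_a", "artist_b"])[:3])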

Paul Graff, Senior VFX Producer, Netflix’s Stranger Things Season 2
With over 2,000 visual effects shots, Stranger Things Season 2 is as big as any huge visual effects movie in terms of volume. What is different is that we are in pre-production, production and post-production at the same time. That means that while we were shooting episodes 201-204, we did not have the scripts for episodes seven, eight and nine yet. The primary consideration is to be flexible, have multiple strategies for all visual effects issues (plan A, plan B, plan C, etc.) and keep it simple. Then we need to have a safe, primary strategy for every shot that is achievable under somewhat unpredictable conditions. It was important for us to have a good, creative relationship with the vendors and have them in the picture on how we are planning to shoot the shots. For particularly challenging sequences we would invite the vendor to set to make sure they got what they needed. On heavy visual effects sequences, we try to identify hero shots and have the vendors start working on them immediately, before editorial even finishes with the cut. This allows us to have a head start at defining the look and feel of the sequences.

Christina Graff, Senior VFX Producer, Netflix’s Stranger Things Season 2
Communication between the departments is essential. One needs to have many sidebar meetings to keep things rolling while we’re prepping, shooting and delivering. We work very closely with the Duffer brothers [Stranger Things co-creators Matt and Ross], the art department and the practical effects team on the show. We also worked hard designing the shots before they got to the vendor. We created many concepts and storyboards (which needed to be approved by the Duffers) before they went to the vendor. We did as much previs as possible for both vendors and production, and we also asked vendors to provide some postvis for editorial.


BOTTOM LEFT: Nightflyers (Image courtesy of Syfy Channel) BOTTOM RIGHT: Stranger Things (Image courtesy of Netflix)




BOTTOM LEFT: Stranger Things 2 (Photo: Jackson Davis. Image courtesy of Netflix) BOTTOM RIGHT: Star Trek: Discovery (Image courtesy of CBS All Access) OPPOSITE BOTTOM LEFT: American Gods (Image courtesy of Starz) OPPOSITE BOTTOM RIGHT: Game of Thrones (Image courtesy of HBO)


Jason Zimmerman, VFX Supervisor, CBS All Access’s Star Trek: Discovery
For episodic TV you’re always up against that air date, in our case 15 of them – not just a single entity but a season’s worth of shows. While we don’t necessarily have all 15 scripts in hand at the beginning, it’s essential to know what’s coming, what sequences are going to require more time and attention and how that can fit into the pipeline. A lot of it comes down to time management and resource allocation, ensuring your team manages the vendor workload appropriately to get everything done on the next episode while also making sure the larger sequences that will air later are being addressed. We usually have two to three months to work on each episode, but there’s overlap, so we may be working on three or four things at once that have their own time frame. At any given time, we’re working on four or five episodes that will deliver within a week of each other. For season one of Star Trek: Discovery we used well over 10 vendor companies for the series, but we expand and contract on each episode, based on the shot count and things like that. That’s a standard workflow these days. Our main house is Pixomondo. My internal team is eight to 10 people. I have another visual effects supervisor who works with me, Ante Dekovic, and a visual effects producer, Aleksandra Kochoska, and they’re my right and left hands. We’ve been a team since [Fox’s] Sleepy Hollow in 2013. Working with them as long as I have, plus being friends, gives us a shorthand that allows us to accomplish a lot. I believe we had more than 470 artists working on the show our first year.

David Stump, ASC, VFX Supervisor, American Gods Season 1
The schedules in television are crazy, and so are the budgets, which are much lower than film. Last year for Season 1 of American Gods they handed off the edited shows in mid-January and we had three months, till about mid-April, to do the visual effects for all eight episodes. That’s very compressed. It required a lot of begging, a lot of late nights, and a lot of people who cut their profit margin to the bone. Features are definitely more profitable, but if you augment features with television work where you do better than break even, or at least don’t lose money too fast, you can keep a lot of your artists on staff longer for the juicier projects, so it’s kind of an amortization. It’s like the old 1960s advertising slogan: “How do we do it? Volume!” What TV lacks in profitability it makes up for in volume.

Joe Bauer, VFX Supervisor, HBO’s Game of Thrones
I had worked on over 20 features prior to joining Game of Thrones, and I would say there is shockingly little difference. If anything, I believe TV operates on a more successful functional level with no waste, no fat. Every pre-production and production moment and every penny is represented on the screen. The primary considerations are time, money, planning and inspiration. We are very fortunate on this show to have consistently excellent creative material to work with, great story, performance and production value. Our job becomes not screwing up everyone else’s hard work! While we move faster than most features, we generally have the time and resources to put our best foot forward. That said, we must be both clever and realistic in our goals. An added advantage, which goes back to the quality of the material, is that the show is very popular and many of the most talented creatives in the business want to work with us. This makes possible things we could not otherwise endeavor toward. Anytime someone climbs up on a dragon, it’s a multi-step process that takes tremendous coordination and planning. Normal TV parameters could not support the way we do things. Our methods and our artists are the same as on high-budget, VFX-heavy features. Aside from that, there is a commitment from production to do everything right, to go the extra mile in service to the visuals. The results are immediate and obvious.





Jabbar Raisani, VFX Supervisor, Netflix’s Lost in Space
I went from features straight into shows like Lost in Space and Game of Thrones, and it definitely feels like the gap between features and high-end episodic work is getting smaller. The main differences are quantity of shots and a reduced budget, though the TV budgets are getting larger every year. On Lost in Space, we had spaceships, robots, alien landscapes, various alien creatures and a ton of simulation work on our first season – everything in large quantities. My primary concern was to figure out which vendors were the best fit for the types of work and to dole out the sequences so we didn’t end up with any vendor too taxed to keep us on schedule. I attempted to convince the showrunner to give us a smaller VFX episode or two, but ‘small’ turned out to be a relative term for Lost in Space. Our small episode includes a ship escaping from a collapsing glacier, and that was only one sequence. There were also CG creatures attacking the Robinsons and devouring their fuel.
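Zimmerman’s four-or-five-episodes-in-flight overlap and Raisani’s concern about over-taxing any one vendor come down to the same bookkeeping: for any given date, how many episodes are in active vendor work. A minimal Python sketch of that arithmetic follows – all dates and episode numbers are invented for illustration and describe no real show’s schedule:

from datetime import date

# Hypothetical work windows: start of vendor work through delivery.
episode_windows = {
    "ep101": (date(2018, 1, 8),  date(2018, 3, 30)),
    "ep102": (date(2018, 1, 29), date(2018, 4, 13)),
    "ep103": (date(2018, 2, 19), date(2018, 4, 27)),
    "ep104": (date(2018, 3, 12), date(2018, 5, 11)),
}

def in_flight(on_day):
    """Episodes whose vendor-work window contains the given date."""
    return [ep for ep, (start, end) in episode_windows.items()
            if start <= on_day <= end]

# By mid-March all four invented episodes overlap -- the "working on
# four or five episodes at once" situation Zimmerman describes.
print(in_flight(date(2018, 3, 15)))  # ['ep101', 'ep102', 'ep103', 'ep104']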


BOTTOM LEFT: Lost in Space (Image courtesy of Netflix) BOTTOM RIGHT: Fear the Walking Dead (Image courtesy of AMC)


Christopher Scollard, VFX Supervisor, AMC’s Fear the Walking Dead
Most of my TV work has been on hour-long episodes – in essence we’re shooting a mini-feature every two weeks. On features, I’ve rarely begun actual show work until we have a locked cut. In TV, however, after we get a few episodes in the can, our work begins, so we are working while we are shooting new episodes. That’s the main difference – the schedule. TV allows for much less development. Usually on features the principal photography is the time at which we can test various looks and methods that will be applied once the digital work begins. TV requires immediate execution. Episodic schedules are grueling, so a clear understanding of what the show wants, mutual expectations, and being able to communicate that to your vendors is key to turning in high-quality work on a tight schedule. It helps to eliminate, or at least reduce, the trial-and-error of finding what works. I’d say the ability to accurately anticipate the show’s needs and getting the vendors in sync with that as early as possible are the two primary considerations for success.

Tammy Sutton-Walker, Visual Effects Producer, Amazon’s Man in the High Castle
Most of my career I have worked on features like Avatar, Elf and The Conjuring. When I made the shift to TV two years ago, I knew the biggest challenge was going to be the shorter deadlines. A TV show that is not considered a VFX-heavy show may only have a week or two to turn around the work. On a feature you have more time to work on shots, but this also means global changes can occur later in the process, which can significantly derail your schedule and budget. With episodic work I find the hard deadlines for each episode can help to lock in creative decisions more quickly. Heavier VFX TV shows like Man in the High Castle have longer deadlines and feel closer to the way a feature film works. Such shows are becoming more common. On Man in the High Castle we have a much longer turnaround time, and we’re brought into the creative discussion early on. Our team at Barnstorm VFX and our VFX Supervisor, Lawson Deming, are heavily involved during pre-production. We are able to start building the larger assets before production starts shooting, which helps us get a head start. After finishing up on season two we jumped right back in where we left off and are now halfway through season three.

Mike Borrett, VFX Producer, Mr. X, History’s Vikings
We’ll typically be working on five different episodes of Vikings at a time, delivering an episode every 10 to 14 weeks. The TV schedule keeps everything fresh and provides a sense of satisfaction as we complete each show. We are now in season six of Vikings, and our VFX crew has essentially been the same since season one. Because of that, we’ve developed a shorthand within the team on how to quickly approach a massive battle, a huge city environment, or a Viking fleet on a stormy sea. We’ve built a very trusting relationship with our clients at Take 5 Productions, MGM Television and History, which translates into a clear focus on making the shots as good as they can be.


BOTTOM LEFT: Man in the High Castle (Image courtesy of Amazon Studios) BOTTOM RIGHT: Vikings (Image courtesy of History)

SUMMER 2018 VFXVOICE.COM • 55


INDUSTRY ROUNDTABLE

creatures, and a ton of simulation work needed on our first season and everything in large quantities. My primary concern was to figure out which vendors were the best fit for the types of work and to dole out the sequences so we didn’t end up with any vendor too taxed to keep us on schedule. I attempted to convince the showrunner to give us a smaller VFX episode or two, but ‘small’ turned out to be a relative term for Lost in Space. Our small episode includes a ship escaping from a collapsing glacier, and that was only one sequence. There were also CG creatures attacking the Robinsons and devouring their fuel.

“It definitely feels like the gap between features and highend episodic work is getting smaller. The main differences are quantity of shots and a reduced budget, though the TV budgets are getting larger every year.” —Jabbar Raisani

“[T]he ability to accurately anticipate the show’s needs and getting the vendors in sync with that as early as possible are the two primary considerations for success [in episodic TV].” —Christopher Scollard

BOTTOM LEFT: Lost in Space (Image courtesy of Netflix) BOTTOM RIGHT: Fear the Walking Dead (Image courtesy of AMC)

54 • VFXVOICE.COM SUMMER 2018

Christopher Scollard, VFX Supervisor, AMC’s Fear the Walking Dead Most of my TV work has been on hour-long episodes – in essence we’re shooting a mini-feature every two weeks. On features, I’ve rarely begun actual show work until we have a locked cut. In TV, however, after we get a few episodes in the can, our work begins, so we are working while we are shooting new episodes. That’s the main difference – the schedule. TV allows for much less development. Usually on features the principal photography is the time at which we can test various looks and methods that will be applied once the digital work begins. TV requires immediate execution. Episodic schedules are grueling, so a clear understanding of what the show wants, mutual expectations, and being able to communicate that to your vendors is key to turning in high-quality work on a tight schedule. It helps to eliminate, or at least reduce, the trial-anderror of finding what works. I’d say the ability to accurately anticipate the show’s needs and getting the vendors in sync with that as early as possible are the two primary considerations for success.

Tammy Sutton-Walker, Visual Effects Producer, Amazon’s Man in the High Castle Most of my career I have worked on features like Avatar, Elf and The Conjuring. When I made the shift to TV two years ago, I knew the biggest challenge was going to be the shorter deadlines. A TV show that is not considered a VFX-heavy show may only have a week or two to turn around the work. On a feature you have more time to work on shots, but this also means global changes can occur later in the process which can significantly derail your schedule and budget. With episodic work I find the hard deadlines for each episode can help to lock in creative decisions more quickly. Heavier VFX TV shows like Man in the High Castle have longer deadlines and feel closer to the way a feature film works. Such shows are becoming more common. On Man in the High Castle we have a much longer turn around time, and we’re brought into the creative discussion early on. Our team at Barnstorm VFX and our VFX Supervisor, Lawson Deming, are heavily involved during pre-production. We are able to start building the larger assets before production starts shooting, which helps us get a head start. After finishing up on season two we jumped right back in where we left off and are now halfway through season three. Mike Borrett, Mr. X, VFX Producer, History’s Vikings We’ll typically be working on five different episodes of Vikings at a time, delivering an episode every 10 to 14 weeks. The TV schedule keeps everything fresh and provides a sense of satisfaction as we complete each show. We are now in season six of Vikings, and our VFX crew has essentially been the same since season one. Because of that, we’ve developed a shorthand within the team on how to quickly approach a massive battle, a huge city environment, or a Viking fleet on a stormy sea. We’ve built a very trusting relationship with our clients at Take 5 Productions, MGM Television and History, which translates into a clear focus on making the shots as good as they can be.

“When I made the shift to TV I knew the biggest challenge was going to be the shorter deadlines. … With episodic work I find the hard deadlines for each episode help to lock in creative decisions more quickly.” —Tammy Sutton-Walker

“We’ll typically be working on five different episodes of Vikings at a time, delivering an episode every 10 to 14 weeks. The TV schedule keeps everything fresh and provides a sense of satisfaction as we complete each show.” —Mike Borrett

BOTTOM LEFT: Man in the High Castle (Image courtesy of Amazon Studios) BOTTOM RIGHT: Vikings (Image courtesy of History)



COMPANY SPOTLIGHT

METHOD STUDIOS: BUILDING A GLOBAL NETWORK WITH A BOUTIQUE SPIRIT By IAN FAILES

If there’s one constant in the visual effects industry, it’s that it is always changing. One VFX company that has ridden the wave of change in the past few years is Method Studios, part of the Deluxe Entertainment group of production and post-production outfits that also includes EFILM, Company 3, Encore and Stereo D. Like so many other effects studios, Method Studios spans the globe, with locations in North America, India and Australia, where Iloura VFX in Melbourne and Sydney (already part of Deluxe) came under the ‘Method’ brand in February. The studio is also exploring continued expansion to other locations. VFX Voice sat down with some of Method Studios’ key staff to reflect on where the studio has been and where it is headed.

METHOD’S JOURNEY

Method was founded in 1998 as a boutique L.A.-based facility working mainly in commercials, and it very soon established its name as a creative force in the area. After a number of mergers and acquisitions, the VFX studio now provides a wide offering of visual effects-related services in feature films, television, commercials, design and VR/AR. A quick look at Method Studios’ most recent film projects reveals the extent to which the studio has become one of the top players in visual effects. These projects include Deadpool 2, Ant-Man and the Wasp, Avengers: Infinity War, Black Panther, A Wrinkle in Time, Thor: Ragnarok, Guardians of the Galaxy Vol. 2 and Okja. Add to that a wealth of VES Award nominations and wins, and several films nominated for the Best Visual Effects Oscar that Method has worked on, such as Mad Max: Fury Road, Deepwater Horizon, Doctor Strange and Guardians of the Galaxy Vol. 2.

The company reached a high level surprisingly quickly, acknowledges President and General Manager Ed Ulbrich, one of Digital Domain’s founding executives, who came on board at Method Studios in 2016. He admits he was “blown away” by work he saw coming out of the studio, specifically in the 2015 disaster film San Andreas, during a stint at Warner Bros. “I thought I was seeing the work of an ILM or another one of the majors,” Ulbrich recalls. “I watched this end sequence of the movie and was thinking, ‘Wow.’ It was just extraordinarily complicated work. Someone told me it was Method. I’m like, ‘Wait, Method?’ I’m very familiar with that type of work. There’s no easy way to do that.”

“I kept being confronted by work that Method was doing,” continues Ulbrich. “I knew of other people at other studios that had worked with them. I thought, ‘Method’s grown up. It’s this new shop.’ They’d really started to put out some exceptional-looking work.”

WORKING AS ONE

Images courtesy of Method Studios except where noted. TOP: A screening/review room at Method Studios’ Melbourne location.


Indeed, Method Studios had grown up – and quickly – partly from the acquisition of VFX companies operating in different cities (Method has operations in Los Angeles, New York, Atlanta, Vancouver, Melbourne, Sydney and Pune, India). As a worldwide group, these studios all operate under the Method Studios banner, but until recently, notes Ulbrich, they had been mostly working independently. Ulbrich’s mission was to bring these separate studios together in a larger way.

“I remember having this conversation with [Deluxe President and CEO] John Wallace, saying, ‘Just imagine if all these shops could actually unite and function as one. That would be a very powerful entity. That would truly be something to reckon with in the industry. Another major would emerge. If somehow you could harness that power across all these various locations, and bring them together and unite them, you’d really have a major ‘tier one’ visual effects provider that could go head-to-head with anybody.’”

And that’s exactly what Method Studios has been doing over the past year, says Ulbrich, who told VFX Voice that a ‘unification’ project at the company is a combination of several things: aligning pipelines, defining combined business practices and communicating more closely. Global production meetings are now commonplace, and Method’s various offices regularly collaborate on the same projects.

Still, Ulbrich recognizes that each studio location has its own personality and skill sets. “Melbourne, for example, is very much oriented around character animation, for good reason,” he says. “They produce outstanding work. There’s an amazing pool of talent there. There’s some really remarkable things that are done there. So the notion is, why change that? Uniting together as part of a global force does not mean that we give that up.”

TOP LEFT: Method Studios President and General Manager Ed Ulbrich. TOP MIDDLE: Melbourne-based Method Studios Visual Effects Supervisor Glenn Melenhorst. TOP RIGHT: Los Angeles-based Method Studios Visual Effects Supervisor Olivier Dumont. BOTTOM: Ed Ulbrich hosts a monthly meeting of artists at Method Los Angeles.

CREATIVE SUCCESS, GLOBALLY

That’s echoed by those at the Melbourne studio, including Visual Effects Supervisor Glenn Melenhorst, who for years has helped shape the Melbourne studio’s success, most recently on projects such as Game of Thrones, Ted and Ted 2, The SpongeBob Movie: Sponge Out of Water, Jumanji: Welcome to the Jungle and the upcoming Robin Hood. “Being an old animator, I have a soft spot for things like Ted and SpongeBob and those sorts of things,” states Melenhorst.

“We’ve tried to always have that point of difference that we pursue character work, rather than straight-up effects work. We like to do a bit of everything, but we do like creatures and characters, so Jumanji was another good example. That was 120 minutes of fun, just a great film to be working on.”

The Melbourne studio had already been part of Deluxe for a few years, and working toward integration with Method before officially coming under the same name, all the while remaining a relatively small outfit of around 200 artists. “We’ve always liked having that kind of boutique moniker,” says Melenhorst, who notes the change is helping to combine technology and pipelines, but says the company will still remain much the same in terms of operation. “We like that we’re big enough to handle being the sole vendor on a film, but we can also contribute to much larger films, too.”

Los Angeles-based Visual Effects Supervisor Olivier Dumont is another who has been close to the changes at Method Studios. He’s been with the studio for a decade, and recent experiences on the complicated effects films King Arthur: Legend of the Sword and Doctor Strange made the VFX supervisor realize the extent to which Method had grown. But Dumont also says, “Despite the growth, I can still feel the boutique spirit in the work that Method delivers. Some might consider this a weakness – maybe inefficient – but I see this as a strength. We try to be different from the other companies and we want to be, above all else, original.”

Although located in Los Angeles, Dumont has had first-hand experience capitalizing on Method Studios’ global presence, especially on King Arthur, which saw the L.A. and Vancouver offices combine to deliver Method’s final shots for the film. “That was a pleasure because I got to meet all the talented artists up there. Working remotely is always difficult, but this project showed me it was possible, and that Method had the infrastructure to deliver a movie with the help of offices in different locations.”

“Also,” adds Dumont, “once the initial bridge is built between the offices – like the one I had the chance to participate in with the Vancouver office during the King Arthur project – everything then seems simpler. All teams are driven by the same passion to make the best image for the screen, and it is easy to get on the same page no matter where the location is.”


“Being an old animator, I have a soft spot for things like Ted and SpongeBob and those sorts of things. We’ve tried to always have that point of difference that we pursue character work, rather than straight-up effects work. We like to do a bit of everything, but we do like creatures and characters, so Jumanji was another good example. That was 120 minutes of fun, just a great film to be working on.” —Glenn Melenhorst, Visual Effects Supervisor, Method Melbourne

OPPOSITE TOP: Staff at Method New York on International Women’s Day 2018. OPPOSITE MIDDLE: Artist workstations at Method Vancouver. OPPOSITE BOTTOM: Inside Method Sydney. TOP: Staff gather at Method Los Angeles for International Women’s Day 2018. BOTTOM: Digital armies combined with real photography formed part of the visual effects challenges in the ‘Battle of the Bastards’ episode of Game of Thrones. (Image © 2016 HBO)

Doctor Strange

The fourteenth film in the Marvel Cinematic Universe features some stunningly fantastical and mind-bending imagery, including a whole Method Studios sequence dubbed the Magic Mystery Tour, in which Dr. Strange (Benedict Cumberbatch) travels through several psychedelic universes. Method Studios was tasked with creating them. “This project was the most challenging creatively I ever worked on because we had to build 16 unique worlds,” says Visual Effects Supervisor Olivier Dumont. “Each had their own set of rules and elements that would affect the main characters. We obviously wanted to be as original as possible and ‘the crazier the better’ was our motto. We also had to create a short story for each universe before even starting to work on the visuals, which was really a lot of fun.”


“Despite the growth, I can still feel the boutique spirit in the work that Method delivers. Some might consider this a weakness – maybe inefficient – but I see this as a strength. We try to be different from the other companies and we want to be, above all else, original.” —Olivier Dumont, Visual Effects Supervisor


Game of Thrones: Battle of the Bastards

This landmark episode from season six of the game-changing HBO series, which involved a monumental bloody confrontation waged with horse riders, archers and foot soldiers, was also a milestone for visual effects on the show. Method Melbourne (formerly Iloura) crafted a range of photo-real horses, riders and combat moments for the episode, treating the visual effects as they would a feature. “People often ask about that work because it’s TV and they can’t imagine it’s got the same rigour as film, but in every way it actually did,” says Visual Effects Supervisor Glenn Melenhorst. The episode’s visual effects were recognized with both an Emmy and a VES Award.

Okja

With Netflix’s Okja, Method Studios worked closely with director Bong Joon-ho to deliver a believable and lovable main CG character. A key success of this ‘super-pig’ was giving it a suitable personality via animation, while also relying on Method Studios’ technical abilities to deliver realistic movement, skin and muscle simulations – all at 4K resolution. Says President and General Manager Ed Ulbrich: “Our team in Vancouver had developed a process whereby they could simulate secondary animation based on physics as in-the-box, OpenGL playblasts. They did that really to help the director with a much better experience of judging performance in a scene without having to approve things in all of the usual animation layers. The development of that tool, which is now a tool that we’re rolling out globally, was invaluable to speed the process up, which really worked well as a strategy in our first 4K movie.”


OPPOSITE BOTTOM: A scene from Bong Joon-ho’s Okja, for which Method crafted a completely CG main character. (Image © 2017 Netflix)

THE METHOD STUDIOS OF THE FUTURE

So what’s next for Method Studios? Already the company has more than 1,500 employees across its different facilities, with the possibility of further growth. Work on the unification of the various locations also continues, and Method Studios has a suite of large film projects in production, such as Aquaman, The New Mutants and Godzilla: King of the Monsters, among several others. Then there’s the ongoing advertising business and newer forays into immersive experiences via its Method EXP group.

Ulbrich suggests that Method Studios is aiming to be a tier-one player that competes on a global basis and can “leverage and access incentives and rebates and low-cost labor markets around the world to provide a very competitive price.”

OPPOSITE TOP: A fractal world created by the visual effects studio for Doctor Strange. (Image © 2016 Marvel Studios)

The company is perhaps already there, and has an extra weapon in its arsenal: its existence as a part of Deluxe. As a whole, the larger company can now offer filmmakers services ranging from design and previs all the way through to visual effects, DI and 3D stereo conversion. “We can actually go in and talk about a bigger scope of services for a movie or a show than most places can,” observes Ulbrich. “We can talk about the full suite of services and packages, something that’s very attractive for those studios or networks or producers.”

TOP: For San Andreas, Method Studios helped destroy parts of Los Angeles via destruction effects simulations and a host of digital buildings and vehicles. (Images © 2015 Warner Bros.) BOTTOM: Method Melbourne delivered CG rhinos and a synthetic cavern environment for Jumanji: Welcome to the Jungle. (Image © 2017 Sony Pictures)



FILM

UNIFYING THE MARVEL VFX UNIVERSE IN AVENGERS: INFINITY WAR By TREVOR HOGG

All images © 2018 Marvel Studios TOP: A decision was made to have a thick volume of clouds above the Wakanda battlefield that could be used to justify the ever-changing weather conditions in Atlanta. (Photo: Film Frame)


A decade after Iron Man launched the Marvel Cinematic Universe in 2008, the collection of characters from 18 movies comes together with a two-part Avengers storyline that commences with Infinity War. Overseeing the massive production, which involves a cast of over 65 characters, are filmmaking siblings Joe and Anthony Russo. “We were tuned into what the MCU was doing from the first film as well as being lifelong fans of the comics,” notes Anthony Russo, who spoke for the duo. “We’ve been on a four-movie journey with our screenwriters Christopher Markus and Stephen McFeely that started with Captain America: The Winter Soldier, evolved into Captain America: Civil War and then into Avengers: Infinity War [and its sequel].”

Originally, the plan was to shoot both Avengers movies at the same time. “During prep we felt that it would be better for everybody to focus on a single story at a single time, shoot that and then move on to the second one,” explains Russo. “It became too complex to be moving back and forth. We didn’t want things to get blurred between the two films.” Avengers: Infinity War and its sequel are linked narratively but are also able to stand on their own. Another factor taken into consideration was that Ant-Man and the Wasp and Captain Marvel will be released before the sequel. “Those stories were taken into account when mapping out the second film.”

Marvel Studios’ team, consisting of Executive Vice President, Physical Production Victoria Alonso, Visual Effects Supervisor Dan DeLeeuw and Visual Effects Producer Jen Underdahl, worked closely with the Russos to ensure that the digital augmentation and live action have a uniform look. “They take the lead from how we like to physically execute things,” notes Russo. “You have to plan extensively, especially when you’re dealing with a team this big, where you have artists located around the world doing work on your film. However, we’re not making a piece of previs but a film, so you have to be able to appreciate what is being offered at the moment and figure out how you use that to make the best movie that you can make.”

There were six weeks of pre-production before principal photography commenced on January 23, 2017 at Pinewood Atlanta Studios. “The script was going through changes as we were in prep, so that was a moving target for everybody,” notes Trent Opaloch, who has served as the cinematographer for all of the Marvel movies directed by the Russos. A three-week break occurred between the two productions, with the sequel commencing on August 10, 2017. “Once [editor] Jeffrey Ford got to the point where he needed Joe and I in the edit room while we were still shooting the second movie,” explains Russo, “we made a decision to shoot 10-hour days instead of traditional 12-hour days in order to have more time after wrapping to spend in the edit room on a daily basis.”

Whereas Captain America: The Winter Soldier drew inspiration from 1970s conspiracy thrillers, Infinity War was viewed as an epic heist revolving around a supreme alien being, Thanos (Josh Brolin), trying to steal all of the Infinity Stones to rebalance the universe by annihilating half of the entire population. “It gave us an organizing principle to base the film on at a stylistic or thematic or textured level,” notes Russo. “That concept came from the idea of us thinking of Thanos as the central character. We worked hard on his introduction scene. It was designed early on in the process and the rest of the film was built on that.”

“Thanos had to work or there was no movie,” observes DeLeeuw. “He’s a huge driving force in Infinity War, and [there are] more surprises to come in the sequel. We wanted to capture Josh Brolin’s performance as efficiently as possible and did some early mocap tests. The directors worked with him just on the character and selected different pages from the script. The thought was to mocap those line reads and put them into the animation rig to see what we came up with for Thanos. We kept the mocap running and recording in between the takes when Josh was not being the supreme emperor of the universe, but more of a thoughtful, introspective character. Those were the lines picked for our tests, because if we were able to get those subtle performances onto the Thanos model, then we were in a good spot to replicate whatever Josh Brolin came up with.”

“There are definitely a lot of CG characters, but with Spider-Man being a normal-sized human being, it was perfectly appropriate that [Spider-Man actor] Tom Holland be described by his personality and [normal human] movement,” remarks DeLeeuw. “When eight-foot tall Thanos interacts with Gamora [who is the height of Zoe Saldana, who plays the character] we had to put Josh Brolin up on a deck so that their eyeline would work. It was important to us that the actors could look into each other’s eyes. We knew that there would be a lot of repairing of the backgrounds to paint out all of the decking that Josh was standing on, but what you got was this awesome performance between two actors.”

TOP: Mocap recorded in between takes of Josh Brolin’s performance was placed onto the animation rig for Thanos, resulting in the character being able to express a wide range of emotions. (Photo: Film Frame) MIDDLE: Whenever possible, the actual actor was used rather than their CG counterpart, such as Tom Holland as Spider-Man. (Photo: Chuck Zlotnick) BOTTOM: Robert Downey Jr. wore an LED RT to simulate the interactive light generated by the arc reactor situated in his chest. (Photo: Film Frame)


TOP: One of the joys of the production was the ‘strange alchemy’ created by bringing the four corners of the Marvel Cinematic Universe together. (Photo: Film Frame) BOTTOM: Custom lenses subsequently named Ultra Panatars were built from the ground up by Panavision to accommodate the needs of cinematographer Trent Opaloch. (Photo: Film Frame)


DeLeeuw believes that the production would make for a great case study. “It’s about five volumes thick! We have learned a lot about doing a couple of films together, and no matter how big you think that they’re going to be when you start, they’re only going to get bigger by the end.” In order to handle the workload of creating postvis for Infinity War while producing previs for the sequel, a much bigger team was needed. Advance preparation was critical when developing the CG assets. “We had to plan ahead for big characters like Thanos. We would look to develop different effects or technological concepts to understand how they should be positioned to shoot, knowing that we would do cleanup on them later on in post.”

Thirteen vendors were recruited to handle the film’s 2,500 visual effects shots. “There are definitely shared shots,” states DeLeeuw. “Just like casting an actor for a role in a film, you cast visual effects companies for the characters they’re creating. We’ve got Weta Digital and Digital Domain working on Thanos. Then Thanos will interact with other CG characters that are shared with ILM or Framestore or Method Studios. The difficulties are more to do with tracking and keeping everything in line with the look in the shared shots.” Heavy simulation work was produced by Weta Digital and ILM. “It’s an atmospheric show and that was part of Charles Wood’s production design. It’s the visceral feeling of the landscape with smoke, dust, rain and sparks floating around.”

“I sit down with Dan DeLeeuw and Gerardo Ramirez [Previs/Postvis Supervisor at The Third Floor] to work out what we want to have in the frame, usually based on a previous previsualization or concept art or storyboard,” explains Ford. “We work together to add images to the plate work and keep refining until the vendors take over to make the shots beautiful and give them photoreal qualities.”

“Definitely we were dealing with more non-practical locations on this film than we have on the previous two,” notes Russo. “But we put a lot of time and thought into what those worlds are before getting to the execution phase so that we have all of the details worked out. Charles Wood is a brilliant production designer, and even though we are working in fantastical environments he provides us with a lot of tools that the camera can use for framing, the actors can use for their blocking, and we as directors can use to sort everything out.”

A variety of places are showcased in Infinity War. “We had a wall of the different places that we would visit in the film, with each of them having their style and color,” remarks DeLeeuw. “We shot plates or texture stills in order to be able to recreate that environment based on what was captured on the stage or in the areas around the studio in Atlanta.”

Robert Downey Jr. wears an Iron Man bust with an LED RT controlled by dimmers, which were turned on and off to conserve battery power. “We would be constantly referring back to [Marvel Studios Visual Effects Supervisor] Swen Gillberg about whether or not he wanted interactive light for a specific gag, and if we did, what level of detail there should be,” remarks Opaloch. “It is hard to determine on these movies what the end result will want to be when we’re shooting, because things change. If there’s a big lighting event that happens, like an explosion, we would almost always do a plate shot to capture that. It could be used as an actual element in the composite, or as a reference for the artists. We try to do a general interaction on the actor’s face that allows it to be perfectly synced up with the final version.”

Lighting equipment consisted of HMIs, tungsten and LEDs. “Dimmer board operator Scott Barnes helped gaffer Jeff Murrell and me to design the lighting schemes and plans to create otherworldly effects for when they were up in space, on another planet, or going through some crazy event happening here on Earth,” remarks Opaloch. “One of the biggest challenges that we had was lighting up Edinburgh [Scotland] at night. They used to heat the buildings with a lot of coal fires, so there is a thick black soot that covers everything and absorbs the light. We had hundreds of SkyPanels and an extensive pre-light schedule just to be able to see anything in that shot at night. We were lighting from the rooftop of a hotel located next to the Edinburgh train station. It was myself, Jeff Murrell and the electric team remotely casting whatever color light we wanted on the entire city. That was impressive to see.”

“We kept the mocap running and recording in between the takes when Josh [Brolin] was not being the supreme emperor of the universe, but more of a thoughtful, introspective character. Those were the lines picked for our tests because if we were able to get those subtle performances onto the Thanos model, then we were in a good spot to replicate whatever Josh Brolin came up with.” —Dan DeLeeuw, VFX Supervisor, Marvel Studios

TOP: Spider-Man (Tom Holland) in Avengers: Infinity War. “The Russos shoot action in a character-driven, narrative way,” notes the film’s editor, Jeff Ford. (Photo: Film Frame) MIDDLE: Thanos is the central character, and his introductory scene laid the foundation for the storytelling. (Photo: Film Frame) BOTTOM: Scenes with the Guardians of the Galaxy characters have deep color saturation and dynamics because they go through some crazy environments. (Photo: Film Frame)


TOP: Atmospherics such as smoke, dust, rain and sparks interacting with the landscape were part of the production design devised by Charles Wood. (Photo: Film Frame) BOTTOM: 850 hours of footage were captured for the back-to-back production, with six to eight weeks of additional photography for scenes that were reinstated for the sequel to Infinity War. (Photo: Film Frame)


A close partnership has developed between the stunts, special effects and visual effects departments. “We have all worked together since Winter Soldier,” notes DeLeeuw. “The conversation is more about what can you do to hide whatever wires or pads they needed to work with the stunt person. The special effects have gotten interesting with [Special Effects Supervisor] Dan Sudick; whereas there are the traditional things with the explosions, wirework and rigging, a big part of his job is fabrication. Dan has a machine shop that can cut all of the sheets of aluminum, wood and foam, so anything we ask him to build he can construct for us. Dan fabricated us a Thanos torso that he cut out of foam based on our CG model, made a cast and created a rubberized version that Josh Brolin could wear. It was the craziest looking thing, with Brolin’s arms sticking out of the side, but it enabled characters to walk up and touch him in a realistic way.”

“On Winter Soldier, we worked hard with Markus and McFeely in pre-production to make sure that the script was exactly what we were going to execute,” notes Russo. “There’s a remarkable consistency between the production draft, what we shot, the first edit, and what came out. These movies are different in the sense that we’re dealing with more characters and storylines, so we had more choices to make in terms of how the stories were woven together.”

The visual spectacle doesn’t overshadow the importance of the characters. “Joe and I have always been actor-oriented as directors. We’re sensitive to what performers do, and look for the nuances in that to help guide us in our storytelling. As much as the action excites us to choreograph, execute and conceive, the heart of these films is always the character work, and that’s where we put the most time and effort, and get the most thrill from.”

The biggest challenge for Avengers: Infinity War was the size of the cast. “There are so many high-level actors working in the film that making sure they were available when we needed them was complicated,” notes Russo. “The value of it is that we get to have a cast the likes of which has never been seen before in a movie. After being teased for so long in the MCU, the introduction of Thanos is a special scene for us.”

DeLeeuw is pleased with the range of performance achieved with Thanos, including menace, sadness and anger. “We’ve got the big battles and a lot of the things that we’re famous for. However, we also have this incredibly evil character who is sympathetic. In some ways you’re almost rooting for the villain! One of the considerations for the show was how to marry all of the four corners of the Marvel Universe in what the directors would call ‘strange alchemy.’” DeLeeuw adds, “There is nothing small in Infinity War. Everybody gets a moment to shine, and all of the sequences are great.”

TOP LEFT: While the 2,500 visual effects shots were in post-production for Infinity War, previs needed to be created for the sequel, which was still shooting. (Photo: Film Frame) TOP RIGHT: Numerous on-set props were replaced digitally, but were critical in providing believable interaction with actors. (Photo: Film Frame) BOTTOM LEFT: Thanos is on a mission to collect all of the Infinity Stones, including the Mind Stone found in the forehead of Vision. (Photo: Film Frame) BOTTOM RIGHT: Never one to stop inventing and revising, Tony Stark (Robert Downey Jr.) gives his Iron Man suit yet another upgrade. (Photo: Film Frame)



FILM

HOW THE AVENGERS’ SUPER POWERS BECAME EVEN MORE SUPER By IAN FAILES

The cataclysmic ending of Avengers: Infinity War sees the apparent loss of several key characters in the Marvel Cinematic Universe at the hands of Thanos. Leading up to that moment, a number of the Avengers try, in vain, to take on the villain on his home planet of Titan. Weta Digital handled these Titan scenes, which included providing Doctor Strange, Iron Man and Spider-Man with a raft of impressive new superhero abilities, from golden lightning to nano-particle-filled suits. VFX Voice sat down with Weta Digital Visual Effects Supervisor Matt Aitken to discuss how the studio amped up the Avengers’ super powers.

VFX Voice: Let’s talk about Doctor Strange’s golden lightning. How did you tackle that?

Aitken: Doctor Strange is able to effectively generate streams of lightning, and, because he knows if he just fires them straight at Thanos that he will sidestep them, he ricochets them off some large floating rocks that Thanos is standing on. The golden lightning has so much energy that whenever it hits a rock, the rock explodes into lava, so we had individual fluid sims running on each of these impacts, with lava, sparks and smoke. It was done with a combination of Houdini and Synapse, our in-house tool. We had a great effects team working on the show, and it was a big effects show. I asked the head of the effects department how [Avengers: Infinity War] compared with Guardians of the Galaxy Vol. 2, and he said, shot for shot, [Avengers: Infinity War] was about 50% more complex on the effects side.

VFX Voice: Iron Man is wearing his new nano-particle suit in this battle. What effects challenges were involved here?

Aitken: The way the fight proceeds is that, initially, Iron Man is trying out all these different weapons; we built 12 different weapons. We got concept art for about four of them, and so we used that as a guide and worked up the other eight or so ourselves. We could have done concept art for those, but we actually just worked them up as geometry and got them approved as geometric turntables. Part of the way those weapons form is that the nano-particles are moving around on the suit, and at some point Thanos is getting the upper hand over Tony. One of the ways he’s achieving that is by punching chunks of the nano-tech off the suit. He’ll smash Iron Man’s helmet, and large chunks of the helmet will fly away. Once they get beyond a certain distance from the suit, they can’t make it back, so the volume of nano material that Iron Man has to work with gets gradually reduced.

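Aitken describes the suit’s nano material as a finite reserve: chunks punched off beyond a recall distance are lost for good. As a rough illustration only (NanoBudget, recall_radius and the numbers are invented here, not Weta Digital’s pipeline), the bookkeeping might look like this in Python:

```python
from dataclasses import dataclass

@dataclass
class NanoBudget:
    """Toy bookkeeping for the suit's finite nano material.

    All names and values are illustrative assumptions, not
    Weta Digital's actual setup.
    """
    volume: float          # remaining nano material, arbitrary units
    recall_radius: float   # chunks knocked farther than this are lost

    def chunk_knocked_off(self, chunk_volume: float, distance: float) -> float:
        """Deduct a chunk punched off the suit; return remaining volume."""
        if distance > self.recall_radius:
            # Beyond the recall radius the material can't flow back.
            self.volume = max(0.0, self.volume - chunk_volume)
        return self.volume

suit = NanoBudget(volume=100.0, recall_radius=2.5)
suit.chunk_knocked_off(chunk_volume=8.0, distance=4.0)  # lost for good -> 92.0
suit.chunk_knocked_off(chunk_volume=5.0, distance=1.0)  # recalled -> still 92.0
```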

VFX Voice: How was the Thanos and Iron Man fight filmed?

Aitken: Robert Downey Jr. did that on stage, either with Josh Brolin as Thanos or with a stuntie standing in for Josh. They wore a cardboard shoulder strap version of Thanos’ bust of the shoulders and head, so that there was an eyeline match that could be referred to. Essentially, Robert just wore the suit that he wears under the Iron Man suit, but we ended up invariably roto’ing his head off and replacing everything else, which was easier than it sounds, because we were replacing the environment anyway.

VFX Voice: Spider-Man is in the battle, too, with a new suit. What were the visual effects components here?

Aitken: He’s got the ‘Iron Spider’ suit, which has similar technology to Iron Man’s bleeding-edge nano-tech suit, so it had some similarities in the look, but it had to be a distinctive Spider-Man suit as well. He’s got the hood on, but he’s still often talking. We made the eyes in the suit a little bit like the blades in a camera iris. We built eye blend-shape targets for the eye expressions, but then our facial rigging team essentially wrapped cloth over his face and went through all the different phonemes and all the different blend-shape targets and created a cloth-wrapped version of all those shapes. Then those became the facial-animation rig for the hood, and so it meant that as far as the animators were concerned, they had exactly the same controls to animate the rest of Spidey’s face as they would have for the whole of Spidey’s face.

VFX Voice: After Thanos snaps his fingers, Strange and Spider-Man are among those who disintegrate. Can you talk about that effect?

Aitken: The way that this visually manifests itself is that they ‘blip out.’ It had to look violent, but not too violent. It had to look final – the brief we got was that this can’t look like you can come back from it. Essentially, it’s like a wave that passes over these characters, and they turn to – not ash, but a kind of dust and flakes that blow away in the wind.

VFX Voice: What reference did you look to for that ashy feel?

Aitken: We didn’t want it to be like anything else – we wanted it to be something different. It’s [comprised of] procedural mattes that are able to move across the body, so we needed really good digital doubles. You want to avoid transitions where you can, so it’s easier if they’re just digital all the way through. We had an initial matte, which is where the flakes are starting to form, and then another matte, which was where the flakes start to drift off. Often, these are coming through really fast, because it all has to happen in quite a constrained time frame. There’s a beat in the Strange blip-out where his face almost collapses in on itself, and there’s a frame where it looks like a skull, because his eyes have sunken in on themselves, so it’s quite creepy. At one point we had introduced, as part of one of these blips, an energy, like an internal green glow that was illuminating all the flakes, and we had that going for a while. But then the studio reviewed it and decided it looked too much like they were transporting, like they were beaming up or beaming down. They didn’t want it to look like these characters were going somewhere else. They wanted it to have that finality that we were talking about, because this is how part one ends: on a rather somber note.
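The two traveling mattes Aitken describes map naturally onto a noise-threshold dissolve, a common generic technique. The NumPy sketch below is an assumption-laden illustration of that idea, not Weta Digital’s actual tools; blip_mattes, wave_t and the falloff widths are all invented for the example:

```python
import numpy as np

def blip_mattes(noise: np.ndarray, wave_t: float,
                form_width: float = 0.1, release_width: float = 0.25):
    """Two procedural mattes sweeping across a character's surface.

    noise: per-point scalar noise in [0, 1] over the body (e.g. from a
    3D noise lookup on a digital double). As wave_t rises over the
    shot, low-noise points 'flake' first, so the effect reads as a
    wave passing over the character. Returns (forming, releasing):
    where flakes start to appear, and where they detach and blow away.
    """
    forming = np.clip((wave_t - noise) / form_width, 0.0, 1.0)
    releasing = np.clip((wave_t - noise - form_width) / release_width, 0.0, 1.0)
    return forming, releasing

# Example: 10,000 surface points, wave two-thirds of the way across.
pts = np.random.default_rng(0).random(10_000)
forming, releasing = blip_mattes(pts, wave_t=0.66)
```

Animating wave_t from 0 to above 1 drives the whole body through the forming-then-releasing sequence within the constrained time frame Aitken mentions.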

OPPOSITE PAGE TOP: A final shot of Tony Stark’s new shield in use. OPPOSITE PAGE MIDDLE: Weta Digital’s concept frame for Doctor Strange’s golden lightning. OPPOSITE PAGE BOTTOM: Final shot of Doctor Strange. THIS PAGE TOP: Effects simulations made the nano-particle components of Iron Man’s new suit possible. SECOND FROM TOP: A final shot of Iron Man’s suit in action. THIRD FROM TOP: Final shot of Spider-Man in his new Iron Spider suit. BOTTOM: Here’s Weta Digital’s effects pass of Iron Man’s shield.





FILM

BLACK PANTHER: BRINGING THE BATTLE OF WAKANDA TO LIFE By IAN FAILES

The third act of Ryan Coogler’s Black Panther features a brutal battle set in the Mount Bashenga area of the mystical nation of Wakanda. The action pits the king of Wakanda, T’Challa (Chadwick Boseman) – aka Black Panther – against his adversary Erik Killmonger (Michael B. Jordan), and involves a host of Wakandan tribes. To help bring that battle to the big screen, overall Visual Effects Supervisor Geoffrey Baumann enlisted Method Studios to build out the Mount Bashenga environment digitally, and to render several digital doubles, vehicles, creatures and weapon effects for the sequence. VFX Voice sat down with Method Studios Visual Effects Supervisor Andy Brown and Associate Visual Effects Supervisor Todd Sheridan Perry in Vancouver to find out how orchestrating the massive melee was handled.

BATTLE PLANS

The area Method Studios was required to build reached 3,600 square kilometers, centering on a giant spire that marks the top of a vibranium mine shaft. Around that zone, artists crafted a digital environment somewhat inspired by the Paarl Rock area in South Africa, a huge granite rock formation with rounded outcrops. Into that environment, the studio inserted live-action photography from two shooting sessions that took place in Georgia, and also incorporated several CG characters, creatures and flying ships.

“The set was built on a ranch in Georgia,” outlines Perry. “The battle scenes were split up into two shoots. I was on the principal one and Andy went on the other re-shoot once the scope of the battle increased. They had built the bottom part of the set, the platform that the ships are sitting on and that Killmonger faces off against the Dora Milaje warriors on.”

“They had eight little rigs there,” adds Perry, “with huge bluescreens on them that they would just continually shift around depending on which angle we were facing to try to get as much as possible there in camera. We were doing a ton of rotoscoping, not just of people, but also shadows, because you’re outside so the sunlight changes constantly.”

In addition to the Paarl Rock inspiration for the spire area, Method Studios also had to build out a vaster area. Says Brown: “We spent a lot of time developing that from the main shoot that was happening. It was an extension out from the set, so the set was LIDAR scanned and then we built beyond that. The spire took a lot of development. We had some early previs to go off of, but the actual architecture or structure had to be fleshed out more. Our build went right down into the mine shaft. Then the views behind the spire with the big crater were inspired more by the areas around the Congo. And there was a back story here in terms of the meteorite crash, which contains the vibranium that the Wakandans use for everything.”

TOP LEFT: A wide view of the Mount Bashenga environment crafted by Method Studios. TOP RIGHT: For shots below the spire, including where Black Panther and Killmonger tumble down the shaft, Method Studios worked on a vast mine environment. This was shared with another vendor, Double Negative, for scenes of the characters fighting on the train tracks. BOTTOM LEFT: Killmonger, who usurps Black Panther as king, relies on a suit with powerful nanotechnology. Method Studios created the CG suits based on scans and photogrammetry of the on-set suits, then worked with meticulous detail in modeling and texturing. The assets were shared with several other vendors on the film. BOTTOM RIGHT: Digital doubles, vehicles and background environments featured heavily in the battle.

All images © 2018 Marvel Studios
TOP: Black Panther uses kinetic energy absorbed in his vibranium suit to create an energy blast. Method Studios handled effects such as these in Houdini.

VIBRANIUM INSPIRATIONS






Seen in the battle are a myriad of weapons, most based on vibranium technology. Method Studios referenced cymatics to develop the way weapons looked and operated, as well as engines and other power-emitting devices. “Vibranium technology was based on vibrations and vibranium itself being able to absorb energy,” explains Brown. “The idea is that it kind of builds up and up, and you can use it, then release it. And that was all based itself on sound energy waves and the patterns that those waves form.” To research that energy wave look, Method Studios looked at what happens when a speaker is positioned near water, or when dust particles or sand are placed directly on a reverberating speaker. “Then,” says Perry, “our Houdini team created a mathematical solution that would give us these patterns, and we would use that to drive the effects. That was our starting point for any kind of thing we were developing as a vibranium weapon – we’d need to figure out how to involve cymatics. We would even have cymatic patterns in things like a ship landing. The dust on the ground would shift and form these patterns underneath.”

“[The vibranium weapons] were based on vibrations and vibranium itself being able to absorb energy. The idea is that it kind of builds up and up, and you can use it, then release it. And that was all based itself on sound energy waves and the patterns that those waves form.”
—Andy Brown, Visual Effects Supervisor, Method Studios
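Method’s in-house Houdini solution isn’t public, but the cymatic figures Perry describes (sand gathering along the still lines of a vibrating plate) are classically approximated by the standing-wave pattern of a square plate. A minimal NumPy sketch of that textbook formula, with an illustrative function name and tolerance:

```python
import numpy as np

def chladni_nodes(n: int, m: int, res: int = 512, tol: float = 0.04):
    """Nodal-line mask for a square plate in its (n, m) vibration mode.

    Sand on a vibrating plate collects where the amplitude is near
    zero, which the classic approximation places along the zeros of
    cos(n*pi*x)cos(m*pi*y) - cos(m*pi*x)cos(n*pi*y). Textbook formula
    only, not Method Studios' actual Houdini setup. Use n != m, since
    equal mode numbers make the expression vanish everywhere.
    """
    x = np.linspace(0.0, 1.0, res)
    X, Y = np.meshgrid(x, x)
    amp = (np.cos(n * np.pi * X) * np.cos(m * np.pi * Y)
           - np.cos(m * np.pi * X) * np.cos(n * np.pi * Y))
    return np.abs(amp) < tol  # True along the pattern's nodal lines

# Sweeping n and m over time would make a dust pattern shift and
# reform, much like the ship-landing effect described above.
pattern = chladni_nodes(3, 5)
```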

TOP TO BOTTOM: Tribe members generate shields with their blankets. Method Studios carried out extensive look development on the force-field effect. Black Panther mid-battle. During the melee, the character relied on his super strength, but also the kinetic-energy attributes of his suit. Method Studios’ contributions to the sequence also included several flying vehicles, such as these dragonfly-like ships.

CG WAKANDANS

Digital doubles were a large part of Method Studios’ work on the battle sequence. Black Panther and Killmonger were two of the most ‘hero’ CG assets – a large part of the work here involved crafting their nanotechnology-based suits. Modeling and texturing the Panther suits meant replicating a multitude of tiny triangle formations and Wakandan glyphs through intricate texturing.

“Our asset team is pretty amazing,” comments Perry. “They put those together really, really fast as far as the textures and everything. We had an amazing amount of textures from Clear Angle, who did the original scans of the on-set suits, and then we had the photogrammetry photography as well as a texture shoot that they did, and then we had them send the actual costumes to us so that we could get additional detail if we needed to.”

In addition to the hero actors, there were also digi-doubles required for tribe members in crowd shots and even front-and-center clashes. Method roto-animated the fight choreography from the live-action photography in order to capture distinctive styles of fighting among the tribes. “And then we also did a couple mo-cap shoots where we had a number of the stunt people who were at the original shoot,” says Perry. “We devised fight antics by giving them 10 or 15 minutes to work through a choreography, and that gave us a collection of all of these pieces.”

When Rhinos Attack One of the interesting challenges in the battle turned out to be ‘battle rhinos’ wearing armor. These began as CG assets shared from Method Studios’ sister company Iloura (now part of Method), which had just delivered VFX for a rhino-filled scene in Jumanji: Welcome to the Jungle. Method Studios had also recently worked on Okja, a film that featured a creature with a somewhat similar skin quality to the rhinos. “All of the animators and the modelers and so forth who had spent a year developing the Okja character now had all of this experience doing these quadrupeds of that size and then the riggers and tech animators had the muscle systems already in place,” says Perry. “And so some of our artists in the asset department, they just moved onto it and they just kind of knew what to do.” On set, production mostly used a stick placeholder in lieu of any real rhinos. A number of Clydesdale horses were also used, with riders, to act as stand-ins for W’Kabi (Daniel Kaluuya), who rides one of the creatures. “It gives people eyelines and that type of thing, but in some cases we just used a really, really well developed W’Kabi digi-double,” notes Perry. “We grabbed his head off of the performance plate and put it onto the CG body, so you’ve got the natural performance from Daniel, and then you had animation doing all of the step down until he’s just coming off the horse and then you match that.”

“During those mo-cap sessions,” notes Perry, “we’d say, ‘Now, this is a Dora, she’s taking on three guys at one time. So she comes in through here, through here, through here.’ They’d then run through it a couple of times, and that gave us a collection to use for crowds. The crowds would be for the background action. Then our keyframe animation was mid-ground, and live-action was mostly close to camera. We just mixed and matched these to try to get them to work.” BATTLE ICONS

A shot that has turned out to be one of the iconic moments of the battle has Shuri (Letitia Wright), Black Panther’s sister, wielding a set of gauntlets on her hands, which shoot out some high-tech weaponry. “Unofficially, we called them ‘kitten mittens’,” says Perry. “Our modeler is very, very proud of these things. There were scans of the practical ones and there was an opened and closed version of them, and so we just took that to be, oh, this thing kind of activates and opens up. So in the model he made them transform and had these little animations in there. “And then we sent it to production,” continues Perry, “and the word back was, ‘Wow, this is really cool. We’re going to make something for this moment.’ That was really cool, to contribute to the story like that.”

TOP TO BOTTOM: The end of the battle nears. The spire area seen at a different time of day. One of Method Studios’ digital rhinos goes on the attack. The CG creatures benefited in particular from earlier work the company’s sister studio, Iloura, had done for Jumanji: Welcome to the Jungle.





PROFILE

RACHEL DAY: THE GAME-CHANGING GAMEMAKER BEHIND OVERWATCH By WILLIE CLARK

Images courtesy of Blizzard Entertainment


Rachel Day, 32, has been consuming VFX since she was young. Born in Long Beach, California, Day remembers going to the drive-in to see Indiana Jones, taking in “all the exciting, crumbling, pure adventure that’s going on in those movies,” she says, and absorbing the impact of VFX. Now a Senior VFX Artist at videogame trailblazer Blizzard Entertainment, Day works on Overwatch.

Originally released in 2016, Overwatch has been a phenomenon for Blizzard. The game has over 35 million players worldwide, and has even spawned the eSports Overwatch League, whose first season started in January of this year. Teams for the league represent cities such as Los Angeles, Seoul, Shanghai and New York. That’s how far around the world the game has reached.

“Working with Rachel is always one of the highlights of my day,” says Renaud Galand, Lead Character Artist on Overwatch at Blizzard. “She’s not only a great artist but an amazing game developer and collaborator, using her technical skills, artistic eye and flawless communication to always help us push the boundaries of what’s possible in the world of Overwatch.”

As for gaming, Day grew up with Atari and the likes of Pitfall and Pac-Man. “From the time I can remember anything, I had a controller in my hand,” Day says, but if she had to pick, Day named Warcraft II: Tides of Darkness as the tipping point when she realized she was really into gaming. “That was the first game where I would come from school and hurry up and get all my homework done immediately so that I could just play until I fell asleep,” Day says. “That was the first time where I was like, ‘Yeah, this is cool, I really like video games.’” Now Day works at the very company that was responsible for Warcraft II.

Even though she grew up gaming, Day isn’t sure why she never thought of it as a career. One of her friends got a job at Blizzard, and not long after that Day went to the Art Institute of Orange County to become a gamemaker. Day’s first industry job was interning at Blizzard in quality assurance during the development of World of Warcraft: Wrath of the Lich King in 2008. She then returned the following year to intern again in QA for StarCraft II: Wings of Liberty. “I consider myself a smart person because I took that time in college to talk to everybody at Blizzard that I could, and network and tell people, ‘I really want to be a technical artist when I graduate college.’ And that paid off, because four months before I graduated I got a call.”

That led to Day joining the Diablo III team as an associate technical artist. She thought she wanted to be a technical artist, but her lead suggested she try out VFX. “I’m a pretty technical person, I like to dig into the logic and problem-solving side of things, but I also love color and animation and impact and gameplay, and VFX in video games is really the trifecta of those three things,” Day says. “I get to have a little bit of design influence, flex my logic skills creating shaders and things like that, and I really get to be an artist, visually creating what people are seeing and how they’re playing.”

Even as Day works on Overwatch, the popularity of the game is something that she hates thinking about. “It is terrifying to think that 35 million people, or however many it is now, have seen my art and interacted with my art on a daily basis,” Day says. “It’s insane.”

As for the role of VFX in a game like Overwatch, Day thinks having that visual feedback is a very important part of a game. “In my perspective, VFX is the visual telling of gameplay,” she says. “So we work hand in hand with the designers. If we have a projectile, they’re telling us how fast it should go, and I tell them what it looks like when it’s flying through the air. So it’s really this relationship, back and forth between us and design and gameplay engineering, to create the experience that people are having. Does it feel impactful when a projectile hits the wall? Can you tell that something happened immediately? Are you giving that feedback? It’s important.”

That’s not to say that the effects can’t sometimes go too far in one direction. “That’s the fun part of my job, honestly. I like to push everything to 11 or 12 and then dial it back a little bit,” Day says. “When we’re making a system, I’ll put 20 or so different assets in and go, ‘Oh wow, that’s amazing. OK... it’s too big, it’s whatever, let’s dial it back a little bit.’ We’re constantly playing the pendulum game of ‘swing it too far, OK, bring it back a little bit more.’ That’s the fun of the balance of this work.”

For Overwatch, Day worked on Hanzo’s dragon (“Figuring out how to make a giant portal with two dragons that are the size of the map be performant, look good and be impactful was a super-fun challenge”) and Orisa’s entire effects package (“I really think the colors on that are harmonious”). She also thinks that Overwatch’s style allows for more freedom, as opposed to games that are more hyper-realistic. “It’s really the limits of your imagination,” Day says. “You can look at fire and go, ‘OK, I know what fire’s supposed to look like,’ but I don’t have to replicate it. I can make the feeling of fire and everybody knows what it is, it just doesn’t have to look exactly like real life. It’s boundless.”
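Day’s projectile example boils down to an event contract: design owns the tuning numbers, and VFX subscribes to gameplay events so feedback appears the instant something happens. A hypothetical, engine-agnostic sketch, with every name invented for illustration:

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Projectile:
    speed: float                       # the number design tunes
    position: float = 0.0
    alive: bool = True
    on_impact: List[Callable[[float], None]] = field(default_factory=list)

    def update(self, dt: float, wall_at: float) -> None:
        """Advance the projectile; fire impact callbacks exactly once."""
        if not self.alive:
            return
        self.position += self.speed * dt
        if self.position >= wall_at:
            self.alive = False
            for fx in self.on_impact:
                fx(wall_at)  # immediate, readable feedback for the player

def impact_fx(where: float) -> None:
    # The VFX artist's side of the contract: what the hit looks like.
    print(f"spawn flash, sparks and decal at x={where:.2f}")

bolt = Projectile(speed=40.0, on_impact=[impact_fx])
for _ in range(20):
    bolt.update(dt=0.016, wall_at=5.0)
```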

“I’m a pretty technical person, I like to dig into the logic and problem-solving side of things, but I also love color and animation and impact and gameplay, and VFX in video games is really the trifecta of those three things. I get to have a little bit of design influence, flex my logic skills creating shaders... and I really get to be an artist, visually creating what people are seeing and how they’re playing.” —Rachel Day

OPPOSITE TOP: Rachel Day OPPOSITE BOTTOM: Hanzo from Overwatch TOP LEFT AND RIGHT: Hanzo and a scene from Overwatch






“It is terrifying to think that 35 million people, or however many it is now, have seen my art and interacted with my art on a daily basis. It’s insane.” —Rachel Day

TOP LEFT: Doomfist from Overwatch TOP RIGHT, BOTTOM AND OPPOSITE TOP: Orisa from Overwatch OPPOSITE BOTTOM: From the Oasis map in Overwatch


“I don’t think anything we’ve done has been super innovative,” she adds. “We have a very stylized, hand-painted, basic kind of effects system that we use. But we can really get creative with what we do.”

As Overwatch continues to expand, the team also has to keep an eye on performance. “If our games aren’t playing snappy, then they aren’t fun,” Day says. “So we have to make sure that we’re keeping an eye on things and making them beautiful and impactful and also playing well at the same time.”

With Diablo and Overwatch, Day has had her hands in two incredibly popular franchises, but there is still one area she feels she should have studied more. “Having stumbled into this, and not had school training on it, I wish I had paid attention to my animation classes a little bit more,” Day says. “Learning timing is such an instinctual thing. I think it’s important to understand the fundamentals of animation and the basics to understand effects. The same principles apply to effects as well as animation.”

Looking ahead, Day sees more sims being used in real-time VFX, with developers taking simulations from Houdini, stylizing them and bringing them into the game. She doesn’t think the process has been fully honed, but she’s excited at the prospect.

Day says the part of her job she enjoys most is talking to the whole team. “I’m talking to character artists one day and gameplay engineers the next day, designers after that. I really feel like I have my hand in just about every aspect of making a game, so it’s a fantastic vantage point in the process to be a developer of a great game.”
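One common route for the Houdini-to-real-time workflow Day mentions is baking a simulation into a flipbook (sprite-sheet) texture and indexing frames at runtime. The sketch below shows only that frame-indexing math, with illustrative names; it is not Blizzard’s pipeline:

```python
def flipbook_frame_uv(time_s: float, fps: float = 30.0,
                      cols: int = 8, rows: int = 8):
    """UV offset and scale for sampling one frame of a baked sim.

    The offline sim (e.g. from Houdini) is rendered into a cols x rows
    sprite sheet; a shader uses the returned (u0, v0, du, dv) to map a
    particle quad onto the current frame. Illustrative sketch only.
    """
    total_frames = cols * rows
    frame = int(time_s * fps) % total_frames   # loop the flipbook
    u0 = (frame % cols) / cols                 # left edge of sub-frame
    v0 = (frame // cols) / rows                # top edge of sub-frame
    return u0, v0, 1.0 / cols, 1.0 / rows

# Example: 1.2 s into playback on an 8x8 sheet lands on frame 36.
print(flipbook_frame_uv(1.2))
```

Stylization (hand-painted cleanup, recoloring, timing tweaks) happens on the baked frames before they ship, which keeps the runtime cost to a texture fetch.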





COMPANY SPOTLIGHT

ILP: SWEDISH VFX TALENT COMES HOME TO FIND THE WORLD AT THEIR DOORSTEP By CHRIS McGOWAN


COMPANY SPOTLIGHT

ILP: SWEDISH VFX TALENT COMES HOME TO FIND THE WORLD AT THEIR DOORSTEP
By CHRIS McGOWAN

78 • VFXVOICE.COM SUMMER 2018

One of the international visual effects and digital animation studios that has come to the forefront in recent years is a Swedish firm with the Monty Python-esque name of Important Looking Pirates (ILP). Founded by four colleagues in Stockholm in 2007, ILP initially worked mostly on commercials, then jumped into the TV and motion picture big leagues with the Norwegian film Kon-Tiki (2012) and the NBC series Crossbones (2014). Starting with the latter, ILP has received VES Award nominations every year, and it shared VES Awards for Black Sails in 2017 and 2018. ILP has also worked on VFX for Lost in Space, Westworld, Star Wars: The Last Jedi, and Jurassic World: Fallen Kingdom. Other projects include Geostorm, Inhumans, Everest, The Magicians, Outlander, Fear the Walking Dead and Krypton. ILP has grown to around 100 people working in its office, 85 of them staff and 15 contractors.

The roots of ILP reach back to when two of the co-founders, VFX artists Niklas Jacobson and Yafei Wu, were working in London “when it was booming with special effects movies,” recalls Jacobson. In London, Jacobson moved into film, while Wu stayed with commercials. Then Wu moved to Los Angeles to start up a 3D department at Brickyard VFX, while Jacobson commuted between London and Stockholm. “We started talking about creating something back home,” continues Jacobson. “There was no big visual effects industry in Sweden at the time. It was mostly commercials work.” The domestic films and series in production there had little need of major VFX, as they tended to be down-to-earth, realistic dramas rather than action, fantasy or sci-fi movies in need of heavy CGI. But in today’s global film industry, in areas like post-production, it doesn’t really matter where you live, as long as you can access the Internet. “Yafei had been on the road for a few years and we talked about settling down in one place. We had these hopes and dreams of starting a company and doing the same kind of epic, full work we were seeing in the United States and London.”

Jacobson and Wu, along with Eric Hermelin and Carl Hermelin, founded ILP, which took its moniker from the videogame world (not, as it turns out, from Monty Python). “It is a nerd reference for sure,” says Måns Björklund, ILP’s Effects Executive Producer. “It’s from a ‘90s adventure game called Monkey Island, from LucasArts [founded by George Lucas]. In the beginning of the game you enter a bar and there’s a bunch of important looking pirates sitting in there.” Curiously, ILP would collaborate with the similar-sounding ILM (Industrial Light & Magic, also founded by Lucas) on Star Wars: The Last Jedi, “so the circle was kind of closed,” muses Jacobson, now a Visual Effects Supervisor at ILP.

With its unique name in hand, ILP gained major notice within the global VFX community with its work on Kon-Tiki. Directed by Joachim Rønning and Espen Sandberg, the film garnered an Academy Award nomination for Best Foreign Language Film. “We did sharks and water work, and could do it properly,” recalls Jacobson. “That put us on the map internationally.”

“Kon-Tiki went viral, at least in our business,” says Björklund. “It was shared by ILM, Weta, Rhythm & Hues and others,” adds Jacobson, “and they were all like, ‘Who the hell are these Swedes that we’ve never heard of?’ ” He laughs.

They gained the attention of veteran VFX Supervisor Kevin Blank (Xena, Hercules, Lost, Alias, Fringe and Star Trek: Enterprise). “He was one of the first to assign work to us. He convinced Crossbones to work with us.” The 2014 pirate series, starring John Malkovich and Claire Foy, lasted only nine episodes, but it got ILP into the U.S. effects arena. “The ships we built for Crossbones led us to Black Sails. Also the sharks – that’s part of why we got it,” says Jacobson. “That’s the good thing with this industry – if you’ve done good work, everybody has the Internet and you can display it. There might be people who are interested in working together with you.” More attention arrived when ILP’s Jacobson and Björklund shared a VES nomination with Blank, Ron Pogue and Andy Weder for Crossbones. “That was the birth of us getting into television, and we have been working consistently since,” notes Jacobson.

ILP is now involved in film and TV, as well as commercials and video games (such as EA’s Battlefield and Star Wars Battlefront). One of ILP’s biggest recent undertakings has been Westworld, in which the studio had the interesting task in season one of taking 40 years off the age of Anthony Hopkins for scenes with a young Dr. Robert Ford, the park founder. ILP also worked on shots that reveal Dolores’s exoskeleton and a young boy’s robotic face, created scenes of steam trains and modern train stations, and crafted the enormous terraforming machine that excavates and reshapes landscapes. A major current project is the Netflix series Lost in Space, a new take on the popular ‘60s TV show. Terron Pratt, who worked with ILP on Black Sails, is a Visual Effects Producer for Lost in Space.

OPPOSITE TOP: Two of ILP’s four Co-founders, Yafei Wu (left) and Niklas Jacobson, in Stockholm. (Image courtesy of ILP) OPPOSITE BOTTOM: Måns Björklund, ILP Effects Executive Producer, and Niklas Jacobson, ILP Co-founder and Visual Effects Supervisor, in Stockholm. (Image courtesy of ILP) TOP: Creating realistic sharks was part of ILP’s work for the 2012 film Kon-Tiki, directed by Joachim Rønning and Espen Sandberg, which garnered an Academy Award nomination for Best Foreign Language Film. (Image © 2012 Nordisk Film) BOTTOM: Young Dr. Ford: ILP’s VFX efforts rejuvenated actor Anthony Hopkins by 40 years in Westworld. (Image © 2017 HBO)

SUMMER 2018 VFXVOICE.COM • 79



“The ships we built for Crossbones led us to Black Sails. Also the sharks – that’s part of why we got it. That’s the good thing with this industry – if you’ve done good work everybody has the Internet and you can display it. There might be people who are interested in working together with you.” —Niklas Jacobson, Co-founder/VFX Artist, ILP

TOP LEFT AND RIGHT: ILP created VFX ships, sails, sharks and water for the Black Sails series on the Starz Network (2014-2017), set in the Golden Age of Piracy in the 1715 West Indies. (Image © Starz Entertainment) BOTTOM LEFT AND RIGHT: The Robot Boy was a VFX highlight for ILP in Westworld. (Image © 2017 HBO)

80 • VFXVOICE.COM SUMMER 2018

“Black Sails was the start of a great working relationship,” says Jacobson. “We had such a blast working on it for two seasons, and we really enjoyed working with both Terron and Erik Henry (Black Sails’ VFX Supervisor). So when Black Sails came to its end and we were offered work on Terron’s next show, Lost in Space, we were thrilled. I mean, Toby Stephens (Captain Flint in Black Sails) in space! Space pirates!”

He continues, “After the 2017 VES Awards, we flew straight from L.A. to Vancouver to meet with Terron, VFX Supervisor Jabbar [Raisani] and the rest of the Lost in Space production team. Seeing what they were doing on set really blew us away. There was some proper production value going on! The show is very ambitious, and I think CG plays a big part in modern sci-fi dramas. Colonization stories such as this one offer fantastic opportunities to show off new worlds.”

ILP prefers challenging work. “We’re specialized in the tricky stuff. The heavy lifting, like CG-heavy work. We prefer getting a good sequence of like 50 shots of really complicated work rather than hundreds of shots of greenscreen work or things like that,” says Jacobson. To accomplish that heavy lifting, ILP relies on a wide range of software, including Maya, Nuke, Clarisse, Arnold, V-Ray and RenderMan. “We see what’s right for any given situation and any given job. It’s messier to maintain all these different tools and software, but it says something about the culture when you have an open mind [to options]. You don’t just learn one tool and stick with that. You have to be quite flexible to stay competitive.”

He continues, “Our company has always been about quality of work, and having the climate and atmosphere to be able to do that kind of work. We wanted to create a company where we ourselves wanted to work.” Björklund adds, “We have a passion for what we do and we have a great time doing it. That’s super important because that’s when some of the best work is created.”

Jacobson notes that there are advantages and disadvantages to working in Stockholm. “Salaries may not be as high as in the U.S., but there is free health care and free education. It’s a nice place, especially if you want to start a family. More security. You don’t have to worry about the day-to-day.” That, plus homesickness, has lured expatriate Swedes working in the U.S. or U.K. back to ILP. “There are lots of talented Swedes around the world working at big and famous places. They return sometimes to start a family or because they’re from here. We’re in a good place to have a platform for them to come home and have the same kind of work.”


TOP LEFT AND RIGHT: ILP showed us the mechanical inner workings of Westworld heroine Dolores. (Image © 2017 HBO) BOTTOM LEFT: ILP worked on the full-CG character Brainiac, a villain from the Krypton TV series. (Image © 2018 Syfy Channel and Warner Bros. Television) BOTTOM RIGHT: ILP provided impressive sharks for the 2016 survival film The Shallows, starring Blake Lively as a surfer in a battle of wits with a great white shark. (Image © 2016 Columbia Pictures)

SUMMER 2018 VFXVOICE.COM • 81





TOP: The Jupiter 2 crew tries to figure out how to retrieve their ship in Lost in Space, with VFX by ILP. (Image © 2018 Netflix and Legendary Television) BOTTOM: The Robinsons navigate formidable icy landscapes in Lost in Space, with VFX by ILP. (Image © 2018 Netflix and Legendary Television)

82 • VFXVOICE.COM SUMMER 2018

Star Wars: The Last Jedi and Jurassic World: Fallen Kingdom have been two other high-profile projects for ILP, and working on them has been deeply fulfilling for Jacobson. “We’re so proud to be able to collaborate with companies such as ILM, which has definitely been one of the biggest inspirations forever. And movies like Jurassic Park and Terminator 2 made me want to go into this industry. To work on those kinds of movies today – it’s an honor.”

“I have a hard time thinking of doing anything else,” concludes Björklund.


SUMMER 2018 VFXVOICE.COM • 83



VFX TRENDS

WHAT’S IN YOUR VFX KIT? TIPS FROM THE EXPERTS
By IAN FAILES

It’s a question that regularly gets asked among visual effects supervisors: what do you take with you on a job? And the answer, of course, often depends on the nature of the project – sometimes you need to be nimble and sometimes you need all the gear. VFX Voice asked four highly experienced VFX pros to weigh in on the VFX kits they bring along for shooting. Visual Effects Supervisor Stephan Fleet details all the gear he relies on when on-set supervising, while fellow VFX Supervisors Mark Kolpack and Dan Schrecker as well as Creative Producer Eric Alba pinpoint specific pieces of gear and how they use them to help craft great visual effects shots.

STEPHAN FLEET, Visual Effects Supervisor
Recent projects: Season 2 of NBC’s Timeless, Misfits and Reverie pilots, and Marvel’s Iron Fist

IPHONE X

In the past five years or so, this has become the most invaluable tool in my kit. I don’t often bring a tablet. I want something I can fit in my pocket. It has to be an Apple device because I use a custom FileMaker database that I wrote to take all of my on-set notes and data. It’s a pretty cool database that lets me quickly get data for multiple cameras, add reference images, jot down notes, link to my VFX breakdown, and my coolest new feature is that I can grab GPS coordinates so in post we can home in on the exact location where we were filming. I also use the stock camera app, the stock compass/level app, an HDR remote app called Simple HDR that links up to my Ricoh Theta S, an app called Measures for recording measurement plans, Dark Sky for weather checking, Dax, Pix, and an app called VUER to livestream footage over Wi-Fi with a Teradek.
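Fleet’s FileMaker database is his own tool, but the GPS idea is easy to picture. As a rough Python sketch (the shot-log entries here are hypothetical, not Fleet’s data), recorded coordinates let you match a plate back to the nearest logged setup in post:

    import math

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in meters between two GPS fixes."""
        r = 6371000.0  # mean Earth radius, meters
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlmb = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    # Hypothetical on-set log: setup name -> (lat, lon) grabbed on the day
    shot_log = {
        "ext_alley_setup_A": (34.0522, -118.2437),
        "ext_rooftop_setup_B": (34.0610, -118.2380),
    }

    def nearest_setup(lat, lon):
        """Find which logged setup a plate's coordinates are closest to."""
        return min(shot_log, key=lambda name: haversine_m(lat, lon, *shot_log[name]))

    print(nearest_setup(34.0525, -118.2441))  # -> ext_alley_setup_A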

DELL XPS 15 LAPTOP

I use it over a MacBook because it’s cheaper, has a 4K screen, and I could get a model with 32 gigabytes of RAM. I do a lot of previs myself, using Cinema 4D with Redshift, and this little guy can handle that, as well as Photoshop and Premiere!

PANASONIC GH5 (LARGE CAMERA)

I recently switched from being a Canon 5D user to mirrorless, and the GH5 is perfect for my needs. Here’s why:
• Silent shutter: This is a huge advantage on set. I can snap away while they are filming.
• Shoots amazing video: The fact that this camera can shoot 10-bit 4K 4:2:2 footage in V-Log is huge. I have actually shot B-unit VFX plates for production with this little guy. You can even pull a decent key off of a greenscreen.
• It has a cool mode called 6K photo capture that actually captures a 6K H.265 movie. I have a unique technique where I can pull stills from that video and use Agisoft PhotoScan to make a great photogrammetry 3D model. I can literally whip the camera around an actor’s head like a magic wand in 10 seconds to accomplish this. (A minimal frame-pull sketch follows this list.)
• I use a 12-60mm F2.8 Panasonic lens as my main lens. The only drawback for me is that the camera has a smaller chip size, so it struggles a little more in low light compared to, say, the Sony a7S II. Shows always shoot in super low light these days, wide open. To compensate for this, I keep the superb Leica 12mm F1.4 lens in my kit. It’s small and not too heavy. You can also up the ISO to around 3200 before it breaks.
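Fleet doesn’t detail the extraction step itself, so treat the following as a minimal sketch of the idea: pull every Nth still from a short orbit clip as photogrammetry input, here using OpenCV. The file name and frame step are placeholders.

    import os
    import cv2  # pip install opencv-python

    def extract_stills(video_path, out_dir, step=6):
        """Save every Nth frame of a clip as a still for photogrammetry."""
        os.makedirs(out_dir, exist_ok=True)
        cap = cv2.VideoCapture(video_path)
        index = saved = 0
        while True:
            ok, frame = cap.read()
            if not ok:
                break  # end of clip
            if index % step == 0:
                cv2.imwrite(os.path.join(out_dir, f"still_{saved:04d}.png"), frame)
                saved += 1
            index += 1
        cap.release()
        return saved

    # A 10-second orbit at 30 fps with step=6 yields ~50 overlapping stills,
    # which can then be fed to Agisoft PhotoScan (or any photogrammetry tool).
    print(extract_stills("head_orbit.mp4", "photogrammetry_input"))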

OPPOSITE TOP LEFT: Visual Effects Supervisor Stephan Fleet OPPOSITE TOP RIGHT: The Manfrotto MB MS-BP-IGR medium backpack is Fleet’s most recent carry bag for his VFX kit. OPPOSITE BOTTOM: Fleet’s VFX kit bag and its contents laid out. TOP LEFT: Visual Effects Supervisor Mark Kolpack. TOP RIGHT: Kolpack scans Agents of S.H.I.E.L.D. Visual Effects Editor Ryan Moos with the Structure Sensor. BOTTOM: Part of Kolpack’s on-set kit, including his iPad Pro.

SONY RX100 V (POCKET CAMERA)

This is a little point-and-shoot camera that stays in my kit at all times. In fact, I use it more than my GH5. It can fit in a pocket, it shoots great 4K and stills, and it has pretty decent low-light functionality. I highly recommend this camera as a backup, or even as a main for a light, non-plate-heavy show.

RICOH THETA S

This has been a revolution in HDRI acquisition for me. Paired with the Simple HDR app on my iPhone, I can now take HDRIs with something half the size of a phone, quickly and efficiently. I can put it in unique places, and even just put it out there while the crew is working, if I am in survival mode, to get ‘something’ (which is better than nothing, right?). Gone are the days in network TV where the crew will stop to let you get that perfect HDRI.
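Once the bracketed exposures are on disk, the merge itself is standard. Here is a minimal sketch using OpenCV’s Debevec calibration and merge; the file names and shutter times are placeholders, not Fleet’s actual workflow.

    import cv2
    import numpy as np

    # Bracketed exposures of the same view, darkest to brightest (placeholders)
    files = ["theta_ev_minus2.jpg", "theta_ev_0.jpg", "theta_ev_plus2.jpg"]
    times = np.array([1 / 250, 1 / 60, 1 / 15], dtype=np.float32)  # shutter seconds

    images = [cv2.imread(f) for f in files]

    # Recover the camera response curve, then merge into a radiance map
    response = cv2.createCalibrateDebevec().process(images, times)
    hdr = cv2.createMergeDebevec().process(images, times, response)

    cv2.imwrite("set_hdri.hdr", hdr)  # Radiance .hdr, usable for IBL lighting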

84 • VFXVOICE.COM SUMMER 2018


SUMMER 2018 VFXVOICE.COM • 85





GREEN AND BLUE GAFFERS TAPE

I usually only keep one or the other in my bag, depending on what the day has in store. But gaffers tape is always a must for trackers and other tape needs.

BOSCH GLM 50 LASER MEASURE METER

This is a cheap little distance meter. I used to have a nice Hilti that broke. Honestly, I’m not the biggest distance meter fan; they are somewhat unreliable and struggle in filming conditions.

PING PONG BALLS AND VELCRO

I wrap ping pong balls in green gaffers tape and affix velcro to them in a plus formation. This creates trackers that I can toss up high and that will stick on most green or bluescreens. You can also play fun bar games with the rigging grip crew.

LED TRACKERS

I rarely use these, but they come in handy in a pinch. Just type ‘LED White Party Lights’ into Amazon. You’ll find small LED lights with silver bases that cost around 10 bucks for a bag. They work great.

PATAGONIA MICRO PUFF JACKET WITH HOOD

This is the smallest, warmest jacket I could find. It stuffs itself into a pillow. It’s so important to bring layers and never find yourself freezing.

TOP: Visual Effects Supervisor Dan Schrecker BOTTOM: Schrecker’s typical VFX kit is designed to fit into a single backpack and tripod bag.

MANFROTTO 5001B 74-INCH NANO STAND

By far one of my favorite pieces of gear, and it’s only 50 bucks! This is technically a light stand. It’s super light, and it folds up to be under two feet long and weighs practically nothing. Here’s the crazy thing: most of the time I only bring this; I do not bring a tripod. Coupled with a small Giottos ballhead mount, this thing holds my GH5 if I need to do a lock-off shot. I mainly use it for my Ricoh Theta and HDRI captures. It’s better than a tripod because the legs are low profile. I also pair it with my refractive laser. Most recently, on Timeless, I taped a 13-foot pole to the stand and used it as a height guideline for adding a CG timeship. This stand also raises very high, higher than 6 feet, which is great for HDRIs.

ANKER POWERCORE FUSION 5000 2-IN-1 PORTABLE CHARGER AND AN IPHONE CABLE

My favorite little battery pack. It’s small, light, and actually has a power plug on it, so you can plug it in and use it like a regular charging plug for an iPhone or iPad, or you can get about three battery charges for one iPhone off of it.

LACIE RUGGED USB-C DRIVE, 2TB

For every season of a show I do, I get at least one Rugged drive to house all of my photos, video, etc. – anything that’s too big to toss on Box. Make sure to back stuff up!

TOP: Schrecker surveys an area for shooting the Darren Aronofsky film, Mother!

BAG TIPS

My bag is an ever-changing thing. I’m obsessed with bags and want the smallest, most comfortable and convenient bag there is. As such, I have used the Peak Every Day Messenger (it was OK), the Peak Every Day backpack (hated it), and, most recently, the Manfrotto MB MS-BP-IGR medium backpack for DSLR camera and personal gear. This is a small hybrid bag that lets me pack some camera gear, a laptop, a jacket, and house my Manfrotto stand on the side. It’s not perfect, but I like its small size and how comfortable it is. If I need to bring my GH5 I use the Incase DSLR Pro Pack, which is just OK. If I am traveling, I have a Think Tank Airport AirStream. It’s a roller backpack, nice quality, small, and lets me bring my gear as a carry-on. The perfect bag is still not out there for me!

GREEN LASER GRID PEN WITH REFRACTIVE GLASS

At Ghoststop.com you can get a laser pointer that shoots an array of green dots. Often, when we are doing interior greenscreen, instead of slathering the screen with trackers, I set this laser up, off to the side, on my Manfrotto stand, and point it at the screen. I can art direct where I want the trackers to be, mostly avoiding them straddling hair. It’s just faster. Bring batteries though, this thing eats ‘em up!
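The dots are only useful because they track well. As a loose illustration of what a matchmove tool does with them (not a production tracker), OpenCV’s pyramidal Lucas-Kanade flow can follow high-contrast dots from frame to frame; the plate path below is a placeholder.

    import cv2

    cap = cv2.VideoCapture("greenscreen_plate.mp4")  # placeholder plate
    ok, prev = cap.read()
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

    # Seed with strong corners; bright dots on a flat screen read as corners
    points = cv2.goodFeaturesToTrack(prev_gray, maxCorners=50,
                                     qualityLevel=0.1, minDistance=20)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Pyramidal Lucas-Kanade: where did each dot move this frame?
        nxt, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, gray, points, None)
        points = nxt[status.flatten() == 1].reshape(-1, 1, 2)  # drop lost dots
        prev_gray = gray

    print(f"{len(points)} markers tracked through the whole shot")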

X-RITE COLORCHECKER PASSPORT PHOTO

This is a small color chip chart. I don’t use it too much. Some companies like chip charts in their HDRIs, so I will toss this down near my Theta when I shoot HDRI images.
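One classic reason companies want a chart in the HDRI: a known gray patch makes a neutral balance trivial downstream. A toy sketch follows; the patch coordinates are placeholders for wherever the chart lands in frame.

    import cv2
    import numpy as np

    img = cv2.imread("hdri_preview.jpg").astype(np.float32)

    # Placeholder region covering the mid-gray chip in this particular frame
    y0, y1, x0, x1 = 400, 430, 620, 650
    patch = img[y0:y1, x0:x1].reshape(-1, 3).mean(axis=0)  # mean B, G, R

    # Scale each channel so the sampled chip reads neutral gray
    gains = patch.mean() / patch
    balanced = np.clip(img * gains, 0, 255).astype(np.uint8)
    cv2.imwrite("balanced_preview.jpg", balanced)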


86 • VFXVOICE.COM SUMMER 2018

KEEP THESE IN THE TRUNK OF YOUR CAR

In addition to what I carry on me, my nomadic lifestyle has taught me to keep the following in the trunk of my car at all times. First, a folding chair – get a nice one. Nothing makes life better on set than a good chair. I have a rocking one; it’s better than any director’s chair out there. Buy shell pants and a shell jacket for getting caught in the rain or for freezing-cold days on set. Bring strong weatherproof boots, which are also good for rough terrain and cold days. Smart wool socks: a must. Finally, have sunscreen, toiletries, water, and a big heavy jacket at the ready. As you can tell, I have shot in a lot of cold-weather places!

SUMMER 2018 VFXVOICE.COM • 87





TOP: Creative Producer Eric Alba BOTTOM: Alba’s self-made tracking markers that he makes available for anyone to download and print. He says he has seen them being used on sets in New York, Prague, New Zealand and Buenos Aires.

MARK KOLPACK, Visual Effects Supervisor
Recent projects: ABC television show Marvel’s Agents of S.H.I.E.L.D., Zoo and Shots Fired

Along with a nimble array of key camera, lens and tripod gear, Kolpack’s must-have on set is a shoulder bag containing his iPad Pro. “The iPad is for taking on-set notes in the FileMaker Go app with a camera note-taking layout,” he says. “The layout allows me to get the camera data off of three cameras as well as my HDRI lighting sphere data.”

“The iPad also allows me to keep both video clips and concept stills to show the actors what the environment or creature is going to look like,” adds Kolpack. “This goes a long way to helping them tie in their performances with what will be placed into the shot later.”

Having the iPad Pro also enables the visual effects supervisor to utilize a Structure Sensor. “This attaches to my iPad Pro and allows me to scan props, people, sets and costume pieces,” explains Kolpack. “While it is not at the level of photogrammetry that we do for our high-end scans of actors, it does allow me to capture items that give me a reference scan for interactive CG pieces on actors as well as set environments.”

DAN SCHRECKER, Visual Effects Supervisor
Recent projects: Darren Aronofsky’s Mother! and currently in post on the Jonathan Levine comedy Flarsky

“My on-set kit fluctuates depending on the job,” says Schrecker. “If I’m on a bigger VFX show where I have some support crew and I don’t need to bring gear with me to set every day, since I can leave it there, I’ll carry more items. But if I’m flying solo and the VFX are less complex, or I know exactly what’s in store, I’ll strip my kit down to essential items which can fit in a single backpack and tripod bag.

“Those stripped-down items include a Canon 6D DSLR, Canon 17-40mm lens, Canon 24-105mm lens, Rokinon 8mm lens, Ricoh Theta V, Manfrotto 501HDV tripod, Ricoh Theta V monopod, Leica Disto E7300, MacBook Pro with Adobe After Effects, gaffers tape, small LED lights for tracking markers, cables, chargers, extra batteries, laser pointer, lens cleaning kit, Mini Mag-light, Leatherman, tape measure, Sharpies, USB drive, extra memory cards for DSLR, Joe’s Sticky Stuff (clear butyl), and paper and a pen.”

Schrecker adds three extra pieces of kit in special circumstances. The first is a Canon 5D Mark II if hi-res HDRIs or hi-res texture reference are required. He’ll also bring along a Structure Sensor attached to an iPad Pro for quick 3D scans of rooms or actors. The third is a set of two GoPro HERO4 cameras. “I picked these up to shoot reflection plates for a window where the glass had to be removed,” says Schrecker. “Because they’re so small I was able to hide them pretty well in the frame and get a decent reflection image of each take. They can also be used as witness cameras in some cases, but their lenses are so wide they’re not ideal.”

BOTTOM: Black and white tracking markers being positioned on a greenscreen.

ERIC ALBA, Creative Producer
Recent projects: The immersive VR installation “Paraiso Secreto” for Corona, with production by The Mill

After countless hours spent on set, Alba decided to produce his own tracking markers and make them publicly available on his website [http://24liespersecond.com] for others to download and use. “Each sheet of tracking markers is 20 inches by 30 inches,” he says, “made up of 16 5-inch markers and 32 2.5-inch markers.” Alba suggests the printed markers can be mounted on 20-inch x 30-inch self-adhesive foam-core boards cut to size. To help with laying out the markers, he relies on a laser level that he originally purchased to hang framed photos at home. “Now I use it to lay out tracking markers evenly across floors and walls,” says Alba. “It can screw onto a tripod, it’s self-leveling and it has three modes: vertical line, horizontal line and crosshair. The only problem is that it’s good only for about 30 feet and not great in sunlight.” Alba also has a quick-fire tracking-marker solution for outdoor shoots on ground or grass when time is too short for a more elaborate setup: golf tees. “They are super neon bright, so I only need the tip to show. They are easy to key out and big enough for the tracker to track them.”
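Alba’s downloadable sheets are the real resource; purely to make the sizes concrete, here is a rough Pillow sketch that rasterizes a handful of quartered black-and-white markers onto a printable page. The layout is ours, not his.

    from PIL import Image, ImageDraw  # pip install pillow

    DPI = 300

    def quartered_marker(size_in):
        """One black-and-white quartered tracking marker, size_in inches square."""
        px = int(size_in * DPI)
        img = Image.new("RGB", (px, px), "white")
        draw = ImageDraw.Draw(img)
        h = px // 2
        draw.rectangle([0, 0, h, h], fill="black")            # top-left quadrant
        draw.rectangle([h, h, px - 1, px - 1], fill="black")  # bottom-right quadrant
        draw.rectangle([0, 0, px - 1, px - 1], outline="black", width=4)
        return img

    # A letter-size test page; Alba's real sheets are 20 x 30 inches
    sheet = Image.new("RGB", (int(8.5 * DPI), int(11 * DPI)), "white")
    marker = quartered_marker(2.5)
    for row in range(3):
        for col in range(2):
            x = int((1.0 + col * 4.0) * DPI)
            y = int((1.0 + row * 3.5) * DPI)
            sheet.paste(marker, (x, y))
    sheet.save("marker_sheet_letter.png", dpi=(DPI, DPI))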

88 • VFXVOICE.COM SUMMER 2018

SUMMER 2018 VFXVOICE.COM • 89




[ VFX CAREERS ]

Creative Recruitment in Today’s VFX Industry
By LEAH JAIKARAN

Welcome to the first installment of VFX Voice’s occasional column, VFX Careers. Each column will explore a different set of considerations for global VFX professionals. This issue’s column presents the employer/studio view from Leah Jaikaran, Head of Creative Resources at Spin VFX, Toronto, Canada.

TOP: Leah Jaikaran, Head of Creative Resources, Spin VFX OPPOSITE TOP: Visual effects artists at work at Spin VFX. (Photo: Sandy MacKay)

“Anyone can buy the best and fastest computers, but our not-so-secret sauce is in our team.” —Neishaw Ali, President, Spin VFX

90 • VFXVOICE.COM SUMMER 2018

ENGAGEMENT WITH THE ARTIST COMMUNITY

Our rules of engagement with the global VFX community are a fundamental part of our process, and they require a multi-pronged recruitment strategy that focuses on the artist experience. This process includes investment in recruitment technology and maintaining a digital presence, as well as hosting job fairs, studio tours and networking events, navigating the immigration landscape, and growing our Reel & Resume Workshop/SPINternship Program.

Spin VFX President Neishaw Ali often says that “anyone can buy the best and fastest computers, but our not-so-secret sauce is in our team. They are our most valuable asset and they are independently driven, so our unique challenge is to provide a positive, diverse cultural environment that is candid. This is why four years ago we implemented our open-book management style of leadership, wherein we share with our staff our financial successes, as well as our tough months. This way, they always know the team’s score.”

We believe that our artists need to reconcile their personal vision with their career goals and family lifestyles when deciding to join our studio. With this in mind, each candidate meets not only with our Recruitment Team, but also with their potential direct supervisor, teammates and, most often, Neishaw herself. We want all candidates to get to know Spin VFX personally and anticipate our company culture and code of ethics. Not everyone is a fit for our existing culture, nor are we a fit for everyone’s needs. We believe that our recruitment system allows all of the voices to be heard in this decision-making process.

In keeping with this, the artist is taken through a “day-in-the-life” of the role they are interviewing for. We share details about our workflow and the elements of our company that make us unique, such as the talented team, our year-end Spin Awards for Creative Excellence Gala (known as the “SAKE Awards”), and our various sports teams that compete with other VFX studios across our city. We get to know more about their unique skills, their passions and inspirations, and create a career plan as part of their journey with us.

COMMUNITY EVENTS

Over the years, our VFX team has attended numerous recruitment events such as the Toronto Animation Arts Festival International, Effects Montreal, and the Ottawa Animation Festival. We also take part in Computer Animation Studios of Ontario (CASO) events, SIGGRAPH, the ZBrush Summit and the National Association of Broadcasters annual conference, which allows us to gain insight into new trends while networking with other experts in our field. Additionally, part of our philosophy at Spin VFX allows for our employees to act as brand ambassadors and take an active part in recruitment. We often have a presence at events that are not necessarily recruitment-focused but are a great place to meet talent, such as Monsterpalooza in California and a variety of Comic-Cons.

RECRUITMENT PROCESS

Our primary and most effective recruitment investment has been our Recruiter Seat on LinkedIn. Not only do we post our active roles, but we engage with the community and have personal online conversations with both active and passive candidates. Artists around the world are active in several online communities, and LinkedIn has proven to be a great repository of talent for us to engage with, often in real time.

We have also been able to utilize social media as a tool to develop our brand by promoting the culture of the studio, showcasing engagement initiatives, spotlighting artists and interacting with our fans and followers. Another key component of our recruitment process is our annual Spin Roadshow, which takes us across North America. Job seekers who meet with us can expect to speak with senior members of Spin VFX. We want potential employees to understand who we are, and we invite those who are interested to come in for a studio tour as a follow-up. This method has proven effective not only for attracting highly talented VFX artists, but also for ensuring that they have the opportunity to familiarize themselves with our studio before choosing to join us.

INTEGRATION OF INTERNATIONAL ARTISTS

A key part of our international recruitment strategy is our welcome package, which helps international hires familiarize themselves with our company, our city and local attractions. The package includes suggested neighborhoods, transit information, schools/daycares and other necessities in the community. We work with a team of knowledgeable lawyers and immigration specialists to help simplify the process of relocating talent from across the globe to our Toronto studio. Whether working through LMIA, NAFTA or a variety of work permits, we recognize this undertaking as an investment in our studio.

REEL & RESUME WORKSHOP

In 2016, we launched our first Reel & Resume Workshop, which has evolved into an annual initiative with multiple workshops. Each year we invite various levels of talent into the studio to undergo best-practices training. We share industry tips and tricks for showcasing digital work, followed by a presentation on resume and interviewing techniques. Attendees are given the opportunity to meet one-on-one with industry experts, who review their work and provide real-time feedback on how best to promote the artist’s skill set. This has the added value of connecting our tenured artists with new and emerging talent so that they can build independent relationships. Participants then meet one-on-one with a member of our Recruitment Team to have their resume reviewed and to take part in mock interviews. We have hired several talented artists and forged relationships with future applicants as a result of our Workshops.

SPINTERNSHIP

While our Reel & Resume Workshop is open to the broad community, no matter where an artist desires to work, it has contributed much to the success of our formalized internship program, which we have fondly coined “SPINternship.” We developed this program to attract new graduates, build relationships with various post-secondary institutions (locally and internationally), and encourage our internal experts to mentor emerging talent. The program is a four-month paid internship that takes interns through department-specific trainings, lunch-and-learn sessions, master classes and team-building events, and prepares them for full-time positions. Over the last couple of years, 90% of our interns received and accepted full-time offers to join Spin VFX on a permanent basis upon completion of the SPINternship term.

SUMMER 2018

VFXVOICE.COM • 91


[ VFX CAREERS ]

Creative Recruitment in Today’s VFX Industry By LEAH JAIKARAN Welcome to the first installment of VFX Voice’s occasional column, VFX Careers. Each column will explore a different set of considerations for global VFX professionals. This issue’s column discusses the employer/studio view from Leah Jaikaran, Head of Creative Resources at Spin VFX , Toronto, Canada.

TOP: Leah Jaikaran, Head of Creative Resources, Spin VFX OPPOSITE TOP: Visual effects artists at work at Spin VFX. (Photo: Sandy MacKay)

“Anyone can buy the best and fastest computers, but our not-so-secret sauce is in our team.” —Neishaw Ali, President, Spin VFX

90 • VFXVOICE.COM SUMMER 2018

ENGAGEMENT WITH THE ARTIST COMMUNITY Our rules of engagement with the global VFX community are a fundamental part of our process, and require a multi-pronged recruitment strategy that focuses on the artist experience. This process includes investment in recruitment technology and maintaining a digital presence, as well as hosting job fairs, studio tours and networking events, navigating the immigration landscape, and growing our Reel & Resume Workshop/SPINternship Program. Spin VFX President Neishaw Ali often says that “anyone can buy the best and fastest computers, but our not-so-secret sauce is in our team. They are our most valuable asset and they are independently driven, so our unique challenge is to provide a positive, diverse cultural environment that is candid. This is why four years ago we implemented our open-book management style of leadership, wherein we share with our staff our financial successes, as well as our tough months. This way, they always know the team’s score.” We believe that our artists need to reconcile their personal vision as it relates to their career goals and family lifestyles when deciding to join our studio. With this in mind, each candidate not only meets with our Recruitment Team, but also their potential direct supervisor, teammates and most often Neishaw herself. We want all candidates to get to know Spin VFX personally and anticipate our company culture and code of ethics. Not everyone is a fit for our existing culture, nor are we a fit to everyone’s needs. We believe that our recruitment system allows for all of the voices to be heard in this decision-making process. In keeping with this, the artist will be taken through a “day-in-thelife” of the role they are interviewing for. We share details about our workflow and the elements of our company that make us unique, such as the talented team, our year-end Spin Awards for Creative Excellence Gala (known as the “SAKE Awards”), and our various sports teams that compete with other VFX studios across our city. We get to know more about their unique skills, their passions and inspirations, and create a career plan as part of their journey with us.

COMMUNITY EVENTS Over the years, our VFX team has attended numerous recruitment events such as the Toronto Animation Arts Festival International, Effects Montreal, and the Ottawa Animation Festival. We also take part in Computer Animation Studios of Ontario (CASO) events, SIGGRAPH, The ZBrush Summit and the National Association of Broadcasters annual conference, which allows us to gain insight into new trends while networking with other experts in our field. Additionally, part of our philosophy at Spin VFX allows for our employees to act as brand ambassadors and take an active part in recruitment. We often have a presence at events that are not necessarily recruitment-focused, but a great place to meet talent, such as Monsterpalooza in California, and a variety of Comic-Cons.

RECRUITMENT PROCESS Our primary and most effective recruitment investment has been our Recruiter Seat on LinkedIn. Not only do we post our active roles, but we engage with the community and have personal online conversations with both active and passive candidates. Artists around the world are active in several online communities, and LinkedIn has proven to be a great repository of talent for us to engage with, often in real-time.

INTEGRATION OF INTERNATIONAL ARTISTS A key part of our international recruitment strategy is our welcome package, which helps international hires familiarize themselves with our company, our city and local attractions. Our package includes suggested neighborhoods, transit information, schools/daycares and other necessities in the community. We work with a team of knowledgeable lawyers and immigration specialists to help simplify the process of relocating talent from

We have also been able to utilize social media as a tool to develop our brand by promoting the culture of the studio, showcasing engagement initiatives, spotlighting artists and interacting with our fans and followers. Another key component of our recruitment process is our annual Spin Roadshow, which takes us across North America. Job seekers who meet with us can expect to speak with senior members from Spin VFX. We want potential employees to understand who we are, and we invite those that are interested to come in for a studio tour as a follow up. This method has proven effective for not only attracting highly talented VFX artists, but also for ensuring that they have the opportunity to familiarize themselves with our studio before choosing to join us.

across the globe to our Toronto studio. Whether working through LMIA, NAFTA, or a variety of work permits, we recognize this undertaking as an investment in our studio. REEL & RESUME WORKSHOP In 2016, we launched our first Reel & Resume Workshop, which has evolved into an annual initiative with multiple workshops. Each year we invite various levels of talent into the studio to undergo best practices training. We share industry tips and tricks for showcasing your digital work, followed by a presentation on resume and interviewing techniques. Attendees are given the opportunity to meet one-on-one with industry experts who review attendees’ work and provide real-time feedback on improvements to best promote the artist’s skill set. This has the added value of connecting our tenured artists with new and emerging talent so that they can build independent relationships. Participants then meet one-on-one with a member of our Recruitment Team to have their resume reviewed and participate in mock interviews. We have hired several talented artists and forged relationships with future applicants as a result of our Workshops. SPINTERNSHIP While our Reel & Resume Workshop is open to the broad community, no matter where an artist desires to work, it has lent much success to our formalized internship program which we have fondly coined “SPINternship.” We developed and implemented this program to attract new graduates, build relationships with various post-secondary institutions (locally and internationally), and encourage our internal experts to mentor emerging talent. Our program is a four-month paid internship that takes interns through department-specific trainings, lunch-and-learn sessions, master classes, team building events, and prepares them for full-time positions. Over the last couple of years, 90% of our interns received and accepted full-time offers to join Spin VFX on a permanent basis upon completion of the SPINternship term.


[ VES SECTION SPOTLIGHT: MONTREAL ]

Dynamic Canadian Section Riding the VFX Wave
By NAOMI GOLDMAN

TOP: Montreal Section Board, from left to right: Jason Quintana, Chris Kazmier, Emma McGuinness, Peter Nofz, Chloe Grysole, David Bitton, Patrick Parenteau, Jean-Paul Rovela, Jerry Tung. Not pictured are Thai Son Doan and Thomas Tannenberger. BOTTOM LEFT: Montreal Section Chair Chloe Grysole and Section Co-chair David Bitton at the Fall Cocktail Party at effectsMTL.


Now two decades strong, the Visual Effects Society owes much of its growth and diversification to its global network of Sections, which galvanize their regional VFX communities while advancing the reach and reputation of the Society and the industry worldwide. Founded in 2012, the Montreal Section – one of three Canadian Sections, along with Toronto and Vancouver – is in an exciting period of expansion, mirroring the recent explosive growth of its local visual effects and feature animation industry.

“Five years ago, the Montreal visual effects community was a boutique environment,” says Chloe Grysole, VES Montreal Section Chair. “In that short time frame, the local industry has grown enormously to upwards of 3,000 people working in VFX and feature animation, and projections anticipate another 65% growth by 2020. The industry is booming and the buzz is palpable.”

Montreal houses local offices of major multinational visual effects facilities headquartered in the U.K., including Cinesite, Double Negative, Framestore and MPC, and also has a strong roster of local companies, including Folks VFX, Mels, Raynault VFX and Rodeo FX, in addition to American companies like Atomic Fiction and Reel FX.

“We have always fostered a strong tradition of local artistry, but because the industry wasn’t that large, many practitioners pursued their careers elsewhere in the world,” says Grysole. “With our massive growth spurt, it feels like the industry has finally tapped into the creative kernel that has long existed here. Given the caliber of local VFX talent and the added draw of tax incentives, Montreal is benefiting from Vancouver’s success, which has spread across the country.”

This bilingual French/English Section currently boasts approximately 110 members, most of them doing visual effects work in the film industry on VFX blockbusters or feature animation. An estimated 40% of local VFX practitioners have relocated to Montreal from France and elsewhere in Canada, as well as from all over Europe and the U.S., making for an eclectic and rich multicultural community.

The dynamic Section management team, co-led by Grysole and Section Co-chair David Bitton, is focused on membership growth and on building a calendar of educational events around the craft. It currently hosts popular monthly film screenings for its members, due in large part to its partnership with Technicolor. The Section also hosts two major annual events that serve as opportunities to recruit prospective members and raise the profile of the VES. The Fall Cocktail has become one of the Section’s signature events, where members, guests, artists and industry influencers convene and socialize. Last year, the Section teamed up with effectsMTL – the flourishing east coast VFX and feature animation conference – to tap into its diverse audience of attendees for the Fall Cocktail.

The Montreal holiday Mega Party is another highly anticipated annual event, convening VFX practitioners from across the region. Thanks to generous support from Rodeo FX, Double Negative and Cinesite, among others, last year’s party drew upwards of 600 people. “Because we have so many new people coming into the city, we have used our festive holiday gathering to build a sense of community. We are very proud to offer this meet-and-greet for the community at large and welcome anyone working in the industry to join us,” says Grysole.

The Section is also looking to replicate successful events put on by other Sections, such as the Regional VES Awards Celebration hosted by the New York Section for the past four years. The team is gleaning best practices from some of the more mature Sections, including its sister Canadian Sections, and is excitedly exploring opportunities to highlight Montreal’s artists and innovators.

Grysole comments on Montreal’s watershed moment: “Look at this year, with Framestore winning a BAFTA and an Academy Award for its visual effects work on Blade Runner 2049. All of the work was done in Framestore’s Montreal office, shining another light on the talent we have here. Production is booming and we have acclaimed directors from Quebec, including Denis Villeneuve and Jean-Marc Vallée, in hot demand. It’s an exciting time. There is a synergy and momentum going on, and we want to be able to reward and recognize our homegrown talent.”

The Montreal Section is proud to operate in the heart of the city. Beyond the film and gaming industries, Montreal hosts world-renowned music and cultural festivals, has the second-largest number of restaurants per capita after New York City, and is noted as one of the most affordable big cities in North America. Grysole adds, “Culturally speaking, we’re halfway between Europe and North America. The quality of life in our unique creative hub is high, making it a great place to work and play. And our Section has the dedication, the resources and the drive to build a vibrant visual effects community that embraces this amazing culture – and contributes greatly to the growing universe of filmed entertainment coming out of Montreal. The future is bright.”

TOP LEFT AND TOP RIGHT: Montreal Section members enjoy the Fall Cocktail Party at effectsMTL. MIDDLE AND BOTTOM: Montreal Section members and industry guests celebrating at the Annual Holiday Mega Party.


[ THE VES HANDBOOK ]

Acquisition/Shooting
By MARK H. WEINGARTNER

“The only hard and fast rule with regard to shooting elements is that they have to look right when they are incorporated into a shot. Since individual elements are shot one at a time, all sorts of tricks can be used when creating them. … Even as the bar is constantly being raised in creating realistic digital effects, numerous tricks of the trade of the old school creators of special photographic effects are still valid. To paraphrase Duke Ellington, ‘If it looks good, it is good.’” —Mark H. Weingartner

The VES Handbook of Visual Effects: Industry Standard VFX Practices and Procedures (Focal Press), edited by Susan Zwerman and Jeffrey A. Okun, VES, has become a go-to staple: a reference for professionals, a guide for students and, for lay people, a window into the magic of visual effects. Here is an excerpt from the second edition.

GENERIC VERSUS SHOT-SPECIFIC ELEMENTS
Resizing elements in the digital world is only a mouse-click away, but even with this ease of adjustment there are benefits to shooting elements in a shot-specific way. In shooting an element for a specific shot, one has the advantage of being able to choose the appropriate lens and distance to match the perspective of the shot, the appropriate framing to make the shot work, and the appropriate lighting so as to integrate the element into the scene. Any camera position data from the original shot can be used to aid in line-up, but with elements such as water, pyro, flames, or atmospherics that are likely to break the edges of frame, it is best to overshoot (shoot a slightly wider shot), which allows the compositor a bit of freedom in repositioning the element to best advantage without bits of the action disappearing off the edge of the frame. VFX Directors of Photography often use all of the area of full-frame 35mm or, when possible, VistaVision (43) in order to allow for the most repositioning without sacrificing resolution when resizing or repositioning the image in post.

When it comes to shooting actors in front of blue screen, green screen, or sky, once the correct distance and angle have been worked out to ensure proper perspective, one frequent practice is to select a longer focal length lens in order to magnify the subject. As long as the subject does not break the edge of frame, this magnification yields better edge detail, which allows for easier, cleaner matte extraction. Note that in this case care should be taken to keep the perspective difference between the foreground and background from becoming apparent; the intent is still to make the finished shot appear as if shot as one.
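
The geometry behind this practice is simple: in a pinhole model, perspective is fixed by camera position alone, while focal length only scales the image. The following minimal Python sketch (with hypothetical numbers, not from the Handbook) shows why a longer lens at the same distance buys cleaner matte edges without changing perspective:

    def image_height_mm(subject_height_m, distance_m, focal_mm):
        # Approximate image height on the sensor in a simple pinhole model,
        # valid when the subject distance is much larger than the focal length.
        return focal_mm * subject_height_m / distance_m

    # A 1.8 m actor framed from 6 m away (hypothetical setup):
    for focal in (40, 80):
        h = image_height_mm(1.8, 6.0, focal)
        print(f"{focal}mm lens: subject is about {h:.0f}mm tall on the sensor")

    # Doubling the focal length doubles the subject's size on the sensor,
    # putting more pixels across the matte edge, while the relative sizes
    # of near and far objects depend only on the 6 m camera position.

Doubling the focal length doubles image size but leaves perspective untouched, which is also why magnification cannot hide a foreground/background mismatch; only moving the camera changes perspective.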

Even though there are obvious advantages to shooting elements for specific shots, once the equipment and crew are assembled, one may realize a great return on investment by shooting a library of elements with variations in size, focal length, orientation, action, and lighting. These generic elements can be used as necessary in building many different shots, scheduled or added. When shooting either generic or shot-specific elements such as flame, dust hits, atmosphere, etc., it makes sense to shoot some lighting, focal length, and angle variations – sometimes the nominally correct setup ends up not looking nearly as good as some variation shot just in case.

DETERMINING ELEMENT NEEDS
During the bidding and pre-production of a project, the numbers and types of elements needed for various shots are usually broken down and categorized. (44) During production it is common to keep a running list of new elements that are required; additional elements may be added to the list after a review of the shots in the edited sequence or during the post-production work on those shots. The first step is to review the list of desired elements with an SFX Supervisor and a VFX DP if at all possible. Their expertise can make things much easier and provide insights into alternate techniques. A skilled special effects person can do wonders with devices they have or with devices they can construct rapidly. Certainly, elements that require anything potentially dangerous, such as flames or explosions, will require a special effects person with the appropriate license.

CHEATING
The only hard and fast rule with regard to shooting elements is that they have to look right when they are incorporated into a shot. Since individual elements are shot one at a time, all sorts of tricks can be used when creating them. One can composite together elements shot in different scales, at different frame rates, with different camera orientations, and even with different formats of cameras. An element can be flipped and flopped, played backward, recolored, pushed out of focus, made partially transparent – in short, the VFX Supervisor has an arsenal that includes all of the different types of shooting tricks plus all of the various compositing tricks for creating the desired look. Even as the bar is constantly being raised in creating realistic digital effects, numerous tricks of the trade of the old school creators of special photographic effects (45) are still valid. To paraphrase Duke Ellington, “If it looks good, it is good.”

Some types of atmospheric elements, such as smoke, steam, or fog, can be shot small and scaled up, often being shot overcranked in order to give the correct sense of scale. Water effects can be scaled up somewhat, though issues with surface tension limit the degree to which this works.
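
A common rule of thumb for overcranking scaled elements (a sketch of the reasoning, not a quote from the Handbook) follows from simple physics: gravity-driven motion covers a distance L in a time proportional to the square root of L, so a 1/n-scale element should be shot at roughly sqrt(n) times the playback frame rate to read as full size:

    import math

    def overcrank_fps(base_fps, scale_ratio):
        # scale_ratio is full-size dimension / miniature dimension,
        # e.g. 16.0 for a 1/16-scale model. Free-fall time scales with the
        # square root of length, so stretching time by sqrt(scale_ratio)
        # restores the apparent weight and mass of the action.
        return base_fps * math.sqrt(scale_ratio)

    print(overcrank_fps(24, 4))   # 1/4-scale element  -> 48 fps
    print(overcrank_fps(24, 16))  # 1/16-scale element -> 96 fps

In practice the multiplier is only a starting point; smoke and water rarely scale perfectly, which is one more reason to shoot variations and pick what looks right.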

Different materials can be used in clever ways. For Star Wars: The Phantom Menace (1999), large amounts of salt (46) were poured over a black background to create the illusion of waterfalls seen in a Naboo matte painting. For Independence Day (1996), a comprehensive library of steam, dust, glass, and debris elements was shot on a single day of thousand-frame-per-second photography; those elements were massaged in countless ways and incorporated into many shots of destruction. To create the effect of a man whose skin was on fire in a zero-gravity environment for Event Horizon (1997), hundreds of individual bits of paraffin-soaked paper were attached to a specially constructed puppeted mannequin and burned in small groups to create small patches of flame that were not drawn together by convection. For the free-fall sequence in Vanilla Sky (2001), Tom Cruise was suspended upside-down and wind-blown with E-fans while the camera, turned sideways or upside-down at various times, hovered around him on a crane or whizzed past him on a 90-foot-long motion control track. Miniature pyrotechnics that are supposed to be in space are often shot looking up (or at least upside-down) so that the arc-shaped paths of debris and convection-driven smoke propagation patterns don’t give away the presence of gravity.

It is easier to cheat at some games than others, of course. As humans, we are highly attuned to the way other humans look, and when human elements are shot from the wrong angle, the wrong height, or the wrong distance, even the most nontechnical audience member will often sense that something is wrong with a shot. The cues that trigger this response usually have to do with a perspective mismatch – and the most common error in shooting actors on green screen is to shoot from too short a distance with too wide a lens, resulting in unnatural relative sizes of different parts of the subject. The best protection against this danger is to line up the shot with the same lens, distance, and angle as was used in the plate shot, or as dictated by the action of the shot; when this is not possible, a decent rule of thumb regarding perspective on a human who is allegedly being seen from a great distance is to get the camera at least 25 or 30 feet away.

Footnotes:
(43) VistaVision: a special-format camera that runs 35mm motion picture film horizontally to provide an image equivalent in size to a 35mm still camera frame.
(44) Elements shot for stereoscopic 3D production must be carefully shot with stereo camera rigs, because elements shot “single eye” will often be very difficult to integrate into a 3D shot.
(45) Before the rise of digital compositing and CG effects, most visual effects were referred to as special photographic effects.
(46) Care should be taken when filming any material that may produce dust or small particles. Use eye protection and breathing masks.
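
A hypothetical pinhole-model check (ours, not the Handbook’s) puts that 25-to-30-foot rule of thumb in numbers: a feature that sits closer to the camera than the subject’s reference plane is magnified by distance / (distance - offset), and the exaggeration fades as the camera backs off.

    def near_far_size_ratio(distance_m, depth_offset_m):
        # Relative magnification of a feature depth_offset_m closer to the
        # camera than the subject's reference plane at distance_m.
        return distance_m / (distance_m - depth_offset_m)

    # A hand held 0.6 m toward the camera, at three camera distances:
    for d in (2.0, 4.5, 9.0):  # roughly 6.5 ft, 15 ft and 30 ft
        print(f"at {d} m the hand reads {near_far_size_ratio(d, 0.6):.2f}x oversized")

At 2 m the hand reads about 43% too large, the too-close, wide-lens look described above, while at 9 m (roughly 30 feet) the distortion drops to about 7%, close enough to pass for a view from a great distance.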


[ FINAL FRAME ]

Lost in Space — 50 Years Ago

The original Lost in Space TV series (1965 to 1968) was the brainchild of producer/director Irwin Allen, who helmed a number of sci-fi projects and disaster movies in the 1960s and ’70s, including The Towering Inferno, one of the top box-office films of 1974. This publicity photo shows cast members (left to right, back) Mark Goddard, June Lockhart, Guy Williams, Jonathan Harris and Marta Kristen, with Billy Mumy and Angela Cartwright in front. The Robot was played by Bob May and voiced by Dick Tufeld. It’s safe to say the 2018 reboot on Netflix is light-years removed on the VFX and SFX fronts.

According to IMDb, series visual effects credits went to L.B. Abbott, Howard Lydecker and Jack Polito. The prolific Abbott won two Special Achievement Awards from the Academy as well as two Oscars: one in 1971 for Tora! Tora! Tora! (shared with A.D. Flowers) and one in 1968 for Doctor Dolittle. Lydecker was nominated for two special effects Oscars, for 1942’s Flying Tigers and for 1940’s Women in War (shared with William Bradford, Bud Thackery and Herbert Norsch).

(Image copyright © Twentieth Century Fox Television)
