VFX Voice - Winter 2019 Issue




WINNER OF THE 2018 FOLIO: EDDIE AWARD Best Launch of a New Magazine (Association/Nonprofit)

WINNER OF THE 2018 FOLIO: OZZIE AWARD

Best Design of a New Magazine (Professional/Membership Organization)

The Folio: Awards are among the most prestigious national awards programs in the publishing industry.

Congratulations to the VFXV creative, editorial and publishing team! Thank you, Folio: judges, for making VFXV a multiple Folio: Award winner.

The Visual Effects Society


[ EXECUTIVE NOTE ]

Welcome to the Winter issue of VFX Voice! We’re thrilled to celebrate awards season, when outstanding visual effects artistry is in the spotlight. In this issue, we’ll preview the top contenders for the VES Awards, the Oscars and the BAFTAs, and take a deep dive into winter VFX. We go behind the scenes of Welcome to Marwen and the creation of the Spider-Verse. Technical and creative genius Phil Tippett, VES, takes center stage in this issue’s legends profile. Get an exclusive look at Netflix programming and its growing commitment to VFX, read the latest on script-to-previs and virtual production, get a special look at VFX in the U.K., and then get nostalgic with a look back at the original big-screen Superman. It’s all in here and more!

And in big news, VFX Voice won two major awards at the 2018 FOLIO: Eddie and Ozzie Awards, one of the most prestigious national awards programs in the publishing community, for Launch of a New Magazine (Association/Nonprofit) and Design of a New Magazine (Professional/Membership Organization).

We’re continuing to bring you exclusive stories between issues that are only available online at VFXVoice.com. You can also get VFX Voice and VES updates by following us on Twitter at @VFXSociety. Your enthusiasm and support have made VFX Voice a must-read publication around the globe, and we’re proud to be the definitive authority on all things VFX.

Mike Chambers, Chair, VES Board of Directors

Eric Roth, VES Executive Director




[ CONTENTS ]

FEATURES

8 FILM: THE VFX OSCAR – VFX Voice runs down the possible contenders from 2018.
18 FILM: HOT WINTER VFX – A breakdown of the top VFX-injected films in the first quarter.
22 FILM: VIRTUAL PRODUCTION – Specialists discuss this fast-growing area of filmmaking.
32 PROFILE: PHIL TIPPETT – A technical and creative genius who has impacted global culture.
38 FILM: FROM SCRIPT TO PREVIS – Fleshing out story ideas and key moments for three films.
48 FOCUS: VFX IN THE U.K. – Facilities and studios in London and across the U.K. are bustling.
60 FILM: WELCOME TO MARWEN – Atomic Fiction answers call to create challenging effects.
66 ANIMATION: SPIDER-MAN: INTO THE SPIDER-VERSE – Imageworks animators and artists push the creative envelope.
72 TV: ALTERED CARBON – Effects help conjure a darkly compelling sci-fi universe.
78 COVER: NETFLIX – VFX is fueling Netflix’s surge amid expanding TV opportunities.
86 VFX VAULT: SUPERMAN TAKES FLIGHT – Revisiting 1978’s Superman with Effects Director Colin Chilvers.

DEPARTMENTS

2 EXECUTIVE NOTE
91 VES SECTION SPOTLIGHT: LONDON
92 VES NEWS
94 VFX CAREERS
96 FINAL FRAME: MARY POPPINS

ON THE COVER: Lost in Space returns to Netflix later this year. (Image courtesy of Netflix)



WINTER 2019 • VOL. 3, NO. 1

MULTIPLE WINNER OF THE 2018 FOLIO: EDDIE & OZZIE AWARDS

VFXVOICE

Visit us online at vfxvoice.com

PUBLISHER
Jim McCullaugh
publisher@vfxvoice.com

VISUAL EFFECTS SOCIETY
Eric Roth, Executive Director

EDITOR
Ed Ochs
editor@vfxvoice.com

CREATIVE
Alpanian Design Group
alan@alpanian.com

ADVERTISING
advertising@vfxvoice.com
Maria Lopez
mmlopezmarketing@earthlink.net

MEDIA
media@vfxvoice.com

CIRCULATION
circulation@vfxvoice.com

VES BOARD OF DIRECTORS

OFFICERS
Mike Chambers, Chair
Jeffrey A. Okun, VES, 1st Vice Chair
Kim Lavery, VES, 2nd Vice Chair
Dan Schrecker, Treasurer
Rita Cahill, Secretary

DIRECTORS
Jeff Barnes, Brooke Breton, Kathryn Brillhart, Emma Clifton Perry, Bob Coleman, Lisa Cooke, Dayne Cowan, Kim Davidson, Richard Edlund, VES, Bryan Grill, Pam Hogarth, Joel Hynek, Jeff Kleiser, Suresh Kondareddy, Brooke Lyndon-Stanford, Tim McGovern, Kevin Rafferty, Scott Ross, Tim Sassoon, Lisa Sepp-Wilson, David Tanaka, Bill Taylor, VES, Richard Winn Taylor II, VES, Joe Weidenbach, Susan Zwerman

CONTRIBUTING WRITERS
Ian Failes, Naomi Goldman, Kevin H. Martin, Chris McGowan, Adrian Pennington, Barbara Robertson

ALTERNATES
Andrew Bly, Charlie Iturriaga, Christian Kubsch, Daniel Rosen, Katie Stetson, Bill Villarreal

ADVISORY COMMITTEE
Rob Bredow, Mike Chambers, Neil Corbould, Debbie Denise, Paul Franklin, David Johnson, Jim Morris, VES, Dennis Muren, ASC, VES, Sam Nicholson, ASC, Eric Roth

Visual Effects Society
5805 Sepulveda Blvd., Suite 620
Sherman Oaks, CA 91411
Phone: (818) 981-7861
visualeffectssociety.com

Tom Atkin, Founder
Allen Battino, VES Logo Design

VES STAFF
Nancy Ward, Program & Development Director
Chris McKittrick, Director of Operations
Ben Schneider, Director of Membership Services
Jeff Casper, Manager of Media & Graphics
Colleen Kelly, Office Manager
Callie C. Miller, Global Coordinator
Jennifer Cabrera, Administrative Assistant
P.J. Schumacher, Controller
Naomi Goldman, Public Relations

Follow us on social media.

VFX Voice is published quarterly by the Visual Effects Society. Subscriptions: U.S. $50; Canada/Mexico $60; all other foreign countries $70 a year. See vfxvoice.com. Advertising: Rate card upon request from publisher@vfxvoice.com or advertising@vfxvoice.com. Comments: Write us at comments@vfxvoice.com. Postmaster: Send address changes to VES, 5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411. Copyright © 2019 The Visual Effects Society. Printed in the U.S.A.




FILM

THE RACE TO THE 2019 VFX OSCAR

By IAN FAILES

Could it be the year that a comic book film returns to the center stage of visual effects? Will one of the many ‘creature features’ released in the past 12 months – including those with human-like performances – snag the Academy Award for Best Visual Effects? Perhaps effects that helped build the world of a film will take the honor. Or, as in some past years, an ‘outsider’ might be victorious. VFX Voice runs down the possible VFX Oscar contenders from 2018 – with input from many of the contributors behind the films, who will be hoping for gold on February 24.

THE YEAR OF THE SUPERHERO FILM?

TOP: Thanos from Avengers: Infinity War was made possible by incorporating machine learning into the character pipeline. (Image © 2018 Marvel Studios) BOTTOM: Black Panther’s visual effects filmmakers followed a ‘bible’ of Wakandan history in piecing together characters and action set pieces. (Image © 2018 Marvel Studios)

The biggest film of the year, Avengers: Infinity War, had its fair share of Marvel-esque fight sequences and numerous otherworldly locations, but it also features a central CG character in Thanos (Josh Brolin). “We knew Thanos would have to carry the film – after all, it was a film written to have the villain win,” says Visual Effects Supervisor Dan DeLeeuw. “We reached out to Digital Domain and Weta Digital to work in parallel developing Thanos. The final shots conveyed the subtleties of Thanos’ rage in addition to his sorrow and regret.”

Black Panther utilized visual effects to sell an entirely fictitious yet believable location in Wakanda, which, according to Visual Effects Supervisor Geoffrey Baumann, “required multiple iterations and a complex logistical foundation to allow the creative team to chase the best story possible with footage shot primarily in and around Atlanta. The added layer of rendering plausible, visually interesting vibranium sonic technology that fills nearly every frame of the film exponentially increased the difficulty of the undertaking.”

Ant-Man and the Wasp brought with it several challenges relating to CG characters, including new additions the Wasp and the ‘ghosting’ villain. Says Visual Effects Supervisor Stephane Ceretti, “From the photorealistic everyday life and comical use of digital doubles and augmented environments, to the recreation of the macroscopic Quantum Realm a la Fantastic Voyage, the broad spectrum of effects used in the film makes it an all-around exciting experience for the filmmakers to create and for audiences to watch.”

The volume of visual effects in these and other superhero films makes them strong VFX Oscar contenders, as do their plentiful memorable effects moments. Deadpool 2’s convoy chase, for example, layered extra VFX elements over real photography, plus a wave of simulated highway destruction. Aquaman similarly ramped up the action, with the added complexity of numerous under- and above-water simulations. Finally, Venom presents a horrifying array of human-to-creature transformation scenes unlike anything else in the superhero world.

CREATURE FEATURES

Of course, several of these comic book films could also be classified as ‘creature features,’ and there were many other movies released in 2018 that featured incredible classic creature work.

The Meg, for example, made use of not one, but two giant sharks. Visual Effects Supervisor Adrian De Wet says he had a number of favorite shots, including “the breach shot in the third act, where Jonas (Jason Statham) pushes the spike into the Meg’s eye while at the top of its arc, after it has jumped out of the ocean. My favorite could possibly be the whole boat-capsizing sequence in the second act. It’s a real pivotal moment in the film and hopefully it’s entirely unexpected.”

Rampage had three main digital characters and a whole lot of Chicago city destruction, as orchestrated by Weta Digital. “Rampage drew on our depth of experience making digital characters and also offered a juicy opportunity to destroy one of my favorite cities,” states Visual Effects Supervisor Erik Winquist. “The movie has a light-hearted tone and knows exactly what it is, but we took our role seriously in making sure that the heart and soul of the gorilla George came from actor Jason Liles.”

In Peter Rabbit, there’s far less destruction but still a significant number of meticulous CG performances from Animal Logic. “The biggest visual effects challenge on Peter Rabbit was the complex physical interactions between the actors and the CG characters, in particular the physical fight between Peter Rabbit and Mr. McGregor,” notes Visual Effects Supervisor Will Reichelt. “The success of these shots depended on a number of departments – from the stunt team choreographing the fight, to the SFX team providing specific physical interactions, to Domhnall Gleeson as Mr. McGregor himself.”

Christopher Robin is another film that matched CG characters to live-action actors, led by Framestore and Method Studios. “The tremendous challenge with Christopher Robin was to create characters capable of carrying the emotional heart of the movie,” says Visual Effects Supervisor Chris Lawrence. “They needed to be simple toys imbued with the magic of childhood imagination.”

TOP: In Ant-Man and the Wasp, not only were there advancements made in how CG characters were brought to the screen, there were also plenty of old-school effects gags to sell size differences. (Image © 2018 Marvel Studios) BOTTOM: A scene from Deadpool 2’s major convoy chase sequence, where the real world met major effects simulations. (Image © 2018 20th Century Fox)





“We knew Thanos would have to carry the film – after all, it was a film written to have the villain win. We reached out to Digital Domain and Weta Digital to work in parallel developing Thanos. The final shots conveyed the subtleties of Thanos’ rage in addition to his sorrow and regret.” —Dan DeLeeuw, Visual Effects Supervisor, Avengers: Infinity War

TOP: Jason Momoa plays the titular character in Aquaman, a film where decades of advancements in water simulation techniques were put to the test. (Image © 2018 Warner Bros. Pictures) BOTTOM: A convincing central character, played by Tom Hardy with major input from the VFX teams, is at the heart of Venom. (Image © 2018 Sony Pictures)

Lawrence adds, “We achieved this through craft as much as technology – nuanced animated performances, beautiful lighting and lensing, finished off with incredibly detailed simulations of fur and contact with plants, environments and, of course: ‘Hunny!’”

Audiences got to witness a different kind of magic in Fantastic Beasts: The Crimes of Grindelwald, which made use of many CG creatures. Visual Effects Supervisor Christian Manz, who oversaw the VFX with Tim Burke, identified human-to-human and human-to-beast transformations as a major challenge. “We wanted both to feel physically plausible as much as possible, and to be viewed in close-up with no clever camera work to hide the joins.”

Robots might not normally be described as ‘creatures,’ but VFX studios still have to approach them that way. On Pacific Rim Uprising, Visual Effects Supervisor Peter Chiang identifies his giant-robot challenge as “conveying scale and mass, while at the same time ensuring that the choreography of the battles remains exciting.” Meanwhile, on Bumblebee, Visual Effects Supervisor Jason Smith says that not only was believable CG character work important, it also had to involve an emotional connection between robot and girl. “Making the robot itself feel solidly rooted in the real world is a monumental task by itself. But the most important thing we tried to achieve in this film, by far our biggest hurdle, was making the audience really believe in that relationship so they could forget about VFX and just enjoy the story.”

With Jurassic World: Fallen Kingdom – a more traditional creature feature – Industrial Light & Magic enjoyed an unprecedented level of collaboration with Neal Scanlan’s creature FX team. “We wanted to engage the actors, DP, camera operators and director with the digital creatures in the film,” says Visual Effects Supervisor David Vickery, “and I really believe that the tangible energy and authenticity you can feel in the dinosaurs’ final on-screen performance would have suffered were it not for that collaboration.”

Fallen Kingdom also exemplifies the important role of previs teams in visual effects production. Proof, for example, worked on the previs for that film and several others this past year. “What’s interesting about where we now fit into the ecosystem is that we fit in so many different ways,” suggests Proof President Ron Frankel. “On Jurassic and A Wrinkle in Time – those are films we were involved in from the very early days, almost from design and story development all the way through to postvis. We were a design resource and a story resource for the directors. Ideas were experimented with – some made it into the movie and some were left on the editing room floor – which is kind of the point.”

WORLD BUILDING

Need a world for your film, with landscapes, buildings, vehicles and characters? Visual effects artists were behind many such worlds in 2018. For Solo: A Star Wars Story, Visual Effects Supervisor Rob Bredow built a world that had to exist as if it had been filmed in the years leading up to the original Star Wars. “We leveraged some of the oldest visual effects techniques, such as front- and rear-screen projection, and drove those immersive set-ups with the very latest technology,” says Bredow.

The latest technology was also on show in Ready Player One, where principal visual effects studios ILM and Digital Domain combined to reveal a world inside and outside the realms of virtual reality. “Realizing Steven Spielberg’s vision for Ready Player One was a massive creative and technical challenge – literally on a world-building scale – resulting in over 90 minutes of fully digital work being created for the film,” states Visual Effects Supervisor Roger Guyett. “Perhaps the most important challenge was faithfully maintaining each actor’s performance as they intercut between their live-action character and their digital avatar.”

Much of the world also didn’t exist in Mortal Engines, which required Weta Digital to build massive cities that could move – a major challenge of scale for Visual Effects Supervisor Ken McGaugh.

TOP: In The Meg, menacing giant sharks were realized by several VFX studios utilizing an array of flesh and water simulation tools. (Image © 2018 Warner Bros. Pictures) BOTTOM: Motion capture informed the main creature in Rampage, while other character work and building destruction made this film heavy on effects. (Image © 2018 Warner Bros. Pictures)





“The creative side to this challenge was finding a balance between believable scale and exciting action, which were often at odds with each other,” says McGaugh. “The technical challenge comes from the fact that we could not use our normal optimizations when rendering large-scale environments that move like a vehicle. We had to engineer a whole new hierarchical layout puppet system to handle animating the cities, and a dynamic level-of-detail generator to handle rendering them.”

A Wrinkle in Time consisted of a multitude of fantastical locations, with Visual Effects Supervisor Rich McBride working closely with director Ava DuVernay to help realize them. “Ava really wanted to know all the different steps involved,” McBride said at a VES screening of the film. “So it was just a matter of collecting all of the different pieces that we go through, and it helped a lot to have a previs team in-house that she could work with, so she could just sit down at the computer and look at cameras, and see the virtual way of working.”

It might not have been fantastical, but the 1960s world of First Man still required a combination of effects artisans to build it. This included on-set gags, miniature effects, digital effects and even restored archival footage projected on LED panels to recreate, with absolute plausibility, launches and space mission scenes. “With First Man being a period piece from the 1960s, anything that felt like heavy VFX would have completely taken you out of the story,” says Visual Effects Supervisor Paul Lambert. “The effects had to be subtle and shot in a particular way to make it feel like footage from the day. We used a diverse mixture of visual effects, special effects, archival footage and miniatures. We also shot full-scale models on 6-axis gimbals in front of a curved, 60-foot-diameter, 35-foot-tall LED screen. Using 90 minutes of content we created at DNEG, we were able to create a pseudo fully-three-dimensional world in-camera.”

DIGITAL HUMAN-LIKE PERFORMANCES



TOP: Pacific Rim Uprising’s animators followed a ‘heft and jank’ approach to the giant robots. (Image © 2018 Universal Pictures)

In past years, VFX Oscar-winning films such as The Curious Case of Benjamin Button, Avatar and Blade Runner 2049 have showcased the latest in CG human or human-like performances. Infinity War’s Thanos and the avatars of Ready Player One certainly fit into that category, as do a few other films from 2018.

In Alita: Battle Angel, Weta Digital made use of a motion-captured performance by Rosa Salazar to deliver its main character. “Through high-fidelity performance capture and multi-camera facial acquisition, we were able to gather and recreate the subtle mannerisms that are recognizably Rosa,” explains Visual Effects Supervisor Eric Saindon. “By using Weta’s muscle and skinning systems, we were able to bring Alita to life as a CG character. Adding in lots of interaction between the Battle Angel and live-action characters is what really sets this movie apart from others.”

It might not seem at first like a candidate for human-like performance, but Welcome to Marwen definitely fits the bill. Here, Atomic Fiction incorporated Steve Carell’s live-action likeness into his digital doll representation. “Robert Zemeckis insisted that the dolls be completely relatable as living, breathing characters,” says Visual Effects Supervisor Kevin Baillie. “He was adamant that their facial performances had to be flawlessly faithful to the actors’ since, in his words, ‘the face is the actor’s instrument, just like a violin is the musician’s.’”


THE OUTSIDERS

Every year a film that might not normally be considered a major VFX contender pushes through. In 2015, one of those, Ex Machina, went on to win the VFX Oscar. There are several possible contenders from this past year.

With 2,000 VFX shots, Mission: Impossible – Fallout is light years from a small film maximizing VFX like Ex Machina.





The VFX work is highly story-driven and integrated with significant practical and stunt effects. “On many films, the tendency is to shoot on a bluescreen and then have VFX do all the rest of the work,” outlines Visual Effects Supervisor Jody Johnson. “But on a Mission film, it’s more traditional filmmaking. The best bit is getting those puzzles and working them out. How are we going to shoot up the pieces and put them all together?”

TOP LEFT: The Millennium Falcon travels through hyperspace in Solo: A Star Wars Story. VFX recreated well-known Star Wars vehicles and characters, sometimes using old-school techniques as well as highly advanced means. (Image © 2018 Walt Disney Pictures) TOP RIGHT: For a moving city in Mortal Engines, the VFX teams had to rethink how to create such vast assets for the film. (Image © 2018 Universal Pictures) BOTTOM: Miniatures, on-set practical effects, digital visual effects and archival restoration made up the VFX for First Man. (Image © 2018 Universal Pictures)


That sentiment is backed up by Special Effects Supervisor Neil Corbould. “It was great working with Tom Cruise on a Mission: Impossible movie as he has a great appreciation for practical special effects and doing things for real. We really pushed the boundaries on Fallout, from a rotating sinking prison van rig, to a helicopter crash impact effect that Tom wanted to be very much involved in. My role as a special effects supervisor was to produce amazing special effects for Tom to be in and to keep him safe at all times.”

A Quiet Place saw Industrial Light & Magic incorporate a CG character sparingly, but effectively, into the horror film. “Director John Krasinski referred to Alien and Jaws as movie styles that he wanted to emulate,” says Visual Effects Supervisor Scott Farrar. “I totally agreed. It’s far more effective to show as little as possible for most of the movie, increasing the tension by slowly revealing the look of the creature near the end.”

The effects are equally eerie in Annihilation, and they also had to evolve over the production, as Visual Effects Supervisor Andrew Whitehurst explains. “Sometimes we threw away months of work when we realized that, as the film had evolved, our original ideas were no longer right for the film. Every frame of every shot had to be considered in this light for the film to weave its spell on the audience.”






“With First Man being a period piece from the 1960s, anything that felt like heavy VFX would have completely taken you out of the story. The effects had to be subtle and shot in a particular way to make it feel like footage from the day. We used a diverse mixture of visual effects, special effects, archival footage and miniatures. We also shot full-scale models on 6-axis gimbals in front of a curved, 60-foot-diameter, 35-foot-tall LED screen. Using 90 minutes of content we created at DNEG, we were able to create a pseudo fully-three-dimensional world in-camera.” —Paul Lambert, Visual Effects Supervisor, First Man


Whitehurst adds, “The Annihilation VFX team understood this and maintained their enthusiasm, dedication and creativity right until the very end of the show, and it’s their hard work that we see on screen that gives me such pleasure to look at.”

Other films might also be considered outsiders: The Nutcracker and the Four Realms, for another round of substantial world building; Skyscraper, a film that not only makes a building its central character but also has a wealth of stunts and fiery scenes; and Mary Poppins Returns, which has the unique appeal of combining 2D animation into its final shots.


TOP: In Mission: Impossible – Fallout, lead actor Tom Cruise performed as many stunts as possible, with VFX assisting to add in necessary elements. (Image © 2018 Paramount Pictures) BOTTOM: Annihilation’s effects ranged from the petrochemical-looking Shimmer to CG creatures and other strange occurrences. (Image © 2018 Paramount Pictures)




FILM

2019 ICEBREAKERS: HOT WINTER VFX

By CHRIS McGOWAN

The first films of 2019 are rich in both subtle and spectacular VFX that continue to push the envelope of human-like CG characters, awesome animals, monsters and creatures, fantastic world building, and surreal environments – above water and below, as well as in deep space. The winter wave also includes classics revived with the latest life-giving creative and technical VFX magic.

Audiences have crossed from 2018 into 2019 watching Mary Poppins Returns, Aquaman, Bumblebee and Welcome to Marwen. In the next few months they will welcome Ad Astra, with Brad Pitt crossing the solar system in search of his long-lost father (played by Tommy Lee Jones); the James Cameron co-scripted manga adaptation Alita: Battle Angel; M. Night Shyamalan’s terrifying trilogy finale Glass; and the much-touted Captain Marvel with Brie Larson. Toward spring, Chicago struggles under an alien yoke in Captive State, and Tim Burton brings his vivid imagination to a live-action Dumbo.

VFX Voice has compiled some of the top VFX-injected films on tap this winter, along with the effects supervisors and VFX houses involved. This is not intended to be a complete listing. Release dates are subject to change.

Eli (Paramount Pictures)
Release date: U.K., January 1; U.S., January 4
In this horror film from director Ciarán Foy, shot in Sofia, Bulgaria, a boy is treated for a rare disease in a secluded clinic that turns into a haunted prison. Neishaw Ali is Visual Effects Executive Producer and Wesley Sewell is Visual Effects Supervisor, with the Spin VFX crew handling visual effects. Chris Bailey is Special Effects Coordinator. Fractured FX is also on board.

TOP: Mary Poppins Returns (Image copyright © 2018 Walt Disney Pictures) BOTTOM: Aquaman (Image © 2018 Warner Bros. Pictures)

Ad Astra (20th Century Fox)
Release date: U.K., January 11; U.S., January 11
Twenty years after Clifford McBride (Tommy Lee Jones) left for Neptune to find signs of extraterrestrial life, his son Roy (Brad Pitt) goes in search of him in this James Gray film. Bradley Parker is Production VFX Supervisor and Frank Iudica is Special Effects Coordinator. Atomic Fiction, Weta Digital, Vitality VFX, Mr. X, Method Studios, MPC, Capital T, Shade VFX and Brainstorm VFX are handling the VFX load.

Glass (Universal Pictures)
Release date: U.K., January 18; U.S., January 18
This M. Night Shyamalan horror-thriller completes the trilogy initiated with Unbreakable (2000) and continued with Split (2016). James McAvoy, Bruce Willis and Samuel L. Jackson lead the cast. Patrick Edward White is Special Effects Coordinator. Contributing effects were provided by Powerhouse VFX.

The Maze (Sony)
Release date: U.K., February 1; U.S., February 1
Six young strangers must use their wits to survive circumstances beyond their control in this action thriller directed by Adam Robitel, who is also in the cast. Max Poolman is Special Effects Supervisor. VFX Files has the effects role.


Jacob’s Ladder (LD Entertainment)
Release date: U.K., TBD; U.S., February 1
A remake of Adrian Lyne’s 1990 psychological horror film about a Vietnam veteran struggling to maintain his sanity. David Fletcher is Special Effects Supervisor. Chris LeDoux and Robin Scott Graham are Visual Effects Supervisors for Crafty Apes.

The Lego Movie 2: The Second Part (Warner Bros. Pictures)
Release date: U.K., February 8; U.S., February 8
The Second Part reunites the heroes of Bricksburg to save their city as citizens face a new threat – invaders from outer space, wrecking everything faster than they can rebuild. This computer-animated sequel will be released simultaneously in 2D, 3D and IMAX 3D. Animal Logic is the main VFX house.

Alita: Battle Angel (20th Century Fox)
Release date: U.K., February 6; U.S., February 19
Robert Rodriguez helms this adaptation of Yukito Kishiro’s cyberpunk manga Gunnm, co-scripted by James Cameron, about a damaged and abandoned young cyborg girl (Rosa Salazar) who wakes up in a future she doesn’t recognize. A motion-captured performance by Salazar, realized by Weta Digital, brings Alita to life as a CG character. Christoph Waltz, Jennifer Connelly and Mahershala Ali are also cast. Weta Digital’s Senior VFX Supervisor Joe Letteri, VES, and VFX Supervisors Eric Saindon and Nick Epstein are tasked with creating the digital Alita. DNEG and Framestore are also on board.

The Turning (Universal Pictures)
Release date: U.K., TBD; U.S., February 22
This haunted-house film is a modern-day adaptation of Henry James’s classic The Turn of the Screw. Kevin Byrne is Special Effects Supervisor. Matt Philip Whelan is Visual Effects Supervisor, and Brendan Taylor is Visual Effects Supervisor for main vendor Mavericks VFX.

The Rhythm Section (Paramount Pictures)
Release date: U.K., February 22; U.S., February 22
A woman takes revenge against the planners of a plane crash that killed her family. Starring Blake Lively, Sterling K. Brown and Jude Law. Chris Corbould is Special Effects Supervisor, with BlueBolt on effects, led by Visual Effects Supervisor Sandro Henriques.

TOP: Aquaman (Image © 2018 Warner Bros. Pictures) BOTTOM: Hailee Steinfeld and Bumblebee in Bumblebee. (Image copyright © 2018 Paramount Pictures)

How to Train Your Dragon: The Hidden World (Universal Pictures)
Release date: U.K., February 1; U.S., February 22
In this third installment of the How to Train Your Dragon trilogy, a 3D computer-animated fantasy directed by Dean DeBlois, Hiccup and Toothless face a huge new threat to their village. Dave Walvoord is Visual Effects Supervisor. DreamWorks Animation is the producer.




The Kid Who Would Be King (20th Century Fox)
Release date: U.K., February 15; U.S., March 1
In this British fantasy adventure directed by Joe Cornish, young Alex discovers the mythical sword Excalibur and fights to thwart a medieval menace. Rebecca Ferguson, Tom Taylor and Patrick Stewart are top billed. Steven Warner is Special Effects Supervisor, Stephen Hutchinson is Special Effects Co-Supervisor and Frazer Churchill is Visual Effects Supervisor. DNEG, TPO VFX and Rodeo FX are contributing VFX houses.

TOP: Alita: Battle Angel (Image copyright © 2018 20th Century Fox) MIDDLE: Alita: Battle Angel (Image copyright © 2018 20th Century Fox) BOTTOM: Dumbo (Image copyright © 2018 Disney Enterprises Inc.)

Chaos Walking (Lionsgate)
Release date: U.K., March 1; U.S., March 1
Doug Liman helms this sci-fi adventure set on the planet New World, where people can hear one another’s thoughts, as two companions try to evade detection and uncover the truth. Headliners are Tom Holland, Daisy Ridley, Demián Bichir and Mads Mikkelsen. Neishaw Ali is Visual Effects Executive Producer. Louis Craig is Special Effects Supervisor, with Nicholas Brooks as Visual Effects Supervisor. Effects provided by Method Studios, ILM, Spin VFX and Iloura.

Captain Marvel (Walt Disney Studios)
Release date: U.K., March 8; U.S., March 8
Brie Larson is the Marvel Comics character Carol Danvers (Captain Marvel), joined in the cast by Samuel L. Jackson, Djimon Hounsou and Jude Law. Christopher Townsend is Production VFX Supervisor and Daniel Sudick is Special Effects Supervisor. Additional VFX by ILM, Trixter, Framestore, Legacy Effects, Animal Logic, Lola Visual Effects, Scanline VFX, Rising Sun Pictures, Luma Pictures, Digital Domain, Rise VFX and Cantina Creative.

Captive State (Focus Features)
Release date: U.K., April 12; U.S., March 29
This Rupert Wyatt-directed science fiction thriller is set in a Chicago neighborhood, 10 years into an extraterrestrial occupation, as tensions rise on both sides of the conflict between collaborators and dissidents. Eric Pascarelli is Visual Effects Supervisor. KNB EFX Group, Jellyfish Pictures, Atomic Arts and FuseFX are the VFX houses.

Dumbo (Walt Disney Studios)
Release date: U.K., March 29; U.S., March 29
Tim Burton and a flying elephant with enormous ears lead this live-action remake of the 1941 animated classic. Colin Farrell, Michael Keaton, Danny DeVito, Eva Green and Alan Arkin lead the cast. Richard Stammers is Visual Effects Supervisor and Hayley J. Williams is Special Effects Supervisor. MPC, Framestore, Cheap Shot, Rodeo FX and Rise cover the effects.

TOP: Dumbo entering Dreamland in Dumbo. (Image copyright © 2018 Disney Enterprises Inc.) MIDDLE LEFT: Brie Larson in Captain Marvel (Image copyright © 2019 Marvel Studios) BOTTOM LEFT: How To Train Your Dragon: The Hidden World (Image copyright © 2019 Universal Pictures)

MIDDLE RIGHT: How To Train Your Dragon: The Hidden World (Image copyright © 2019 Universal Pictures) BOTTOM RIGHT: Wonder Park (Image copyright © 2018 Paramount Pictures)

Us (Universal Pictures)
Release date: U.K., March 15; U.S., March 15
Jordan Peele follows his Oscar-winning Get Out with this new horror thriller. Elia P. Popov is Special Effects Coordinator. Grady Cofer is Visual Effects Supervisor for ILM.

Greyhound (Sony Pictures)
Release date: U.K., March 22; U.S., March 22
Nazi U-boats chase a U.S. destroyer during WWII. Tom Hanks writes and stars with Elisabeth Shue. Nathan McGuinness is Visual Effects Supervisor and Marc Banich is Special Effects Supervisor. Hydraulx is the VFX house, supervised by Joel Sevilla.

Wonder Park (Paramount Pictures)
Release date: U.K., April 8; U.S., March 15
A wonder-filled amusement park stirs the imagination of a wildly creative young girl. Javier Romero Rodriguez is Visual Effects Supervisor. Voices by Brianna Denski, Matthew Broderick, Jennifer Garner, Mila Kunis and John Oliver. Digital animation by Ilion Animation Studios.



FILM

THE REALITY OF VIRTUAL FILMMAKING

By IAN FAILES

TOP: Habib Zargarpour works with his virtual production tools to plan out a scene. (Image courtesy of Digital Monarch Media) BOTTOM: A still from Digital Monarch Media’s virtual production for the ‘Trash Mesa’ scene in Blade Runner 2049. (Image copyright © 2017 Alcon Entertainment, LLC)


It’s easy to think of a film being made by a crew with actors, a set, some lights and a camera. Indeed, many films are still made this way. But as films continue to be imbued with more complex action, and invariably more complex visual effects, filmmakers are turning to new production techniques to imagine these scenes virtually before they’ve been shot, and then to scout sets, interact live with CG assets and characters, and shoot, revise and iterate virtual scenes on the fly.

‘Virtual production,’ as it has become known, can mean many things. On Ready Player One, for example, virtual production techniques were used by director Steven Spielberg to combine motion-captured characters, virtual cameras, ‘simul-cams,’ real-time rendering and virtual reality to help imagine the entirely synthetic world of the OASIS. The upcoming Lion King from Jon Favreau and James Cameron’s Avatar sequels are further examples where virtual production is central to the filmmaking.

There are also many different types of virtual production, whether in the motion-capture equipment used, the style of virtual cinematography, the real-time rendering engine of choice, or simply how the different pieces of a system are ‘bolted’ together. VFX Voice asked several players in the virtual production space about their approaches to this growing area of filmmaking.

DIGITAL STORYTELLERS

Digital Monarch Media (DMM), run by Wes Potter and Habib Zargarpour, is one of a growing number of specialist virtual production outfits, with contributions made to films including Ready Player One, Blade Runner 2049 and The Jungle Book. The hardware and software tools DMM provides enable directors to operate a virtual camera and make lighting and CG set changes. The virtual camera, or V-cam, is a customized tablet mapped to real camera lenses, with game controllers for operation, running through the Unity game engine.

Giving filmmakers a collection of tangible tools that they can interact with is a big part of DMM’s tool set. “There are many things you have to cater for,” notes Zargarpour, who previously worked at Industrial Light & Magic, EA and Microsoft. “There are different camera rigs simulating dollies and jibs and those kinds of things. And then you have people who like keyframing. And then you have all the things relating to recording takes and being able to view and note those, plus a user interface for accessing the V-cam, lighting and moving set pieces around. There’s a lot to think about.”

For Ready Player One, DMM worked with Digital Domain to integrate real-time engines into the virtual production pipeline. Scenes with motion-captured actors were scouted and filmed virtually, and then could be re-worked if necessary with the virtual tools. Spielberg also utilized a range of VR headsets for motion capture, scouting the sets and shooting practical sets.

On Blade Runner 2049, the arrival of a flying Spinner vehicle was proving difficult for director Denis Villeneuve to imagine. To help find the shot, the director used DMM’s Expozure V-cam tool to visualize the scene from the interior of the cockpit. “We attached the Spinner to the virtual camera and gave it to Denis,” explains Zargarpour. “He performed the Spinner, and then the camera inside the Spinner. Usually this would take a long time to iterate on.”

DMM’s tools have been taken to a new level for an upcoming film, where they were used at the previs stage to generate master scenes that could then be ‘filmed’ quickly with the virtual camera. Zargarpour says that two operators used Expozure to generate 300 previs shots for the film, including shots with realistic real-time water simulations, while over the same period six previs artists generated only 12 shots using more traditional means.
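To make the lens-mapping idea concrete, here is a minimal illustrative sketch in Python – not DMM’s actual code; the sensor width and lens kit are assumptions – of how a V-cam might convert a real prime lens into the field of view a game-engine camera expects, using the standard pinhole relation.

# Minimal sketch (not DMM's code) of mapping real camera lenses onto a
# game-engine virtual camera. The sensor width and lens kit below are
# assumptions; the focal-length-to-FOV formula is the standard pinhole relation.
import math

SENSOR_WIDTH_MM = 24.89  # e.g., a Super 35-style sensor width (assumption)

def horizontal_fov_degrees(focal_length_mm, sensor_width_mm=SENSOR_WIDTH_MM):
    """FOV = 2 * atan(sensor_width / (2 * focal_length))."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

# A 'lens kit' the V-cam operator can switch between, mimicking real primes.
LENS_KIT_MM = [21, 27, 35, 50, 75, 100]

for f in LENS_KIT_MM:
    print(f"{f}mm prime -> {horizontal_fov_degrees(f):.1f} deg horizontal FOV")

Switching ‘lenses’ on the tablet then amounts to swapping which focal length feeds this conversion, so framing decisions carry over to the real camera package later.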

A DEDICATED PLATFORM

Having worked on a number of films where virtual production was a key part of the filmmaking process, visual effects studio MPC decided to build its own virtual production platform, called Genesis. “The turning point for MPC was The Jungle Book,” notes MPC Realtime Software Architect Francesco Giordana. “That’s when we realized how important virtual production is and we embarked on this new adventure. Tracking all the assets, the scenes, the cameras, the animations and all of the modifications is key. We couldn’t find any available third-party solution that could give us that out of the box.

“It also became apparent that standard VFX tools were not the way to go. They didn’t offer the speed or the flexibility that on-set workflows require, so we turned to game engines, not as the definitive solution, but rather as a piece of a much more elaborate framework.”

Genesis works with Unity, but MPC says it relies on the game engine mainly for graphics and user interactions. The studio has built a separate layer for handling network configurations instead of relying on typical multiplayer game patterns. A key aspect of Genesis has been to incorporate elements of typical live-action filmmaking into the virtual tools. “The typical way of maintaining the look and feel of a physical camera in virtual production is to motion capture a Steadicam and have a cameraman operate it, but we went a lot further,” says Giordana. “We encoded a variety of traditional camera equipment like cranes, fluid heads, dolly tracks and more, and made sure they would feel to the camera crew exactly the same as the real thing.”

So far, MPC has employed Genesis on a couple of projects and used the full range of VR and more traditional camera devices. “We feel like the tool kit has now been properly battle-tested, and on one of these projects we shot over 2,500 takes in a little over five weeks,” states Giordana. “It’s amazing seeing how an entire crew can gradually familiarize with these new workflows, pick up speed and suddenly start producing over 100 takes a day!”
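As an illustration of what ‘encoding’ a piece of traditional camera hardware might involve – a hypothetical sketch, not MPC’s Genesis code; the encoder resolution and gearing are assumptions – pan and tilt wheel ticks can be converted into camera rotation using the same gearing a real fluid head would have, so the move feels familiar to the operator:

# Hypothetical sketch (not MPC's Genesis code) of an 'encoded' fluid head:
# pan/tilt wheel encoders are sampled each tick and converted to camera
# rotation with gearing chosen to match the feel of the real hardware.
class EncodedFluidHead:
    TICKS_PER_REV = 4096       # encoder resolution (assumption)
    DEGREES_PER_REV = 180.0    # wheel gearing matched to the real head (assumption)

    def __init__(self):
        self.pan = 0.0
        self.tilt = 0.0

    def update(self, pan_ticks, tilt_ticks):
        deg_per_tick = self.DEGREES_PER_REV / self.TICKS_PER_REV
        self.pan += pan_ticks * deg_per_tick
        self.tilt += tilt_ticks * deg_per_tick
        return self.pan, self.tilt  # fed to the virtual camera every frame

head = EncodedFluidHead()
head.update(pan_ticks=512, tilt_ticks=-128)  # -> (22.5, -5.625) degrees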

TOP: A Digital Monarch Media demo that incorporates real-time rendered ocean waves via an NVIDIA ocean plug-in. (Image courtesy of Digital Monarch Media) BOTTOM: A scene is set up for filming with a motion-captured actor and the virtual camera as part of MPC’s Genesis platform. (Image courtesy of MPC)

“We attached the Spinner [vehicle in Blade Runner 2049] to the virtual camera and gave it to [director] Denis [Villeneuve]. He performed the Spinner, and then the camera inside the Spinner. Usually this would take a long time to iterate on.” —Habib Zargarpour, CCO/Co-founder, Digital Monarch Media




VIRTUAL TOOLS AT YOUR DISPOSAL

TOP: MPC’s Genesis system is intended to be a multi-user platform, as demonstrated in this Unity Book of the Dead frame. (Image courtesy of MPC) BOTTOM: One of the ideas behind Genesis is to incorporate existing filmmaker tools such as crane arms and dolly wheels into the virtual production process. (Image courtesy of MPC)

“We wanted to extend the [Unreal Engine] editor so that you can have multiple editor sessions connected together from different computers and from different users. Then we also wanted it to run as a VR application, where they might not even be in the same location.” —Kim Libreri, CTO, Epic Games

Just as some outfits have incorporated the game engine Unity into their virtual production workflows, several others have used Epic Games’ Unreal Engine for real-time rendering aspects. That engine has several components that fit into a virtual production platform. For example, Unreal’s virtual camera plug-in relies on Apple’s ARKit on an iPad or iPhone to feed camera positions to the engine, thus enabling a user to use that device as a camera portal into a virtual world. “Straightaway, you can take some CG content you’ve made in the engine and film it as if you’re a professional virtual cinematographer,” says Epic Games CTO Kim Libreri.

ILMxLAB recently utilized an Unreal Engine workflow for their Star Wars-related ‘Reflections’ demo, which also took advantage of real-time ray tracing. Another Unreal Engine demo from Kite & Lightning showed how a head-mounted iPhone capturing a facial performance could drive the performance of a CG avatar – in this case, a baby – in real-time with convincing results.
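The general idea behind such a plug-in can be sketched simply. The following is an illustrative approximation, not Epic’s actual API – all class and function names are invented: a tracked handheld device streams pose samples each frame, and the engine side applies them to a virtual camera, optionally scaled so that small handheld moves cover large virtual distances.

# Conceptual sketch only - not Epic's plug-in code; names are hypothetical.
# A tracked device (e.g., a phone running AR tracking) streams pose samples;
# the engine applies them to a virtual camera each frame.
from dataclasses import dataclass

@dataclass
class VirtualCamera:
    position: tuple = (0.0, 0.0, 0.0)
    rotation: tuple = (0.0, 0.0, 0.0, 1.0)  # identity quaternion

@dataclass
class DevicePose:
    position: tuple  # (x, y, z) in meters from the device tracker
    rotation: tuple  # quaternion (x, y, z, w)

def apply_pose(camera, pose, world_scale=1.0, origin=(0.0, 0.0, 0.0)):
    # world_scale > 1 turns small handheld moves into crane-sized ones.
    camera.position = tuple(o + world_scale * p
                            for o, p in zip(origin, pose.position))
    camera.rotation = pose.rotation  # orientation is applied 1:1

# One frame of a streamed pose: device 10cm right of origin, 1.6m up.
cam = VirtualCamera()
apply_pose(cam, DevicePose((0.1, 1.6, 0.0), (0.0, 0.0, 0.0, 1.0)), world_scale=4.0)
# cam.position is now (0.4, 6.4, 0.0)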

TOP: A still from ‘Reflections,’ a real-time ray-tracing demo in Unreal Engine 4, completed in conjunction with ILMxLAB and NVIDIA. (Image courtesy of Epic Games) BOTTOM: At SIGGRAPH, ILMxLAB and Epic Games demonstrated the ‘Reflections’ piece during the Real-Time Live! event. (Image courtesy of SIGGRAPH)





“The turning point for MPC was The Jungle Book. That’s when we realized how important virtual production is and we embarked on this new adventure. Tracking all the assets, the scenes, the cameras, the animations and all of the modifications is key. We couldn’t find any available third-party solution that could give us that out of the box… so we turned to game engines, not as the definitive solution, but rather as a piece of a much more elaborate framework.” —Francesco Giordana, Realtime Software Architect, MPC

Unreal Engine is also used by several studios to do live-action real-time compositing, yet another aspect of virtual production filmmaking. Epic’s vision for the future of virtual production with Unreal Engine is the idea of virtual collaboration. “If you make a movie by yourself,” says Libreri, “you don’t really need a collaboration system, but that’s typically not the way it is. You usually have somebody working on lighting, somebody working on set layout, somebody doing the camera work, someone directing. We wanted to extend the editor so that you can have multiple editor sessions connected together from different computers and from different users. And then we also wanted it to run as a VR application, where they might not even be in the same location.”

Outside of more mainstream projects, Epic has even enabled virtual production to take place within its popular multiplayer game, Fortnite. Here, players can find one of the game’s hidden ‘greenscreens,’ capture themselves dancing, and then use the replay system to output the footage to be combined via a traditional editing tool with other footage. “It’s not just about pros,” says Libreri. “We’re enabling the YouTube/Twitch generation to make little movies and express themselves.”

ON-SET PRODUCTION

Virtual production systems tend to rely on camera-tracking technology that enables virtual graphics to be superimposed in real-time onto live-action photography. Ncam has been offering that ability for several years, with experience on large-scale films such as Hugo, White House Down and Solo: A Star Wars Story. Among newer projects are Deadpool 2, Aquaman and The Nutcracker and the Four Realms. It also has a major footprint in real-time broadcast graphics.

TOP LEFT: Kite & Lightning’s Cory Strassburger demonstrates his Bebylon project using an iPhone X, Xsens motion-capture suit, IKINEMA software and Unreal Engine. (Image courtesy of Kite & Lightning) TOP RIGHT: This Ncam Real Depth demo shows how the presenter against greenscreen is able to be captured and composited – in real-time – behind a separate element in the footage via depth data. BOTTOM RIGHT: Real Light is Ncam’s new approach to rendering virtual graphics in real-time and integrating them into a real-world environment, as well as adding in dynamic shadows and lighting changes.



The classic use of Ncam on a feature film or television show has been to visualize CG environments or set extensions, typically by replacing large amounts of greenscreen with a previs asset. Says Ncam CEO Nic Hatch: “This allows the filmmakers to compose and frame the shot through the eyepiece as if the CG set were physical. Lens choices can be made, and the zoom, lens distortion and focus are all interactive as per a live video feed. We’ve also seen animated assets being used, such as vehicles and aircraft. These types of shots can be really useful for trigger points and reaction and interaction with the actors.”

A New Tool on the Virtual Block

A group of Netherlands-based filmmakers and visual effects artists have embarked on their own virtual production platform called DeepSpace. The tool set is a collaboration between Planet X Technologies and ScreenSpace Lab, and is aimed at helping filmmakers craft previs with virtual camera tools. Essentially, their system works by allowing someone to hold a physical camera in their hand, rigged with common camera controls. The user can see the scene inside VR goggles, and can move around the scene to experiment with angles, lenses and camera movement. Later, this can be adjusted further and be used to produce a technical visualization of the location.

“We noticed that there is a reluctance with directors, DPs and other filmmakers to engage in creating quality previs for their projects,” says ScreenSpace Lab’s Marijn Eken of the move to develop DeepSpace. “Communicating to the 3D and previs animators what kind of shots they want to see can be a frustrating and slow process from their perspective. We wanted to give back control to the filmmakers by letting them create the shots themselves in an intuitive way without having to know how to operate 3D software.”

DeepSpace is targeting the European market, focusing on productions that may have a complex scene that is difficult to grasp or visualize. “Our workflow is to invite production designers to review their own designs in VR and together with the director make decisions on what they actually need to build,” explains Eken. “A stunt coordinator can give us input on how he would approach a stunt. And, finally, the director of photography, together with the director, can effectively have a ‘shooting day’ in our system and download the recorded clips to edit into a previs movie.”


“We wanted to give back control to the filmmakers by letting them create the shots themselves in an intuitive way without having to know how to operate 3D software.” —Marijn Eken, Founder/Developer, ScreenSpace Lab

CLOCKWISE: A recent Dutch feature film used DeepSpace to figure out a shoot that needed to take place on a highway with precision drives and near-miss accidents. (Image courtesy of DeepSpace) CG car assets and a virtual set enabled the filmmakers to imagine the roadway sequence by testing out lenses, framing, pacing and action. (Image courtesy of DeepSpace) A shot from the final sequence shows how the live action closely matches the virtual production-enabled previs, as created with DeepSpace. (Image courtesy of DeepSpace)





“[Ncam] allows the filmmakers to compose and frame the shot through the eyepiece as if the CG set were physical. Lens choices can be made, and the zoom, lens distortion and focus are all interactive as per a live video feed.” —Nic Hatch, CEO, Ncam

Ncam’s toolset is made up of camera-tracking tech called Ncam Reality, which is both a hardware and software solution. The hardware consists of a sensor bar with optical and mechanical sensors that attaches to the main camera, and a server where the software sits. Ncam’s software includes 3D point-cloud generation, which makes the camera tracking marker-free. The company also has a depth-sensing option called Real Depth, which can extract actors from greenscreens with interactive depth-based compositing.

In recent times, Ncam has been moving forward on experimentations with LED screens rather than greenscreens, more augmented reality projects, as well as techniques to allow for automatic relighting of CG elements based on the real-world lighting in a scene (its product will be called Real Light). “The system figures out where the real-world lights are in terms of direction, brightness and color, and recreates those in real time,” explains Hatch. “It also computes reflections, image-based lighting and shadows. We’re really excited by the possibilities Real Light offers, not just for real-time applications, but also in terms of data collection on set and location, and then bringing that data back to set to recreate the exact lighting conditions, to take out the guesswork.”
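Depth-based compositing of the kind Real Depth performs rests on a simple per-pixel rule: given depth for both the live plate and the CG render, each output pixel comes from whichever layer is nearer the camera, so an actor can pass in front of or behind virtual elements without a keyer. A minimal NumPy sketch of the idea – illustrative only, not Ncam’s implementation:

# Illustrative sketch of depth-based compositing (not Ncam's code).
# Given RGB plus per-pixel depth for the live-action plate and the CG
# render, each output pixel is taken from whichever layer is closer.
import numpy as np

def depth_composite(live_rgb, live_depth, cg_rgb, cg_depth):
    """live_rgb/cg_rgb: (H, W, 3) float arrays; *_depth: (H, W) in meters."""
    live_in_front = live_depth < cg_depth    # boolean mask, per pixel
    mask = live_in_front[..., None]          # broadcast over RGB channels
    return np.where(mask, live_rgb, cg_rgb)

# Toy usage: a 2x2 frame where the actor occludes the CG on one pixel.
live = np.ones((2, 2, 3)) * 0.8              # flat grey 'actor' plate
cg = np.zeros((2, 2, 3))                     # black 'CG' layer
live_d = np.array([[1.0, 5.0], [5.0, 5.0]])  # actor 1m away at top-left only
cg_d = np.full((2, 2), 3.0)                  # CG element 3m away everywhere
out = depth_composite(live, live_d, cg, cg_d)  # top-left pixel shows the actor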


TOP LEFT: Ncam’s sensor bar attached to a camera enables elements of the environment to be tracked in terms of position and depth for the purposes of augmented reality and real-time VFX. BOTTOM: A frame composed through the Unity game engine for Book of the Dead, which was made with MPC’s Genesis system. An important philosophy behind the Genesis approach is to replicate how live-action filming is achieved. (Image courtesy of MPC)


TOP RIGHT: Previs for a ship at sea demo crafted using DMM’s Expozure virtual production tool. The virtual dolly is highlighted in the frame. (Image courtesy of Digital Monarch Media)




PROFILE

PHIL TIPPETT: FOLLOWING HIS IMAGINATION TO THE STARS AND BEYOND By CHRIS McGOWAN


Throughout his career, Phil Tippett, VES has shown that imagination “will find a way.” He has brought all sorts of memorable creatures to life in movies like Star Wars, Jurassic Park, Starship Troopers and the Twilight Saga, while expanding the possibilities of special effects with his creative and technical skills. His dreams have become our dreams and left their mark on global culture. His firm, Tippett Studio, has provided VFX for many notable films and branched into the growing art form of fixed-location entertainment incorporating digital imagery. Along the way, Tippett has garnered two Academy Awards and an Emmy, the George Méliès Award for Artistic Excellence from the Visual Effects Society (VES), and numerous Oscar nominations with Tippett Studio.

“Phil Tippett has functioned as one of the major engines pushing the visual effects industry further than it was ever thought possible,” said VES Executive Director Eric Roth, when Tippett received the George Méliès Award in 2009. “Phil has become one of the giants of our industry by pioneering new ways to make the fantastical a practical reality.”

The seeds for Tippett’s career were planted early. Born in 1951 in Berkeley, California, he was seven years old when he saw The Seventh Voyage of Sinbad, which featured Ray Harryhausen’s stop-motion animation technique called Dynamation. The movie’s monsters and special effects were like “a lightning bolt” for Tippett. He started teaching himself how to make rudimentary stop-motion films with clay sculptures, articulated G.I. Joe dolls, and an 8mm movie camera that could film frame by frame. His parents worried that his pursuit was obsessive and a complete waste of time. They talked to a psychologist. They weren’t any happier when he started reading Forrest J. Ackerman’s magazine Famous Monsters of Filmland, whose articles about Harryhausen made his work more palpable to the young Tippett.

Later, he would meet Ackerman and attend gatherings at his Hollywood home, including when Harryhausen was in attendance. Tippett was often there with other aspiring filmmakers: Dennis Muren ASC, VES; Tom St. Amand; Jon Berg; and Rick Baker among them – all of whom went on to work on Star Wars. Tippett recalls that Harryhausen wouldn’t give away his trade secrets, but was always encouraging the young artists.

After obtaining a fine arts degree from UC Irvine in 1974, Tippett found work at Cascade Pictures in Hollywood. “That was kind of our school where we were mentored. There wasn’t anybody doing any stop motion in the United States really, except at Cascade. I knew that Jim Danforth was working there, so I arranged a tour and met with Danforth, Dave Allen and Dennis Muren, and we worked for Phil Kellison, who was our mentor. I was brought in initially as a model maker and then eventually a sculptor-animator. That was our graduate work. Most of the stuff we’d previously just done at home.”

Then, thanks to Tippett’s close connections to his animator peers, he became involved with the first Star Wars movie. A friend told Tippett that he knew a guy “who’s making a science fiction movie and he’s looking for people, so you should give him a call.” Tippett continues, “So I called Richard Edlund [ASC, VES] and he was looking for camera people. It was not my forte, but I gave him Dennis [Muren]’s number, and so Richard hired Dennis.

“George [Lucas] wanted to do some insert shots for the cantina scene [the dingy Tatooine dive bar],” says Tippett, “and Dennis hooked George up with Rick Baker, and Rick Baker hired me and three other out-of-work stop-motion animators.” Tippett worked on the cantina creatures. “During that period George saw some stop-motion puppets that I had, and that gave him the idea for the [holo]chess set with stop-motion. This was right at the end of the schedule, two weeks to go before everything had to be done,” relates Tippett. “So George said, ‘Make me 10 alien monsters.’” Tippett, Baker and others spent about a week working on new creatures (as well as recycling some puppets Tippett had previously made) and shot them over a few days.

Tippett was head of ILM’s “creature shop” for Star Wars: Episode V – The Empire Strikes Back (1980), and worked with Jon Berg and Muren to animate the AT-AT Imperial Walkers and Tauntaun animals with a pioneering stop-motion technique (“go motion”) that added “motion blur” to make model movement more realistic and less jerky.

For Star Wars: Episode VI – Return of the Jedi (1983), Tippett designed the inimitable Jabba the Hutt. “Ralph McQuarrie, Joe Johnston, Nilo Rodis-Jamero and I all contributed designs, and George eventually picked the design that I came up with.” Initially, Lucas wasn’t finding what he wanted. “So I asked him, ‘If you were going to cast an actor to play Jabba, who would that be?’ And he thought for a moment and said, ‘Sydney Greenstreet’ [a corpulent English actor]. At that moment I got a flash and came up with the design.” Tippett also developed the fearsome Rancor dungeon-dwelling creature. He, Muren, Richard Edlund and Ken Ralston received an Academy Award for Return of the Jedi – a Special Achievement Award for Visual Effects.

Tippett subsequently set up Tippett Studio in Berkeley with his wife, Jules Roman, in 1984 (she is the company president) and directed the short dinosaur film Prehistoric Beast, utilizing his “go motion” animation technique. The film depicts a Tyrannosaurus’s pursuit of a Monoclonius. Tippett hoped to sell it to the educational market. “I knew a lot of stuff. I’d been working with paleontologists from UC Berkeley.” Ultimately, some television producers became interested, and it was turned into the full-length documentary Dinosaur!, with Tippett providing the dinosaur sequences. It won a Primetime Emmy Award for Outstanding Visual Effects in 1986.

“I asked [George Lucas], ‘If you were going to cast an actor to play Jabba [the Hutt in Return of the Jedi], who would that be?’ And he thought for a moment and said, ‘Sydney Greenstreet’ [a corpulent English actor]. At that moment I got a flash and came up with the design.” —Phil Tippett

OPPOSITE TOP: Phil Tippett, VES (Image courtesy of Tippett Studio) OPPOSITE BOTTOM: Tippett; Dennis Muren ASC, VES; Ken Ralston and Richard Edlund ASC, VES shared an Academy Award for Star Wars: Episode VI – Return of the Jedi – a Special Achievement Award for Visual Effects. (Image courtesy of Tippett Studio) TOP: Tippett and a Tauntaun maquette for Star Wars: Episode V – The Empire Strikes Back. (Image courtesy of Tippett Studio) BOTTOM: The Imperial Walkers. (Image courtesy of Tippett Studio and Walt Disney)






“I’d been working with Dennis [Muren], and we’d been prepping Jurassic Park for quite a while. I was very aware of the work they had been doing with computer graphics. So when they got to the stage of Spielberg committing to go with computer graphics, yeah, everything changed for me.” —Phil Tippett

TOP LEFT: Tippett and a “go motion” set-up for The Empire Strikes Back. (Image courtesy of Tippett Studio) MIDDLE LEFT: Luke Skywalker (Mark Hamill) on a Tauntaun. (Image courtesy of Tippett Studio and Walt Disney) BOTTOM LEFT: Tippett and an Imperial Walker in The Empire Strikes Back. (Image courtesy of Tippett Studio) TOP RIGHT: The Rancor beast of Return of the Jedi and its creator, Phil Tippett. (Image courtesy of Tippett Studio) MIDDLE RIGHT: Tippett works on RoboCop. (Image courtesy of Tippett Studio) BOTTOM RIGHT: RoboCop was another example of Tippett’s “go motion.” (Image courtesy of Tippett Studio and Orion Pictures)


Tippett and/or his studio worked on such projects as Dragonslayer (1981), RoboCop (1987) and Willow (1988), all noteworthy for their special effects. And then came Jurassic Park (1993), a life-changing event for Tippett. There is an oft-told story that Tippett was at the ILM test screening where Dennis Muren showed off the possibilities of photorealistic animation to Steven Spielberg and others, and when Phil saw the CGI dinosaurs he said, “I think I’m extinct.”

“That’s true,” says Tippett. “But it came as no surprise. I’d been working with Dennis, and we’d been prepping Jurassic Park for quite a while. I was very aware of the work they had been doing with computer graphics. So when they got to the stage of Spielberg committing to go with computer graphics, yeah, everything changed for me.”

Tippett saw it as a “shot to the head” of stop-motion animation, but Spielberg called him and urged him to participate in Jurassic Park, for both his filmmaking skills and his in-depth knowledge of dinosaurs. “Steven didn’t want them portrayed as monsters. He wanted them portrayed as the animals they were, behaviorally. I was tasked with keeping everything on track. We’d go through the script. Michael Crichton, because he was writing, didn’t worry about things like scale. [In his writing] the T-Rex picks up a car and shakes it in his jaws, like in a Godzilla movie, but it wouldn’t do that. The car was too heavy for a Tyrannosaurus to lift. I was just keeping things on track.” Tippett adds, “In the end, I got kicked upstairs, so I ended up having more of a supervisory capacity.” He was, in fact, credited as “Dinosaur Supervisor” for the film and oversaw both ILM and Tippett Studio animators, and made sure the animals seemed realistic and alive. It all turned out rather well for him: Tippett, Dennis Muren, Stan Winston and Michael Lantieri shared an Oscar for Best Visual Effects.

Tippett Studio animators began their transition to computer-generated animation on Jurassic Park by developing the DID (Digital-Input-Device) system, which linked a computer animation program to sensors in the joints of a stop-motion armature. This later earned Craig Hayes (Tippett Studio co-founder/VFX Supervisor) and others an Academy Award for Scientific and Technical Achievement.

Tippett’s studio has supplied VFX for more than 70 films, including Dragonheart, Ghostbusters II, Honey, I Shrunk the Kids, Starship Troopers, Armageddon, The Haunting, Mission to Mars, The Matrix Revolutions, Cloverfield, The Spiderwick Chronicles, the Twilight series, Ted, Jurassic World and Star Wars: The Force Awakens. Tippett also worked on the holochess scene for Solo: A Star Wars Story. Tippett Studio reconstructed pieces that had eroded over time and utilized stop motion in-house for the new version.

Tippett Studio is now also involved in producing fixed-location immersive entertainment. Its first effort was the Dream of Anhui “flying theater” ride with CGI and sensory effects for a Wanda theme park in Anhui, China. “Now we’re doing a bunch of other ones, mostly Chinese.”
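Conceptually, the DID workflow is simple, even if the hardware engineering was not: encoders at each armature joint report angles, and software copies those angles onto the matching CG rig joints and sets a keyframe for every stop-motion pose. A hypothetical Python sketch of the idea – the classes below are stand-ins, not Tippett Studio’s actual system:

# Hypothetical sketch of a digital-input-device workflow (not Tippett
# Studio's system): joint encoders on a stop-motion armature are sampled
# once per pose, copied onto the CG rig, and keyed, so the animator keeps
# their stop-motion craft while the output is computer animation.
class JointEncoder:
    """Stand-in for a hardware angle readout on one armature joint."""
    def __init__(self, angles):
        self._angles = angles
    def read_degrees(self):
        return self._angles  # (rx, ry, rz)

class RigJoint:
    """Stand-in for a joint in an animation package's CG rig."""
    def __init__(self):
        self.rotation = (0.0, 0.0, 0.0)
        self.keys = {}
    def set_rotation(self, angles):
        self.rotation = angles
    def set_keyframe(self, frame):
        self.keys[frame] = self.rotation

def capture_pose(armature, rig, frame):
    """Copy every armature joint's encoder angles to the rig and key them."""
    for name, encoder in armature.items():
        rig[name].set_rotation(encoder.read_degrees())
        rig[name].set_keyframe(frame)

# One stop-motion pose: the animator adjusts the armature, then captures.
armature = {"neck": JointEncoder((10.0, 0.0, 5.0))}
rig = {"neck": RigJoint()}
capture_pose(armature, rig, frame=1)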

TOP: Tippett’s work on his own “Prehistoric Beast” short film led to the documentary Dinosaur! His dinosaur knowledge came in handy several years later on Jurassic Park. (Image courtesy of Tippett Studio) MIDDLE: Velociraptors on the loose in Jurassic Park. (Image copyright © Universal Pictures) BOTTOM: Conceptualizing the T-Rex of Jurassic Park. (Image courtesy of Tippett Studio)






Tippett has also been finishing his own feature-length stop-motion epic called Mad God, which he says is impossible to describe. “It’s like everything but the kitchen sink that I’ve been thinking about for the last 60 years kind of rolled into one.” He screened the film-in-progress last year at New York’s MoMA (Museum of Modern Art).

In terms of advice for aspiring VFX artists, Tippett says, “One thing is to assess your own level of skill and talent. And you just need to be somewhat obsessed with wanting to do this stuff.”


TOP LEFT: Chewbacca enjoys holographic chess with Woody Harrelson as Beckett in Solo: A Star Wars Story. (Image © Walt Disney) MIDDLE LEFT: Holochess in Solo: A Star Wars Story. (Image © Walt Disney) BOTTOM LEFT: Scenery from Dream of Anhui. (Image courtesy of Tippett Studio and Wanda) TOP RIGHT: Landscapes from Dream of Anhui. (Image courtesy of Tippett Studio and Wanda) MIDDLE RIGHT: Mad God is Tippett’s epic stop-motion work in progress. (Image courtesy of Phil Tippett) BOTTOM RIGHT: A scene from Phil Tippett’s Mad God. (Image courtesy of Phil Tippett)




FILM

FROM SCRIPT TO PREVIS AND, SOMETIMES, VICE VERSA By IAN FAILES

THE PLANT GATE BLOCKED BY THE ARMADA OF VEHICLES, Logan drives straight at the chainlink fence – and we prepare for a classic action movie fence smash.

LOGAN
Hold on!

BUT THE LIMO HITS THE FENCE, BENDS THE SUPPORTS AND GRINDS TO A PATHETIC HALT, TANGLED IN MESH. As it would in real life.

That’s an excerpt from the screenplay for Logan (story by James Mangold, screenplay by Scott Frank, James Mangold, and Michael Green). It’s a moment in the story that was fine-tuned with the help of previs.

Getting the director’s vision on screen is always the goal for the actors and crew on a feature film, but it’s often the previsualization, or previs, team who gets a crack at it first. Previs artists tend to take a script or an outline and turn it into story beats, preview imagery and breakdowns for how shots can be filmed. Occasionally, their work also informs scripted moments or generates further ideas for the director during filming (and often extends to postvis after principal photography). Either way, the previs department has become crucial in the age of complicated visual effects and action filmmaking.

VFX Voice asked three previs studios – HALON Entertainment, The Third Floor and Proof – how their artists worked at the earliest stages of production to flesh out story ideas and key moments for the films Logan, Ant-Man and the Wasp and A Wrinkle in Time.


OPPOSITE TOP: A previs’d moment in Logan’s limousine chase scene by HALON. TOP: The limo reverses after attempting to burst through a fence, only to have part of the fence stick to the vehicle. This was a moment ‘found’ through the previs process. BOTTOM: HALON also delivered postvis for the sequence where the limo is forced to slow down, and gets wrapped up in the metal and barbed wire.

A FENCE STORY

In Logan, director James Mangold depicted the X-Men character Wolverine (played by Hugh Jackman) with much-applauded grittiness. Helping him on that character journey from early on was HALON Entertainment. One sequence in particular summed up the previs studio’s work – Logan, driving a limousine, escapes a group of adversaries with fellow mutants Charles Xavier (Patrick Stewart) and Laura (Dafne Keen). A key moment in the final film has the limo get stuck on a piece of fencing yet continue to drive, something that actually came about as the previs was explored.

“In the beginning of the previs work, when interpreting the script pages, Jim asked me to showcase how un-heroic the life of Logan was,” recalls HALON Previsualization Supervisor Clint Reagan. “The script I started with had the limo driving around in circles avoiding capture, but upon seeing it play out it was not believable for Jim. He wanted something more realistic and difficult for Logan to manage. Nothing goes right for him, everything is hard and not fair. Right about this time we had lost the original location for the shoot that we had previs’d the entire sequence to. Now a new location was being sought.”




TOP: A still from Proof’s previs for the Mrs. Whatsit flight sequence in A Wrinkle in Time. BOTTOM: Proof used a toon shader effect on the final previs to give a sense of ‘work in progress.’


So Reagan oversaw previs that could explore more options for the director, such as arranging the car so that Logan could just back out of the compound he was trapped in. This was deemed too ‘slick’ and convenient by Mangold. “In giving notes,” says Reagan, “Jim ranted about the Hollywood trope that so many conflict resolutions aren’t real enough. He said, ‘Like, when cars burst out of fences on cue as if they were not there. If you or I drove out of a fence right now outside you’d probably get caught up in it.’ Everyone listening laughed as he described the list of absurdities we all see in action movies. I liked how plainly this idea rebuffed convention, and I decided to previs it exactly as he had mocked. Jim was intrigued by the first pass and a bit surprised we took him so literally, but in the end it did exactly what we hoped it would.”





Tackling Techvis

For Ant-Man and the Wasp, The Third Floor delivered techvis for the school sequence in which Ant-Man (Paul Rudd) performs scenes in a ‘miniaturized’ form against a normal-sized Wasp (Evangeline Lilly). It was used, as The Third Floor Techvis Lead Ariel Feblowitz describes, “to help figure out how to shoot character size differences, and what impact this would have on camera speed, distance to subject and parallaxing.”

“For instance,” adds Feblowitz, “if a character was meant to be half of its regular size and they were to be standing next to a normal-sized person/environment, the camera filming them needed to be twice as far away and moving twice as fast. If walking, the smaller character needed to travel double speed to keep up with the regular-sized character. All of this needed to be mathematically precise, the camera angles had to match exactly and camera heights and distances needed to be exact within a fraction of an inch.

“The props the characters interacted with had to be scaled correctly. It was all thoroughly planned. For shots that required a moving camera, a motion-control rig was brought in so the movements would match exactly. We would track the camera filming the normal-sized element, scale the camera motion to accommodate larger or smaller characters, and feed that information into the motion-control rig to get all of the exact movements.”
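Feblowitz’s arithmetic generalizes neatly: for a character at scale s relative to normal, the camera shooting that character’s element sits 1/s times as far away and moves 1/s times as fast. A small illustrative calculation, with assumed numbers chosen purely for demonstration:

# The sidebar's rule of thumb as arithmetic (illustrative only): to keep
# framing identical for a character at scale s, the camera shooting that
# character's element must be 1/s times as far away and move 1/s as fast.
def camera_params_for_scale(scale, base_distance_m, base_speed_mps):
    """scale: character size relative to normal (0.5 = half-size)."""
    factor = 1.0 / scale
    return base_distance_m * factor, base_speed_mps * factor

# Half-size Ant-Man next to a normal-sized Wasp: the camera must be twice
# as far away and move twice as fast, exactly as described above.
dist, speed = camera_params_for_scale(0.5, base_distance_m=2.0, base_speed_mps=0.5)
# dist == 4.0 m, speed == 1.0 m/s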

TOP: Techvis helped establish the dimensions of a ‘small Scott’ for a sequence at a school in Ant-Man and the Wasp. BOTTOM: Wasp and Ant-Man had to appear together – but as different sizes – in the school scenes.

“[Techvis was used] to help figure out how to shoot character size differences, and what impact this would have on camera speed, distance to subject and parallaxing.” —Ariel Feblowitz, Techvis Lead, The Third Floor

Meanwhile, the production searched for a suitable location, with the fence-crash shot now in the mind of the director. Even the script was updated to include it. “The stunt evolved in previs as we designed outward from the fence beat to account for how the rest of the scene would be affected,” describes Reagan. “At this point we didn’t have a location, so we adjusted the old location by moving buildings to allow for the creative needs I had in previs.”

Eventually a new location was found, and HALON adjusted its previs shots to a matching digital version of the location. Says Reagan: “My team of animators and I went through the entire sequence again. We solved camera blocking problems, adjusted animation and added new beats to connect the scenes together. I’m told that irony reared its ugly head on the day of the shoot, and the fence fell right over. So they had to cable the car to stop it from blasting right through!”






WHEN PREVIS BECOMES TECHVIS

TOP: Ant-Man’s ‘Giant Man’ interacts with a ferry in this previs shot. The Third Floor would also deliver postvis for the scene once the live-action ferry scene was filmed. BOTTOM: Wasp looks on from hiding in her miniature form in this previs shot from the film.


Sometimes previs is used just for working out story beats, but very often it is taken one step further, to ‘techvis.’ It’s at this stage that set diagrams, camera positions and rigging locations, for example, can become part of the previs delivery to see if the planned shots are actually achievable. Proof delivered a combination of previs and postvis in this way for Ava DuVernay’s A Wrinkle in Time, including for a scene of the green flying creature, Mrs. Whatsit (Reese Witherspoon), taking a group of children into the skies.

“The flying sequence was the first thing we worked on,” says Proof Previs Supervisor Christopher Batty. “It started with some initial discussions with Ava along with a few keyframes from storyboard artist John Fox. We did a short concept animation that was presented to the studio early on. It explored how the movement of the kids would look over the top of ‘Creature’ Whatsit. It was very important for Ava to have the kids do something different than the typical riding or sitting on top of the creature. She wanted to see the kids truly flying and for the characters to experience that freedom and exhilaration.

“Once we had the basic concept, we moved onto exploring how the kids moved across the creature’s back. We explored how they would lift off and how far they moved along the back and from each other. We looked at footage of sky divers for reference and, specifically, indoor skydiving. Once up in the air we developed this action of her fully spreading out her wings so that the kids had more room to fly around on her back. It solved a few logistical as well as aesthetic issues.”

The major step forward on this sequence for Proof was a further collaboration with motion-control camera-system makers Robomoco, which would be operating their robotic arms on set and providing the illusion of the children being mid-air. Proof came in to ensure that the operation of the robotic arms could achieve the planned shots. “We received accurate models of Robomoco’s robotic arms and worked on scripts together in order for our systems to communicate with each other,” outlines Batty. “Once we had a previs of the flying actions that Ava had approved, we got to work on translating those animations into movements that the robotic arms could achieve. We had special scripts that basically translated our animated robotic arms into a file that would drive the machines. So after a few tests it was relatively easy to go back and forth from our Maya scenes into the Robomoco arms.”

Proof provided previs with a unique toon shader look, something requested by the director that signaled previs was part of the continual evolution of shot and story design. “She liked that it had a ‘work in progress’ visual connotation,” says Batty. “Since the beginning of the process was a creative exploration, we didn’t want anything to look too ‘finished’ or locked in. The toon shader helped to communicate that we were still open to changes up until the shoot. It can also be quicker to set up characters and other previs assets. With the toon shader’s more abstract, stylized look, we can worry less about the materials and lighting and concentrate more on action and composition.”
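The toon look Batty describes is conventionally achieved by quantizing a standard diffuse term into flat bands. A minimal sketch of that idea, not Proof’s actual shader:

```python
def toon_shade(normal, light_dir, bands=3):
    """Quantize a Lambert diffuse term into flat bands -- the core of
    a 'work in progress' toon look. `normal` and `light_dir` are unit
    3-vectors; returns an intensity in [0, 1]."""
    ndotl = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    # Snap continuous shading to one of `bands` discrete levels
    return round(ndotl * (bands - 1)) / (bands - 1)

# A surface angled 45 degrees off the light snaps to the middle band
print(toon_shade((0.0, 0.7071, 0.7071), (0.0, 0.0, 1.0)))  # -> 0.5
```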
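Stepping back to the Robomoco hand-off Batty describes above: the general pattern for driving a motion-control rig from animation is to sample the approved animation once per frame and serialize timestamped joint values into a file the rig’s controller can ingest. A toy sketch under those assumptions — the CSV format, frame rate and names here are hypothetical, not Robomoco’s actual interchange format; in production the sampling callable would query the animated arm rig in Maya:

```python
import csv
import math

FPS = 24  # film frame rate

def export_arm_program(sample_joint_angles, n_frames, path):
    """Sample an animated arm once per frame and write timestamped
    joint angles (degrees) to a CSV a motion-control controller could
    read. `sample_joint_angles(frame)` stands in for querying the rig;
    here it is just a callable returning six axis values."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["time_s"] + [f"axis{i}" for i in range(1, 7)])
        for frame in range(n_frames):
            writer.writerow([round(frame / FPS, 5), *sample_joint_angles(frame)])

# Dummy animation: gentle sinusoidal motion on all six axes
export_arm_program(
    sample_joint_angles=lambda fr: tuple(
        round(10 * math.sin(fr / FPS + i), 3) for i in range(6)
    ),
    n_frames=120,
    path="arm_program.csv",
)
```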






COLLABORATION IS KEY

TOP: Techvis for the Mrs. Whatsit shots involved coordinating how the flying could be achieved with Robomoco’s robot arms. BOTTOM: A scene from The Third Floor’s previs for the car chase sequence in Ant-Man and the Wasp.


Marvel films are, of course, big users of previs, often for brainstorming, and with the strong notion that things will change along the way – or even be abandoned as the story and edit continue to develop. On Peyton Reed’s Ant-Man and the Wasp, for example, The Third Floor contributed previs while the script was still being worked on and as different crew members for the film came on board.

In terms of day-to-day collaboration, previs studios tend to work with many department heads. For Ant-Man and the Wasp, The Third Floor Visualization Supervisor Jim Baker says they regularly talked to the director [Reed], producer Stephen Broussard, Visual Effects Supervisor Stephane Ceretti, Stunt Coordinator George Cottle and second-unit director Jeff Habberstad. The film’s San Francisco miniature car chase involved contact with just about all of these crew members as the sequence itself evolved.

“The scene was initially a much smaller sequence and earlier in the script than appears in the final film,” notes Baker. “Early on, Peyton and Stephen had us trying out different scenarios with the chase. They would review these and pick the ones they liked, which ended up informing the script. We would meet with Peyton two or three times a week. In a typical week, we would have a kickoff on Monday using, say, a beat sheet. We would then get to work immediately blocking in previs. As part of the visualization, we also cut together shots from other movies to give a flavor of what the sequence could be like. The car chase grew so much that it moved to the third act and became a central structure for the climax.”

“The nature of previs,” adds Baker, “is such that we may visualize even more than is used in the film and, while some of the ideas may end up on the cutting room floor, other aspects that have been previs’d can end up informing the feel of the film or may inspire something for a completely different part of it than originally intended. On some projects, there are storyboards throughout, but on this one we went directly from the script and from pitches for our action sequences. Peyton was extremely enthusiastic about exploration and was always welcoming to ideas.”

Baker identifies the film’s kitchen fight as a good example of this exploration. “Peyton kicked it off to our team, the visual effects team and the stunt team. As we all brainstormed ideas, it led to concepts like Wasp running along the knife and the goons smashing all of the vegetables while trying to get her to come out. These types of creative outcomes happen when the whole team is open to collaboration, and it starts at the top with the leadership of the director and producer.”





FOCUS

VFX IN THE U.K.: FROM COTTAGE INDUSTRY TO GLOBAL POWERHOUSE – WITH PAUSE FOR BREXIT By ADRIAN PENNINGTON

The U.K. visual effects scene has never been more exciting, nor more uncertain. The bonanza of feature and TV production on the island shows no sign of abating, with facilities and studios in the capital and across the U.K. close to bursting, even while the country’s pending departure from the EU threatens to derail the economic progress of the industry.

Figures released by the British Film Institute show that 2017 was the largest year on record for international spending in the U.K., with film leaping by 23% from 2016 to $2.4 billion, propelled by shoots for blockbusters like Ready Player One, Tomb Raider, Jurassic World: Fallen Kingdom, Solo: A Star Wars Story and Mission: Impossible – Fallout. Meanwhile, TV rocketed 27% to $967 million.

But it’s not just about volume. The scale and complexity of visual effects is at an all-time high, driving multiple VFX hubs across the world, with London the indispensable axis. Six of the last eight VFX Oscars have gone to British shops. “The most demanding work is being placed in the U.K., not only because it is very competitive with the rest of the world, but because we have a very mature infrastructure and world-class talent pool,” says Cinesite’s Director of Animation, Eamonn Butler.

“As much as we’ve grown elsewhere, the U.K. remains the largest part of our global operation for film and VFX,” says Matt Fox, Joint Global MD at Framestore, 2018 Oscar winner with DNEG for Blade Runner 2049. To cope with the amount of work, Framestore opened bigger offices earlier this year, a short walk from Soho, adding another 100 artists to bring its total U.K. staff above 1,000.

“London’s massive pull is testament to the richness of artistic talent available to us,” says Philip Greenlow, Executive Producer, MPC Film. “The city has always drawn a large workforce from the U.K. and Europe, and the depth and breadth of skills they possess in computer graphics cannot be overlooked.”

TRACK RECORD

TOP: Mary Poppins Returns (Photo: Jay Maidment. Image copyright © 2018 Walt Disney Enterprises)


The U.K.’s digital prowess emerged on the back of a strong physical production presence stemming from a period in the late 1970s when films like Star Wars, Superman and Alien shot at Pinewood. “The directors making those movies were among the first to seed the idea of visual effects as part of the filmmaking process as opposed to a post service,” says Fox. “It also positioned the U.K. culturally as a place where production and post disciplines merge.”

During the decade between the first and last Harry Potter films (2001-2011), the U.K.’s VFX industry grew from cottage industry to global powerhouse, as more and more work traditionally destined for the west coast of America was channelled to London. “In L.A., Mumbai or Beijing there’s a great respect for the level of polish and finishing that our VFX community is able to put on a production,” says Will Cohen, CEO at Milk.

The English language plays a part, of course, as does London as a vibrant city where senior creatives want to spend time. Soho’s geography remains unique, with all the facilities, screening rooms and networking available in a buzzing square mile. The cluster of studios on the outskirts of the capital is unrivalled, with even more stages – including a major 20-acre site in East London run by California’s Pacifica Ventures due in 2020.

The Brexit Threat

The U.K.’s departure from the European Union is timed for this March, with uncertainty about its impact looming large in the sector. That’s chiefly because Brexit is designed to curb the movement of labor, and since the U.K.’s VFX community is built on attracting the best overseas talent, the cost and volume of red tape for visas could impact competitiveness. “Foreign talent has a critical role to play in U.K. skills development,” says DNEG’s Alex Hope. “My concern is that any changes to visa costs and its scope will impact this.”

A third of the 6,000 artists employed at U.K. VFX houses are from the European Union or European Economic Area (excluding the U.K. and Ireland), and 13% are from the rest of the world. Union VFX is typical. A quarter of its workforce is from the EU; many are specialists who need to be hired quickly to work on new projects. “If they were to become subject to [visa-restrictive] rules, our business would be severely impacted,” says co-founder Adam Gascoyne.

There’s a risk that projects could transfer overseas through the larger facilities; those with offices in Montreal or L.A. are threatening to uproot HQs from Soho. Cinesite’s acquisition of Berlin-based studio Trixter last August could see projects shift from the U.K. to Germany. Smaller houses don’t have the ability to withstand such a challenge, but some are preparing contingency measures to compete. Jellyfish has built a virtualized production hub that currently connects its two London offices but could be extended anywhere with a high-speed connection. “This will enable us to bring in artists to work remotely on projects from anywhere in the world,” says Jellyfish CEO Phil Dobree.

Efforts are also being made to boost the skills and the number of locally-trained graduates to fill the post-Brexit gap. With typical British resolve, most executives remain upbeat about long-term prospects, given the economic value which post-production, and VFX in particular, generate for the U.K. government. “The quality of work the U.K. produces is a function of the multi-cultural influence of our talent,” says Mark Benson, CEO of MPC. “Any change in our ability to engage with talent post Brexit could represent an issue. As an industry, it is important we make that crystal clear.”

Warner Bros., for one, has no qualms. It has a major new 25,000-square-foot post-production facility opening in Soho by 2021 which will be “on a scale to rival those in Hollywood,” according to the studio.

TOP AND BOTTOM: Mission: Impossible – Fallout (Image copyright © 2018 Paramount Pictures)






TOP: Avengers: Infinity War (Image copyright © 2018 Walt Disney Studios and Marvel Studios) BOTTOM: Christopher Robin (Image copyright © 2018 Walt Disney Enterprises, Inc.)

Taking the Lead in Hybrid Filmmaking


Motion capture is increasingly used with CG animation, especially to create bipedal characters, but the bigger advance is closer integration with real-time render technology, to better visualize and drive performance capture. U.K. facilities and home-grown, world-renowned VFX tool developers like Foundry are leveraging game-engine technology to visualize the action in real-time. In doing so they are opening up to ‘hybrid filmmaking,’ where directing entirely or largely digital scenes can be approached using traditional techniques.

“Motion capture can be input almost instantly to the VFX pipeline,” says Cinesite’s Butler. “There’s no down time waiting on data to be cleaned up, so a director can be more interactive in finding the best angles.” MPC’s immersive filmmaking platform is called Genesis, inside of which environments can be scouted, designed and dressed, and characters can be directed. “This facilitates a much tighter feedback loop on set, informing all aspects of the final shot,” says Greenlow.

Actor/director Andy Serkis has taken hybrid filmmaking to new levels for Mowgli. Facial, vocal and physical performances were captured as CG assets at Warner Bros.’ Leavesden Studio, then reworked, animated and rendered at Framestore. The majority (851) of the 1,164 total shots were created in London. Framestore and Imaginarium, Serkis’s west London performance-capture studio, developed a new muscle system to allow animators to more directly replicate facial shapes and expressions. “The balance between the essence of an actor’s performance and the animator’s translation of that to create the final CG character is finely crafted here,” explains Matthew Brown, Imaginarium’s CEO. Adds Framestore’s Matt Fox, “There is something incredibly recognizable in the character’s facial performance that is down to that mixture of translating performance capture into a muscle system in order to build the anatomical features of an animal’s face, which is where the technology has advanced.”
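The round-trip Butler and Greenlow describe — solved capture data posing a character that is rendered immediately for review — is, at its core, a fixed-budget update loop. A toy sketch with the capture stream, rig and renderer abstracted as callables (illustrative only, not any facility’s code):

```python
import time

def live_capture_viewport(read_mocap_frame, pose_character, draw, fps=24):
    """Toy version of a live performance-capture view: pull the
    latest solved skeleton each tick, pose the CG character, and
    redraw -- with no offline cleanup pass in between."""
    frame_budget = 1.0 / fps
    while True:
        start = time.monotonic()
        skeleton = read_mocap_frame()   # latest solved joint transforms
        if skeleton is None:            # capture stream ended
            break
        pose_character(skeleton)        # drive the CG rig
        draw()                          # refresh the director's monitor
        # Sleep off whatever remains of this frame's time budget
        time.sleep(max(0.0, frame_budget - (time.monotonic() - start)))
```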

U.K. VFX A-Z

A brief guide to some of the most active effects studios in the U.K.

Automatik. Five-year-old boutique based in London and Berlin providing a full palette of digital image creation. It provided extensive previs for The Imitation Game, with Look Away, starring Jason Isaacs, in production. www.automatik-vfx.com

Lola. Respected boutique delivering everything from conceptual art and previs to final delivery. Completed 400 4K shots for BBC/Netflix show Troy: Fall of a City and regular set extensions for Ripper Street. Feature work includes John Carter and Life. www.lola-post.com

Atomic Arts. Produced virals, trailers, clean-up, roto and environments work across its London and Mumbai offices for Red Sparrow, It, Alien: Covenant and others, with full-blown VFX for The Martian (via MPC) and 400 shots on Netflix series Altered Carbon. www.atomicarts.net

Milk. Launched five years ago by former Mill Film artists to concentrate on high-end TV, the facility grew rapidly on successive series of Doctor Who, and landed the 2016 Oscar for Ex Machina (with DNEG). Straddling film and TV across Cardiff and London hubs with 250 full-time staff, its current projects include Four Kids and It. www.milk-vfx.com

However, it is the supportive fiscal environment, in place since 2007 and available at 25% of qualifying film production spending in tax credits, which underpins the dramatic surge. “Other tax rebates may pop up, but what is incredibly important is the stability of the U.K. tax regime,” says Fox. “Financiers and filmmakers don’t want to invest and then not receive the support they were promised.” An amendment in 2014 encouraged producers with a VFX budget of 10%-25% of the total project budget to come to the U.K. for their visual effects.


BlueBolt. Smaller house that mostly provides winning high-end TV work, but is regularly called on for additional shots in larger movies, such as Mission: Impossible – Fallout and Mamma Mia! Here We Go Again. Compositing, CG and digital matte painting are core skills. www.blue-bolt.com

Cinesite. One of the big four U.K.-headquartered businesses with a legacy of Bond and Potter franchises, this former Kodak facility has been owned and run by U.K. management since 2012. Acquisition of Canada’s Image Engine Design and Nitrogen Studios, plus Germany’s Trixter, fueled overseas expansion and work on studio titles like Ant-Man and the Wasp. Its Montreal branch majors in feature animation (The Addams Family). www.cinesite.com

The Mill. After trailblazing the U.K.’s first VFX Oscar for Gladiator, The Mill famously shuttered its film outfit to concentrate on the more profitable commercials sector – only to re-open it in Australia. Part of the Technicolor stable since 2015, the house has been consistently recognized by peers and clients as a premier visual effects provider – mainly for agencies – since launch in 1990, and also operates in New York City, L.A. and Chicago. www.themill.com

Molinare. Grand old dame of the Soho post scene, the VFX skill set of this 40-year-old full-service facility ranges from clean-up and compositing to 2D and 3D motion graphics, and effects on titles including Dad’s Army, Netflix’s The Crown, and the DI/grading for Isle of Dogs. www.molinare.co.uk

DNEG. Already one of the world’s largest VFX companies before its 2014 merger with Prime Focus World, this powerhouse operates multiple studios in North America and India, employs 5,000, and is likely to have a hand in many blockbuster shows. Favored by Christopher Nolan, DNEG has Alita: Battle Angel and Joe Cornish’s The Kid Who Would Be King. Its TV wing recently won series work on Doctor Who. www.dneg.com

MPC. Building on its long-standing commercials VFX and grading specialties, the Moving Picture Company grew film work on the back of successive Harry Potter films, landing an Oscar for Life of Pi and another for The Jungle Book, and continues with work on Tim Burton’s Dumbo, The Lion King, Artemis Fowl and Shazam!, to name a few. MPC is Technicolor-owned with studios in Bangalore, Montreal, Vancouver and L.A. www.moving-picture.com

Framestore. Respected worldwide for its innovation as much as for its final-shot polish, past credits include The Golden Compass and Gravity. In-production shows include Captain Marvel and The Crimes of Grindelwald. With multiple offices worldwide, including a VR studio in New York City (and majority-owned by Chinese investors), it has a year-old TV division, with work for BBC series His Dark Materials in the works. www.framestore.com

One of Us. Sole vendor of 500 shots on Season 1 of Netflix’s The Crown, the house is proficient at handling sequences for features, including Widows, 22 July and Star Wars: The Last Jedi. www.weacceptyou.com

Jellyfish Pictures. Principally an episodic TV animation house, this stalwart facility has flourished by investing in film and TV. It played a key postvis role on Rogue One: A Star Wars Story, eventually picking up half the shots. www.jellyfishpictures.co.uk

LipSync. Full-service facility offering previs, concept, matte painting and 3D, its credits include The Nice Guys, Mr Turner and BBC period drama Wolf Hall. www.lipsyncpost.co.uk

Outpost. Home to 80 artists located on the south coast, handy for Bournemouth’s VFX graduates, Outpost offers wide-ranging TV and film credits, including Jurassic World: Fallen Kingdom and Jack Ryan for Amazon. www.outpostvfx.co.uk

Union VFX. Celebrating a decade in business, this mid-sized vendor’s core specialities include CG environment work and invisible effects. Favored by Danny Boyle (127 Hours, Trance, T2 Trainspotting), it also contributed to Three Billboards Outside Ebbing, Missouri and Annihilation. www.unionvfx.com






“Films not shot in the U.K. are able to pass the qualification criteria for U.K. tax credit and therefore gain access to the wealth of talent and technology resources here,” says Alex Hope, Managing Director and co-founder of DNEG. Pacific Rim Uprising and Sony’s Marvel adventure Venom are two recent examples of films shot outside the U.K., with editorial and sound post in L.A., but VFX supervised and completed in the U.K. “It’s difficult to convince artists to stay in a location if they don’t perceive there will be long-term work there for them to do,” adds Fox.

TOP: Christopher Robin (Image copyright © 2018 Walt Disney Enterprises, Inc.) BOTTOM LEFT: Wonder Woman (Image copyright © 2017 Warner Bros. Entertainment) BOTTOM RIGHT: Blade Runner 2049 (Image copyright © 2017 Alcon Entertainment, LLC., Warner Bros. Entertainment Inc. and Columbia Pictures Industries Inc.)


EXCHANGE RATE WINNERS

Demand has heightened even further thanks to Brexit. A 25% drop in the pound against the U.S. dollar since the June 2016 EU Referendum has made production much cheaper for U.S. majors. Meanwhile, the unprecedented volume of high-end drama being commissioned by Amazon, Netflix, HBO and Starz – in turn prompting U.K. networks like Sky to up their game – has put demand for homegrown VFX talent at an all-time high.

The global success of Game of Thrones, for example, has put Northern Ireland’s studio, location and crewing facilities on the map.





HBO has so much confidence it wasted no time booking the spin-off into Belfast’s studios. U.S. vendor Stargate has also tried to tap into the U.K., although its Ealing studio is serviced from an outpost on Malta.

Another characteristic of U.K. VFX is that supervisors can call on a raft of specialities from an array of smaller facilities to supplement the shots handled by the mega-shops. Jellyfish Pictures, for example, has been engaged by ILM London for the last three Star Wars outings, including 150 shots on Episode VIII: The Last Jedi. That showcase helped it become principal vendor for Focus Features’ sci-fi Captive State, shot in Chicago by director Rupert Wyatt.

TOP LEFT AND RIGHT: Guardians of the Galaxy Vol. 2 (Image copyright © Walt Disney Studios/Marvel Studios) BOTTOM LEFT: Paddington 2 (Image copyright © 2017 StudioCanal) BOTTOM RIGHT: Deadpool 2 (Image copyright © 2018 Twentieth Century Fox)



“Films not shot in the U.K. are able to pass the qualification criteria for U.K. tax credit, and therefore gain access to the wealth of talent and technology resources here.” —Alex Hope, MD/Co-founder, DNEG





“They wanted a company that was able to work flexibly without the runaway costs of the really big guys,” says Jellyfish CEO and co-founder Phil Dobree.

Boutique Atomic Arts was tasked with set extensions and crowd replication for Disney’s Dumbo. Lola was one of a number of facilities working on space thriller Life, for which DNEG held the lion’s share. “We’re often brought in near deadline to work on a particular scene as a safe pair of hands,” explains Creative Director Rob Harvey.

Working on scenes for Jason Bourne and Nocturnal Animals (where it performed digital de-aging on Amy Adams and Jake Gyllenhaal) helped Outpost gain attention, believes founder Duncan McWilliam. It recently landed shots on Jurassic World: Fallen Kingdom.


ANIMAL MAGIC

TOP LEFT AND RIGHT: Ant-Man and the Wasp (Image copyright © 2018 Marvel) MIDDLE AND BOTTOM LEFT: Pacific Rim Uprising (Image copyright © 2018 Legendary and Universal Studios) BOTTOM RIGHT: The Meg (Image copyright © 2018 Warner Bros. Entertainment, Inc)


Exploring exotic or alien worlds, or talking with animals, has always been coveted in storytelling, and the technology has now caught up, allowing those tropes to be realized more convincingly. “A maturation of CG software – a lot of which is the product of in-house proprietary development – means that films comprised mostly of CG creatures are more scalable to schedules and budgets,” says Greenlow.





MPC recently launched a new character design and build department in London. The ‘Character Lab’ features modelling, texture, rigging and groom specialists working on Dumbo, The Voyage of Doctor Dolittle and The Lion King. The menagerie of 54 species and 224 unique animals that MPC put on screen for Jon Favreau’s VFX-Oscar winner The Jungle Book is being advanced further for Disney’s follow-up. “All of those complex photoreal CG characters require an exceptional eye for animation, and we’ve benefitted from a wealth of talent across the U.K. and Europe’s flourishing TV and feature animation industry,” says Greenlow.

Framestore completed the bulk of the 1,110 shots for Paddington 2 and was responsible for animating another bear for Marc Forster’s Christopher Robin. Along with Pooh, the facility provided initial concept design for Tigger, Piglet and Eeyore, and animated these as full-CG characters interacting with real-world environments and actors. More hero creature work, plus environment extensions and magic wand effects, comprise the 330-shot order made at Framestore London for Fantastic Beasts: The Crimes of Grindelwald. Digi-doubles and environment work of a London park are part of its 599 shots for Mary Poppins Returns.

“Filmmakers are not just using VFX to blow an audience away, they are using it to communicate sophisticated emotions and complex ideas,” says Butler. He describes a new generation of filmmakers who have grown up watching movies with a “detailed understanding” of the craft for shaping stories using visual effects in ways that were not present before. “Studios aren’t afraid to back them with budgets either,” he adds.


CULTURAL DIVERSITY

TOP: Harry Potter and the Deathly Hallows – Part 2 (Image copyright © 2011 Warner Bros. Pictures) MIDDLE: Ghost in the Shell (Image copyright © 2017 Paramount Pictures and Storyteller Distribution Co., LLC) BOTTOM: Pirates of the Caribbean: Dead Men Tell No Tales (Image copyright © 2017 Disney Enterprises, Inc.)


The sheer amount and variety of work coming into the country means there’s no difficulty finding crew or scaling up with world-class craft skills. “We can cherry pick the best talent from around the world to do specialist shots rather than overloading one facility,” says Christian Manz, VFX Supervisor on Warner Bros.’ Fantastic Beasts series. “By plugging into different cultures, you get different ways of working. You get a European sensibility.”

There’s also the peculiar British character to consider. “Brits have historically been good at science, invention and art, but we’re naturally very reasonable when it comes to settling and solving problems,” says McWilliam. “When budgets go wrong and shots aren’t going according to plan, we offer a polite calmness rather than hot-headedness, which does not go unappreciated.”




FILM

ATOMIC FICTION ANSWERS THE CALL FOR INNOVATION IN WELCOME TO MARWEN By KEVIN H. MARTIN

TOP: Kevin Baillie, Creative Director & Sr. Visual Effects Supervisor, Method Studios (Co-founder of Atomic Fiction)


The 2010 documentary Marwencol depicts how, after suffering an unprovoked and brutal attack, Mark Hogancamp embarked upon a most unusual kind of self-therapy. He proceeded to build a scaled-down version of a WWII-era European town in his backyard, then populated it with dolls representing himself as well as his friends – and even his assailants, dressed as Nazi soldiers.

Director Robert Zemeckis was fascinated by Hogancamp’s story, and after a significant gestation period, this winter’s Welcome to Marwen represents his take on the tale. The filmmaker is well-known for his penchant for visual innovation, going back to Who Framed Roger Rabbit, Death Becomes Her and Forrest Gump, and this outing is no exception, building on and going well beyond his past experiments with motion capture, which began in earnest with The Polar Express, in order to bring life to the denizens of Marwen as visualized by their creator.

To realize the challenging visual effects, Zemeckis turned to Atomic Fiction, which had worked on the director’s live-action projects since Flight. Co-founder/VFX Supervisor Kevin Baillie’s collaborations with Zemeckis extend back to 2009’s A Christmas Carol. Atomic’s work on Welcome to Marwen was augmented by Framestore and Method Studios. (Last September, Deluxe Entertainment acquired Atomic Fiction, which is now part of Method Studios. Credit for the company’s work on Welcome to Marwen remained under the name Atomic Fiction. Baillie is now Method’s Creative Director & Sr. Visual Effects Supervisor.)

When first contacted about this project by Zemeckis in 2013, Baillie was in his backyard barbecuing. “Bob wanted to know if there was any way I knew to do the visuals indicated in this script,” Baillie recalls. “So there were another four years until filming actually began, which gave us time to really think things out and consider options before settling on our methodologies.”

During that time, the trial-and-error phase resulted in a few dead-ends. “Owing to some critical feedback about character performances in earlier mocap movies, Bob had some trepidations about using motion capture. When doing all-CG characters, the ‘uncanny valley’ is a place that is difficult to stay out of for the duration of a feature film, and half of this one would take place inside this imaginary world of Marwen.”

Atomic’s work began with a production test shoot featuring Steve Carell as Hogancamp’s alter ego Hogie, and Zemeckis’ wife Leslie as Suzette, recorded in full wardrobe on a live-action set. “We took that test and digitally reshaped their bodies to create a heroic stylized form like what you’d associate with doll figures,” notes Baillie, “using techniques similar to what was done on the first Captain America film. Then we’d try tracking in 3D joints to create an appropriate doll-limb look. We produced a 20-second test with this digital-augmentation routine – and it looked horrible! Humans have stretchy stuff all throughout their bodies, and when you put rigid ball joints in between fleshy human tissue, it just looks all wrong.”

“We started animating... but Bob stopped us, saying he didn’t want to go that route. … So that stymied us, up until somebody had a thought: ‘Instead of augmenting Steve Carell with digital doll parts, why don’t we augment digital dolls with Steve Carell parts?’ This approach gave us as much of Steve’s facial performance as we’d ever need, but still let the character exist in this other form with its doll-like movement constraints.”
—Kevin Baillie, Creative Director & Sr. Visual Effects Supervisor, Method Studios (Co-founder of Atomic Fiction)

Fearing they might be headed up an uncanny creek, Zemeckis and Atomic re-examined the test, this time 3D tracking character movement and treating it like motion-capture data. “We started animating, but when we got to showing him our animated faces, Bob stopped us, saying he didn’t want to go that route,” admits Baillie. “I told him we just needed more time to work things out, but he was very concerned about how long it would take with this approach to create imagery for 40 minutes of movie. So that stymied us, up until somebody had a thought: ‘Instead of augmenting Steve Carell with digital doll parts, why don’t we augment digital dolls with Steve Carell parts?’”

Utilizing the lit and well-exposed footage of Carell’s eyes and face, Atomic applied this ‘ski mask’ onto their digital doll, finessing the look further with a proprietary comp process that made the character’s countenance take on a plastic-y sheen. “This approach gave us as much of Steve’s facial performance as we’d ever need, but still let the character exist in this other form with its doll-like movement constraints.”

The next step was the physical refinement of the doll character. “While we had a 1:1 3D scan of the actor, the stylized digital doll head differs somewhat, with a smaller nose and different jawline,” observes Baillie. “To address this, after projecting the footage onto the 1:1 head, we rendered that in UV space, then re-rendered that back through the doll version of the head so everything aligned perfectly with the doll geometry. To do all this, we had to set up one of the most extensive pipelines Atomic has developed to date.”

This involved mocapping not just the actors on stage, but also the two Alexa 65s shooting them. “We created real-time environments that allowed us to see exactly what the camera was seeing so we could compose our shot. And it was essential that we be able to light the actors just as they would be lit when we got to our finals. To do this, we wound up building a CG model of the characters and town in Unreal.”
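The UV-space round-trip Baillie outlines above — project the plate onto the 1:1 scan, bake into the shared UV layout, re-render through the doll head — can be sketched at a high level. Everything below is an illustrative stand-in for real projection, bake and render operations, not Atomic’s pipeline:

```python
def transfer_face_footage(plate_frame, scan_head, doll_head, camera,
                          project_from_camera, bake_to_uv, render_with_texture):
    """High-level shape of the face-transfer round-trip:
    1. project the photographed face onto the 1:1 scanned head,
    2. bake that projection into the UV layout both heads share,
    3. re-render the baked texture through the stylized doll head,
    so the footage lands exactly on the doll geometry. The three
    callables stand in for real projection/bake/render operations."""
    projected = project_from_camera(plate_frame, scan_head, camera)
    face_texture = bake_to_uv(projected, scan_head)
    return render_with_texture(doll_head, face_texture, camera)
```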

TOP: In Welcome to Marwen, Steve Carrell plays Mark Hogancamp, who deals with the trauma of a brutal attack by constructing a WWII-era village in his backyard, populating it with dolls fashioned to resemble his acquaintances and even himself. Mark documents his Marwen creations photographically. (Photo: Ed Araquel. Copyright © 2018 Universal Pictures and DreamWorks) BOTTOM: Doll technician D. Martin Myatt, props assistant Shannon Stensrud and director Robert Zemeckis confer over the practical doll figures. (Photo: Ed Araquel. Copyright © 2018 Universal Pictures and DreamWorks)






TOP LEFT: Director of Photography C. Kim Miles and Zemeckis with a pair of Arriflex Alexa 65 cameras. (Photo: Ed Araquel. Copyright © 2018 Universal Pictures and DreamWorks) TOP RIGHT: Carell and Zemeckis on the capture stage. Successfully marrying live-action with CG required a new approach, augmenting digital dolls with live-action human imagery. (Photo: Ed Araquel. Copyright © 2018 Universal Pictures and DreamWorks) BOTTOM LEFT AND RIGHT: Depth of field appropriate to the scale of the Marwen villagers had to be determined by a compositor prior to rendering. (Photo: Ed Araquel. Images copyright © 2018 Universal Pictures and DreamWorks)

Then, in the weeks leading up to the motion-capture shoot, the VFX team developed tools that permitted director of photography C. Kim Miles [Arrow, The Flash, Lost in Space] to pre-light every setup taking place in Marwen. “Via iPad, Kim was able to set positions and intensities for various kinds of illumination,” says Baillie. “He worked with our real-time team throughout, and then he was able to deal with the mocap stage in exactly the same fashion, so when the actors showed up, everything was ready to go.”

A pair of monitors was utilized during the motion-capture sessions, with one showing the Alexa view of actors in suits in front of bluescreen. “The other screen displayed the Unreal feed, which was a nicely-lit view of our final,” notes Baillie. “After comparing them, Bob and I could walk away from that day knowing that not only the performances were in the can, but that the camera moves worked as designed and that our lighting was a perfect match for what the final shot would look like at the end of post. So production had material available to go to editorial the very next day after we shot.”

A necessary next step involved the director’s review of character animation. “For Bob to buy off on the animation, he needed to be able to see lit faces,” Baillie acknowledges. “But viewing those on a traditional gray-shaded playblast looks messed up. So our lighting team at Atomic Fiction – and Framestore also contributed to this work – ended up having to do a first pass on lighting for every shot very early on. Animators could then kick out a render instead of a playblast, and compositors would have already done a first pass on the facial aspect. With all those elements in place, only then could we ask Bob to make an evaluation. It was in many ways the antithesis of a traditional production process; instead of a progression of events, everything had to happen at the same time, which can get kind of scary. We began iterating from 1K renders initially, and now, with only a few months left in post, only a handful of full-res renders have been done. Thank God we have a cloud-based rendering system, because effectively there’s going to be 20 million core-hours’ worth of imagery to render during our last three weeks to get it all done.” (Twenty million core-hours over three weeks works out to on the order of 40,000 cores running around the clock.)

The cloud approach comes courtesy of Conductor Technologies, a company which Baillie also founded. “A company the size of Atomic could never build a render farm large enough to handle all this,” he observes. “I’m not really sure if anybody could, to be honest. But Conductor let us allocate as needed, using less early on while working on assets, while at the end we avoided a bottleneck due to the nearly unlimited capacity.”

To properly convey scale in the Marwen mini-village, depth of field had to be reduced proportionally. “We’d have a compositor set the depth of field before feeding that data back into the 3D render,” states Baillie. “When the animator renders the image with first-pass lighting along with the focus adjustment, Bob’s eye would be led to the right place.
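The proportional depth-of-field reduction Baillie mentions follows from ordinary lens arithmetic: keeping the same framing on a doll-scale subject means focusing much closer, which collapses the in-focus zone. A rough sketch using the standard thin-lens approximations (numbers purely illustrative):

```python
def total_depth_of_field_mm(focal_mm, f_stop, focus_dist_mm, coc_mm=0.03):
    """Total in-focus zone via the standard thin-lens formulas.
    coc_mm is the acceptable circle of confusion on the sensor."""
    hyperfocal = focal_mm ** 2 / (f_stop * coc_mm) + focal_mm
    near = (focus_dist_mm * (hyperfocal - focal_mm)
            / (hyperfocal + focus_dist_mm - 2 * focal_mm))
    if hyperfocal <= focus_dist_mm:
        return float("inf")  # far limit reaches infinity
    far = focus_dist_mm * (hyperfocal - focal_mm) / (hyperfocal - focus_dist_mm)
    return far - near

# Same 50mm lens at f/2.8: a full-scale subject at 3 m has ~600 mm
# in focus, while a doll-scale subject at 0.5 m has only ~15 mm --
# the shallow-focus cue that sells miniature scale.
print(total_depth_of_field_mm(50, 2.8, 3000))
print(total_depth_of_field_mm(50, 2.8, 500))
```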

TOP: Depth of field issues were negotiated with custom toolset modifications, including a digital tilt-shift lens and a custom diopter capable of near-infinite variations. (Image copyright © 2018 Universal Pictures and DreamWorks) BOTTOM: Hogancamp’s heroes head into battle, armed for bear (on a tiny scale). (Photo: Ed Araquel. Copyright © 2018 Universal Pictures and DreamWorks)

“Owing to some critical feedback about character performances in earlier mocap movies, Bob [director Robert Zemeckis] had some trepidations about using motion capture. When doing all-CG characters, the ‘uncanny valley’ is a place that is difficult to stay out of for the duration of a feature film, and half of this one would take place inside this imaginary world of Marwen.” —Kevin Baillie, Creative Director & Sr. Visual Effects Supervisor, Method Studios (Co-founder of Atomic Fiction)




TOP: While Method created virtual village environments to surround the Marwen characters, here the dolls are seen in the real world, keeping pace with Hogancamp in the small car. (Image copyright © 2018 Universal Pictures and DreamWorks) MIDDLE AND BOTTOM: Before and after: Actors Steve Carell and Janelle Monáe during performance capture, and the dramatic transformation of the actors into Marwen denizens Cap’n Hogie and GI Julie. (Photo Middle: Ed Araquel. Copyright © 2018 Universal Pictures and DreamWorks) (Bottom Image copyright © 2018 Universal Pictures and DreamWorks)


To achieve a look that was authentic to Mark Hogancamp’s original photography, Atomic rendered all of their de-focus in physically-accurate 3D with RenderMan. “In reality, when an object is out of focus, you start seeing around behind it, but that’s not something possible when doing depth of field as a post-process in comp. Before rendering our de-focus in 3D, we had to devise workflows to ensure that rack focuses rendered properly, rather than iterating with expensive renders.”

There were occasions when depth of field needed to be altered for dramatic or cinematic effect, which drove more innovation from the Atomic Fiction team. “We might be shooting two actors on the mocap stage, with one slightly offset from the other,” Baillie relates. “Bob would tell us that both of them needed to be in focus. We could address that by closing the aperture of the CG camera, which extended image sharpness over a greater distance, but since it was desirable to maintain a macro look while in Marwen, there were a couple of custom alternatives created in our tool set. One of these permitted animators and lighters to create a digital tilt-shift lens. Tilting the focal plane is an old-school analog technique, and here it afforded us the opportunity to keep both actors sharp while letting the rest of the scene display a natural, appropriately-scaled falloff.”

When the focus of scenes required a more complex geometry, Baillie sought out aid from Pixar. “They helped us with a toolset addition, the results of which you can see in the trailer,” says Baillie, “when you see the girls in Marwen arrayed in a kind of U-shape around the village bar. No tilt-shift lens would be able to accommodate that configuration, but we wound up with a kind of curtain that goes through all of the girls. And wherever we positioned this curtain, the focus would remain sharp, even though we were shooting with a wide aperture. It’s like building a custom-crafted per-shot diopter. Our DP was just over the moon about that one!”
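The article doesn’t describe how the tilt-shift or focus ‘curtain’ was implemented, but the underlying idea can be sketched: drive sharpness by distance to an arbitrary focal surface rather than by depth from camera alone. A toy illustration, with all names invented:

```python
# Hedged sketch of the idea behind a digital tilt-shift / per-shot "focus
# curtain": sharpness is driven by distance to a surface placed in the
# scene, not by camera depth. A real implementation would live inside the
# renderer; these names are invented for illustration.
import numpy as np

def focus_weight(points, plane_point, plane_normal, falloff=0.5):
    """1.0 on the focal plane, falling off with distance from it.
    For a tilt-shift lens the plane is tilted; for the 'curtain' that
    Pixar helped build, a curved surface would replace this plane."""
    n = plane_normal / np.linalg.norm(plane_normal)
    dist = np.abs((points - plane_point) @ n)
    return np.exp(-(dist / falloff) ** 2)

# Two actors at different depths from camera (z = 3.0 vs 4.2), with the
# focal plane tilted to pass through both -> both come out sharp.
actors = np.array([[0.0, 1.5, 3.0], [1.0, 1.5, 4.2]])
w = focus_weight(actors, plane_point=np.array([0.0, 1.5, 3.0]),
                 plane_normal=np.array([1.2, 0.0, -1.0]))
print(w)  # both weights ~1.0: in focus despite different camera depths
```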

In divvying up the remaining VFX shotload, Baillie assigned “real-world” environment extensions and comp-centric work to Method Studios and earmarked select character scenes for Framestore. “Right off the bat, we set things up so they would have to perform character development on just a few characters,” he explains. “And to ease continuity concerns, those scenes are also all temporally congruent, so there wasn’t going to be intercutting between their work and ours.”

“[A]fter projecting the footage [of the doll character] onto the 1:1 head, we rendered that in UV space, then re-rendered that so everything aligned perfectly with the doll geometry. To do all this, we had to set up one of the most extensive pipelines Atomic has developed to date.” —Kevin Baillie, Creative Director & Sr. Visual Effects Supervisor, Method Studios (Co-founder of Atomic Fiction)

With the heavy volume of data acquired by shooting on the Alexa 65, Atomic’s pipeline was rejigged to accommodate a heavier workflow. “While Technicolor handled digital dailies and file storage, we did all of our own frame pulls at the back end of the shoot, through their cloud-based Pulse system. Most of the time when dealing with a 6K source, we’d work at 3K – which is what we finished the show at – but occasionally there’d be a need to request the full 6K for shots with special needs.”

During post-production, Zemeckis screened the movie for Atomic Fiction, soliciting opinions not just about the visuals, but also about how the story was playing for them. “For Bob, it’s always all about trying to make a better movie,” Baillie declares. “He’s truly a terrific collaborator in the best sense of the word, because while the final decision will always remain his, he is never afraid to ask for input. Bob loves the old François Truffaut quote about how a really good movie is a perfect blend of truth and spectacle. Bob always manages to balance concern for human honesty with rich and innovative visual aspects, but even after all these years, I still am not sure how he manages to deliver these aspects so seamlessly.”

TOP: Attending her ex’s art installation, Nicol (Leslie Mann) finds herself moved by Mark’s evocative imagery of the Marwen dolls in their village. (Photo: Ed Araquel. Copyright © 2018 Universal Pictures and DreamWorks) BOTTOM: Steve Carell as Mark Hogancamp constructs a WWII-era village in his backyard, populating it with dolls fashioned to resemble his acquaintances and even himself. (Photo: Ed Araquel. Copyright © 2018 Universal Pictures and DreamWorks)


ANIMATION

IMAGEWORKS ARTISTS ‘BREAK THE MOLD’ TO CREATE AN ALTERNATE SPIDER-VERSE By BARBARA ROBERTSON

All images Sony Pictures Animation copyright © 2017 CTMG, Inc. TOP: Miles Morales (Shameik Moore) falls through an alternate-universe New York City in Sony Pictures Animation’s Spider-Man: Into the Spider-Verse.

The surprisingly quick success of Marvel’s first Spider-Man comic in 1962 has led to more than 50 years of comics, cartoons, books, video games, TV series and films. IMDb alone lists more than 1,000 Spider-Man titles. None of these are like the latest, Spider-Man: Into the Spider-Verse, an animated feature from Columbia Pictures, Sony Pictures Animation and Marvel Entertainment – and that was the intention.

Developed by writers/directors Phil Lord and Christopher Miller (The LEGO Movie, Cloudy with a Chance of Meatballs) and directed by Bob Persichetti, Peter Ramsey and Rodney Rothman, the film isn’t set in the same world as any of Sony’s other Spider-Man films. Instead, it takes place in a multiverse. The Spidey star is a Puerto Rican, African-American high school student from Brooklyn named Miles Morales, but he’s only one of many Spider-People, each with his or her own moves.

Animators and artists at Sony Pictures Imageworks, many of whom had worked on Lord and Miller’s Cloudy with a Chance of Meatballs, gave that alternate world a unique look and feel, creating a stylish, action-packed animated film that explodes with color. The characters, created by blending CG and hand-drawn techniques into a comic-book aesthetic, look fresh and new. Sometimes we see semi-realistic environments, sometimes abstract patterns.

“Chris and Phil’s mandate from the beginning was to make this film look like something we’ve never seen before,” says Visual Effects Supervisor Danny Dimian. “Something that would make people wonder how it was made.”

Dimian began working on the film in pre-production, nearly three years before the scheduled release date. Small teams started with tests to determine how they could realize the filmmakers’ vision. “We took the visual language of comic books and made an immersive movie variant, and tried to do that efficiently in a way that worked for the entire pipeline – animation, lighting, effects,” says Josh Beveridge, Animation Director. “A lot of us were high-fiving when we got a system, a plan that worked.”

To create that plan, the artists had to veer into unknown territory. The film sends Spider-Man into an alternate universe. The artists making the film landed in alternate visual effects and animation universes as well. “When we brought artists onto the show, we asked them to reconsider how they do everything,” Dimian says. “We told them, ‘If it ain’t broke, break it.’ Early on we wanted to see what mistakes we could find that looked interesting.”

ACTING LINES

With an eye toward comic book illustration, the team considered color, line and form rather than animation styles to develop their plan. They wanted to simplify the look by removing the complexity in modern animation, yet keep the characters engaging and emotional. They decided to focus on simple models and hand-drawn details. Line work would be the answer.

“We developed a line-drawing system in Houdini and tied it into Maya,” Dimian says. “Artists can draw on the [3D] object, changing the weight and taper of a line. But that is just the starting point. We created two types of lines, lines for acting and emotion, and lines for form. The main point is that these lines are independent from the form, whether hand-drawn, rigged or drawn through machine learning.”

The acting lines used for facial performances start as drawings with the type of tapering typical in illustrations. The line-drawing system converts these hand-drawn lines into geometry curves that can be animated like any other rigged geometry. But they are not tied to muscles or shapes. “An illustrator can draw for emotion and completely contradict what the underlying geometry does, so we completely de-coupled the acting lines from the form, too,” Dimian says. “These acting lines can follow the form when we want them to, or break free when we want, and make expressions to just the camera view. So the lines work well in stereo, too.”

The result gave animators new methods for creating facial performances for the 3D models. Rather than sculpting wrinkles, animators use these 2D acting lines to create expressions. “We have little lines, little splines, that we can move and shape as we like, all built into the rig,” Beveridge says. “And we can add more. The characters don’t look ‘on model’ until we have these expression lines.”
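As a rough illustration of the decoupling Dimian describes (not Imageworks’ actual rig), an acting line can be modeled as a stroke that optionally tracks the deforming face while carrying its own animation offsets:

```python
# Illustrative sketch (not Imageworks' rig): an "acting line" is a spline
# authored on the face but stored independently, so it can follow the
# surface, be keyframed on its own, or detach entirely for a camera-facing
# expression -- the decoupling described in the article.
from dataclasses import dataclass, field

@dataclass
class ActingLine:
    points: list                 # control points in the face's local space
    taper: list                  # per-point stroke width, as drawn
    follow_surface: bool = True  # track the deforming mesh...
    offsets: list = field(default_factory=list)  # ...plus animator keys

    def evaluate(self, surface_points):
        """Resolve the curve for the current frame."""
        base = surface_points if self.follow_surface else self.points
        if not self.offsets:
            return base
        return [(x + dx, y + dy, z + dz)
                for (x, y, z), (dx, dy, dz) in zip(base, self.offsets)]

brow = ActingLine(points=[(0, 1, 0), (1, 1.2, 0)], taper=[0.1, 0.02])
brow.offsets = [(0, 0.3, 0), (0, 0.1, 0)]   # exaggerate beyond the mesh
print(brow.evaluate(surface_points=[(0, 1.05, 0), (1, 1.18, 0)]))
```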

TOP AND MIDDLE: Miles Morales BOTTOM: Hailee Steinfeld voices the role of Spider-Gwen.

FORM LINES

In addition to the acting lines, the team developed a second type of 2D line to separate overlapping forms. “These lines round out a form and provide detail, but they are not tied to the acting,” Dimian says. “We didn’t want the form lines to look like traditional cartoon inking or to have contour lines that appear everywhere. We used a formula based on illustration principles; we don’t use edge detection. We wanted to hand design the lines to separate forms.”


TOP LEFT: Jake Johnson voices the role of Peter Parker. TOP RIGHT: Peter Parker, Gwen Stacy and Miles Morales. BOTTOM LEFT: Miles Morales BOTTOM RIGHT: Peter Parker serves as Miles Morales’ reluctant mentor.

A form line might be drawn around a nose to separate it from the cheek, for example, but only when necessary. “We don’t want that line when the nose is against the background,” Dimian says. “Only when we want the separation from the cheek. The reason we have these form lines is to decide, like an illustrator, when we need to separate overlapping forms, to separate fingers, or to outline hands.”

Because the appropriate use of the form lines is predictable, the effects team devised a system that used machine learning to automate the line drawing, again using a Houdini solver tied into Maya. “We start in the same way as we start with the acting lines,” Dimian says. “The difference is that we do turntables to look at the lines from different views. Then we keyframe what the lines should do and feed that into machine learning. The first passes come out automatically now. We just do small hand tweaks over time.”

For consistency, the animation team built libraries of poses into the rig. “We had new powers,” Beveridge says. “This is something a lot of animators have wanted to do for a long time. The trick was finding what was in good taste, finding rules that made sense.”
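The article doesn’t specify the model, but the workflow described, keyframed turntables becoming labeled examples, maps onto ordinary supervised learning. A deliberately tiny sketch with invented features and a stand-in labeling rule:

```python
# Hedged sketch of the training setup Dimian describes: artists keyframe
# whether a form line should appear over a turntable, and each (view,
# geometry) sample becomes a labeled example for a per-line classifier.
# Features, labels and model here are guesses, not the production solver.
import math, random

def make_samples(n=64):
    """Each sample: (how head-on the view is, depth gap behind the line)
    paired with the artist's yes/no keyframe."""
    data = []
    for _ in range(n):
        facing = random.uniform(0.0, 1.0)     # 0 = silhouette, 1 = head-on
        depth_gap = random.uniform(0.0, 2.0)  # separation from background
        # Stand-in for artist keys: lines wanted near silhouettes when a
        # distinct form sits behind them.
        label = 1 if (facing < 0.35 and depth_gap > 0.5) else 0
        data.append(((facing, depth_gap), label))
    return data

def predict(w, x):
    z = w[0] + w[1] * x[0] + w[2] * x[1]
    return 1.0 / (1.0 + math.exp(-z))

# Tiny logistic regression trained by gradient descent -- enough to show
# how "first passes come out automatically" once the labels exist.
w = [0.0, 0.0, 0.0]
for _ in range(2000):
    for x, y in make_samples():
        g = predict(w, x) - y
        w[0] -= 0.1 * g
        w[1] -= 0.1 * g * x[0]
        w[2] -= 0.1 * g * x[1]

print(predict(w, (0.1, 1.5)))  # should score high: silhouette, clear gap
print(predict(w, (0.9, 0.1)))  # should score low: head-on, no gap
```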


THE MOVES

All told, more than 150 animators worked on Spider-Verse, the biggest team Imageworks has ever had on a film, according to Beveridge. He organized them with sequence leads, nine in all. Although the team didn’t have official character leads, animators gravitated toward those with a knack for performing certain characters.

“We have characters that are very different, and each has its own rules,” Beveridge says. “Every Spider-Person has a different background and moves differently. We find the characters through exploration. Even before we had the characters move, we explored the characters’ aesthetic through camera tests.”

The first characters explored were Peter Parker and, in particular, Miles Morales, who needed to be obviously unique. When they were happy with his aesthetic, he became a benchmark, and they began doing motion tests that allowed other departments to test their techniques.

To give the Spider-People the snappy style the comic book aesthetic demands, the teams decided to forgo motion blur. Instead, animators drew streak lines to give the appearance of motion, and lines off the characters that connected with the form. They also decided to use stepped animation, which is loosely categorized as animating on 2s. “Animating on 2s doesn’t mean mathematically every other frame,” Beveridge says. “More often than not, it’s 12 frames per second, but it’s a fluid scale. Sometimes we might have a 60-frame shot with the first 16 on 2s, then a 3-frame hold, then 1s. It’s dynamic. Something musical. Every animator finds the flavor for a scene. Our boundaries are to stay away from strobe and mush.”

The lack of motion blur accents that style. “If you stop on any frame in the movie, it’s solid, sharp,” Dimian says. “In the ’70s we had an effect where every frame was held, so we got a trail of previous images with traced-out lines. We have that kind of style. A non-blurred image that’s stepped behind another in the animation curve.”

They pushed other techniques as well, many borrowed from 2D animation and illustration. With a “pose stamp” tool, for example, the animators could work with a duplicate of any character in a pose. “It’s a brick,” Beveridge says of the “pose stamp.” “A dummied-down duplicate of a rig without a rig. We can use it to smear an object across space, and use portions of it to fill in gaps with multiple limbs.”

Adds Dimian, “We’ve always done smearing of geometry – when you pause on a frame, the character might be a blobby smear through the animation, like with a trailing smeared arm and elastic stretching – but we pushed that further. We’ve also broken the characters into pieces. You’ll see extra arms and other shapes that fill in gaps. You’ll see pieces of the model floating, edges that don’t connect, a wrinkle floating off the surface. The cap of a fire hydrant might not be on it. A cushion might float off a chair. But it looks right.”

The result could have looked visually confusing, but the team learned how to maintain a consistent visual language. “It works because it’s not arbitrary,” Dimian says. “There are rules to it all; it plays right. We found that these imperfections had to be accents, secondary actions, or embellishments. They couldn’t be the main structure or form. It ends up being a case-by-case-based thing.”

TOP: Spider-Gwen BOTTOM: Jefferson (Brian Tyree Henry), Miles Morales and Rio (Luna Lauren Velez).
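The timing Beveridge describes, 2s, then a hold, then 1s within a single shot, can be expressed as a simple retime of a per-frame curve. A sketch of the idea, illustrative only:

```python
# Sketch of "animating on 2s" as a retime: sample a smooth motion curve,
# then hold poses in runs whose lengths the animator chooses per shot.
# This illustrates the timing idea only, not Imageworks' tooling.
def stepped(values, segments):
    """Retime a per-frame curve: within each (length, hold) segment the
    pose only updates every `hold` frames, so motion 'snaps' like 2s."""
    out, start = [], 0
    for length, hold in segments:
        seg = values[start:start + length]
        for i in range(len(seg)):
            out.append(seg[(i // hold) * hold])  # hold the stepped pose
        start += length
    return out

smooth = [i * 0.1 for i in range(60)]               # 60-frame curve stand-in
# Beveridge's example: first 16 frames on 2s, a 3-frame hold, then 1s.
shot = stepped(smooth, [(16, 2), (3, 3), (41, 1)])
print(shot[:8])  # pairs of repeated values: the pose updates every 2 frames
```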


CAMERA PANS

TOP: Miles Morales MIDDLE: Peter Parker BOTTOM: A Prowler

Given the unique aesthetic and animation style, the team needed to find new ways to handle fast camera pans without using motion blur, and to create new types of lensing effects consistent with illustration, not realism. “We needed to pan a camera quickly and have continuity,” Dimian says. “For a simple object like a moving train or other predictable geometry, the effects team added geometry – motion lines on top – or smears, building the geometry into the model. The renderer [Arnold] responded to the highlights or color patterns represented by additional geometry. We tried to do as much as possible automatically, then have the artists fine-tune.”

For a lensing effect, the team referenced old comic books, especially those with misprinted pages where the registration didn’t line up right. “We have the same image offset,” Dimian says. “It isn’t a blur. It isn’t out of focus. It’s an image with different colors offset based on z-depth, so it looks out of focus. But it feels natural because we use it consistently.”
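The misprint effect as described, channels offset by z-depth rather than blurred, is straightforward to prototype. A toy version, not the production comp setup:

```python
# Sketch of the misprint-style 'defocus': shift color channels apart by an
# amount driven by each pixel's z-depth rather than blurring anything.
# A production version would run on full render passes in comp.
import numpy as np

def misprint_offset(rgb, depth, focus_depth, strength=4.0):
    """rgb: (H, W, 3) image; depth: (H, W) z-depth pass.
    Red pulls left and blue pushes right by more pixels the farther a
    pixel sits from the focus depth -- registration error, not blur."""
    h, w, _ = rgb.shape
    out = rgb.copy()
    shift = (strength * np.abs(depth - focus_depth)).astype(int)
    for y in range(h):
        for x in range(w):
            s = shift[y, x]
            out[y, x, 0] = rgb[y, max(x - s, 0), 0]      # red channel left
            out[y, x, 2] = rgb[y, min(x + s, w - 1), 2]  # blue channel right
    return out

img = np.random.rand(8, 8, 3)                 # stand-in for a render
z = np.linspace(0.0, 2.0, 64).reshape(8, 8)   # stand-in for a depth pass
print(misprint_offset(img, z, focus_depth=0.5).shape)
```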

MATERIALS

Because color, line and form are more important for this film than realistic materials, the team also needed to find a way to represent objects as if they had been printed, not rendered with a ray tracer. “We looked at old screen-printed comics,” Dimian says. “They used half-toning and line-hatching, so we came up with hatching and half-toning techniques in the render and in compositing, too.”

Half-toning represents colors with dots. Moving from larger to smaller dots creates grads, and that’s what the team mimicked.


“The comic book illustrators didn’t draw anything that was really soft, so we tried not to have anything blurry,” Dimian says. “We tried to avoid grads by turning those into half-toning, and avoided traditional material looks by having them quantized using brush strokes and other ways to break up the form. We would have stepped values to avoid smooth grads, and then broke those up with half-toning. A lot of our shadows were hatches. We’d combine line work, thick to thin, to get an overall value.”

The half-toning and hatching were procedurally-based techniques, some done in shading and some with shaders giving passes to compositors who would half-tone and hatch. “The majority of half-toning and hatching is compositing-based, using render passes,” Dimian says. “We still have bounce light and all that, and we still had to make things hard, soft, shiny and dull, but we abstracted it away from a ray-traced look.”
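The two ingredients Dimian names, stepped values in place of smooth grads and halftone dots carrying the tone, can be sketched in a few lines (illustrative, not the show’s shaders):

```python
# Toy version of the print treatment: quantize a smooth shading gradient
# into hard steps, then express each step as halftone dot coverage.
# Purely illustrative -- the film built these passes in shading and comp.
import numpy as np

def quantize(values, steps=4):
    """Replace a smooth 0-1 gradient with hard steps (no soft grads)."""
    return np.floor(values * steps) / steps

def halftone_radius(value, cell=8):
    """Darker value -> bigger printed dot within one halftone cell."""
    return 0.5 * cell * np.sqrt(1.0 - value)

shade = np.linspace(0.0, 1.0, 9)      # smooth ramp from a render pass
stepped_vals = quantize(shade)
print([round(halftone_radius(v), 2) for v in stepped_vals])
```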

EFFECTS

Similarly, the effects artists realized early on that fully simulation-based effects wouldn’t do. They needed to create effects in a comic book illustration style. “We used lots of animation cycles that are completely 2D, much like those you’d create for a 2D movie or an anime,” Dimian says. “We’d tie them to sprites and mix with simulation.”

The artists would start with 2D cycles, feed them into a system, and then add volume with 3D simulation to create forms that would read well in stereo. “Clouds and fire start flat with 2D animation that we build with little pieces,” Dimian says. “Then we give them form with 3D simulation or split them into layers to make them not flat. When we use 3D simulation, it’s stylized to be simple forms, not realistic. The important thing is that the way the simulation changes form, how it evolves over time, is an artistic choice.”
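A rough sketch of the 2D-cycle-plus-simulation mix as described, with drawn frames cycling on a sprite while simple stylized motion (not a fluid solve) moves it through space; class names are invented:

```python
# Illustrative sketch, not the production system: each smoke puff is a
# sprite stepping through hand-drawn cycle frames while simple stylized
# motion moves it, and layer depth keeps the result from reading as flat.
import random

class CycleSprite:
    def __init__(self, frames, position, layer_depth):
        self.frames = frames          # hand-drawn cycle, e.g. 8 smoke cels
        self.position = list(position)
        self.depth = layer_depth      # layered offsets add stereo volume
        self.age = 0

    def advance(self, dt=1.0):
        self.age += 1
        # Stylized 'simulation': simple rise and drift, not a fluid solve.
        self.position[1] += 0.5 * dt
        self.position[0] += random.uniform(-0.1, 0.1)
        return self.frames[self.age % len(self.frames)]

puffs = [CycleSprite(frames=list("ABCDEFGH"), position=(x, 0.0),
                     layer_depth=d) for x, d in [(0, 0.0), (1, 0.3)]]
for _ in range(4):
    print([p.advance() for p in puffs])   # which cel each puff shows
```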

MAKING IT WORK

The nature of the new aesthetic required more collaboration among people working on different parts of the pipeline. “Everyone is always doing an incomplete portion, and something slightly out of their comfort zone,” Beveridge says. “We try to keep it a linear process, but we had a lot of forward and backward communication. It was very exciting.”

As might be imagined, the new techniques required much trial and error, and many failed experiments fell by the wayside before the team discovered a good path. “Our artists had to be reminded that it’s safe to explore and that we wanted them to take risks,” Dimian says. “We all work hard and fast and it’s a rare opportunity to be purposely looking for things that are broken or to experiment without knowing exactly what you want. We had to make sure our artists felt supported in this. My personal concern was to find something that turns into enough of a language that there’s a consistency so that you’re swept away by the story – you’re just in it.”

Clearly, the team met Lord and Miller’s mandate to make something never seen before. Early reactions suggest they have also accomplished something even more difficult – to create a new, seamless language that supports both the characters and the story.

TOP: Miles Morales MIDDLE: Hailee Steinfeld voices the role of Spider-Gwen. BOTTOM: Miles Morales, Aunt May (voiced by Lily Tomlin), Peter Parker and Spider-Gwen.



TV

VFX RULES THE CYBERPUNK UNIVERSE OF ALTERED CARBON By KEVIN H. MARTIN

All images courtesy of Netflix.


The science fiction subgenre of cyberpunk really came into its own during the 1980s, but its origins can be found in the science fiction ‘new wave’ movement of the ’60s, when edgy authors such as Philip K. Dick, J.G. Ballard and Harlan Ellison experimented with their narrative voices to explore social mores as well as how technology would impact likely future dystopias. More than 30 years on from author William Gibson’s trailblazing novel Neuromancer and the Ridley Scott-directed feature film Blade Runner (itself an adaptation of Dick’s novel Do Androids Dream of Electric Sheep?), the Netflix series Altered Carbon delves deep into the cyberscape, focusing on differences between haves and have-nots in a realm rich with both stylistic excess and urban grit.

Based upon a novel by Richard K. Morgan, Altered Carbon takes place centuries hence in a radically-transformed San Francisco known as Bay City. A human’s consciousness in its entirety can be stored in a disk and implanted into different bodies known as ‘sleeves.’ Long-dead soldier-turned-mercenary Takeshi Kovacs (played in the present by Joel Kinnaman, and by Will Yun Lee in flashbacks to a past sleeve-life) is restored to life at the behest of wealthy Laurens Bancroft (James Purefoy), who challenges Kovacs to solve the rich man’s own murder.

For Senior Visual Effects Supervisor Everett Burrell, who would oversee principal VFX vendor DNEG TV as well as Milk VFX, Atomic Arts and Pixel Light Effects, working on Altered Carbon harkened back to early in his career, when he focused on prosthetic effects while fabricating alien creatures for the Babylon 5 series. “It’s really kind of funny how much Altered Carbon reminded me in some ways of B5,” he acknowledges. “Both shows offered up a lot of new ideas, each proving to be very exciting in terms of how they explored a unique landscape. My background in practical effects afforded me a deep affection for making in-camera solutions work when that is an option, which is something I still believe in, though a marriage between techniques is often the best solution.”

Burrell joined the pre-production effort in May 2016. “Production Designer Carey Meyer (Firefly) had come on in April, so they already had a good handle on aspects of the city, like Bancroft’s tower,” Burrell states. “He and I hit it off right away, and I very much enjoyed how we could riff back and forth about visual concepts. Showrunner Laeta Kalogridis was a very active participant as well while we spent a month exploring various ideas. One focus of our design effort was visualizing the differences between various classes of citizens. The Grounders existed in the lower levels, and the Aerium, high above the clouds, is where the wealthiest lived. Between them was Twilight, which represented the upper-middle section of society.”

Although black-and-white films like Touch of Evil and The Third Man were touch-points for the Altered Carbon design effort, the (replicant) elephant in the room remained Scott’s Blade Runner. “One of the first things we did as a group was go to a really nice theater at DeLuxe to watch it again,” says Burrell. “We wanted to see what they did to make that film work so well. In turn, that made us think about what we could do to achieve a similar level of credible spectacle, but without falling into straight emulation, which wouldn’t have done credit to the great source material Altered Carbon is based on. While people remember Blade Runner for those iconic cityscapes, the film actually has tons of close-ups of Deckard and Rachael talking with everything out of focus behind them, owing to long lenses. And that worked well for us, because when you put a long lens on the Alexa 65, it mimics that lack of depth of field, and the out-of-focus background registers in an impressionistic way that, while still looking cool, isn’t totally about the detail and texture way back there in frame.”

More recent efforts in the subgenre were less influential. “We watched the live-action version of Ghost in the Shell,” Burrell reveals, “but Laeta didn’t like that palette, so we steered well clear of the pinkish hues and made a point of muting our greens, especially during the DI. And when I saw Blade Runner 2049, we realized they had put a dam in, just like our dam around San Francisco Bay. I was like, ‘Oh shit, they got here first!’ But as these films all feature extrapolation from present-day concerns, it kind of makes sense we both drew similar conclusions about rising sea levels. The thing we had that 2049 didn’t was a beautiful, sunny, pristine world that wasn’t mired in fog and haze, revealed once we broke through those clouds.”

Burrell acknowledges that there was very much a blue-sky aspect to this early development work, and that when reality set in, the possibilities were scaled back somewhat. “But I have to say, we aimed very high with this, always intending a Game of Thrones level of production with extremely high-quality visuals, making a particular effort to get our CG to a photoreal level. Ironically, some stuff built practically, like the limousine, had such a nice finish that when we created an exact duplicate in CG, our first turntable looked so pristine and perfect that it generated all these ‘It looks totally fake!’ responses. We wound up having to put extra layers of dirt and scratches on our digital model in order for people to believe this candy-apple/new-car/showroom finish.”

The Netflix mandate for 4K delivery was a higher specification than Burrell had dealt with for feature work, but advance planning facilitated the workflow. Encore Vancouver processed digital dailies as ProRes 4444 XQ files, capable of supporting 12-bit imagery, while a Technicolor innovation enabled efficient linkage with overseas facilities.
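The article doesn’t quote data rates, but dailies at this spec are enormous, and a back-of-envelope calculator makes the point. The bitrate below is an assumed illustrative figure, not a number from the production:

```python
# Rough storage arithmetic for high-bitrate dailies. The codec bitrate is
# an assumed illustrative figure -- actual ProRes 4444 XQ rates depend on
# resolution and frame rate -- so treat the output as order-of-magnitude.
def dailies_terabytes(hours, mbit_per_s):
    seconds = hours * 3600
    return seconds * mbit_per_s / 8 / 1e6   # megabits -> terabytes

# e.g., two hours of selected takes per shoot day at an assumed ~1,500 Mb/s:
per_day = dailies_terabytes(2, 1500)
print(f"{per_day:.2f} TB/day")              # ~1.35 TB under that assumption
```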

“We talked at length with DNEG TV about the huge amount of

OPPOSITE TOP AND BOTTOM: Before and after. Revived mercenary turned sleuth Takeshi Kovacs (Joel Kinnaman) surveys Bay City in the Netflix presentation of Altered Carbon. Showrunner Laeta Kalogridis and Visual Effects Supervisor Everett Burrell selected DNEG TV as principal VFX provider for the series, which has been renewed for a second season. TOP AND BOTTOM: Before and after. A key visual component in Altered Carbon relates to how the rich quite literally occupy the high ground, known as the Aerium, above the clouds, while the have-not ‘Grounders’ scrounge out a living near the ground. Middle-class life exists on the levels in between.


WINTER 2019 VFXVOICE.COM • 73


TV

VFX RULES THE CYBERPUNK UNIVERSE OF ALTERED CARBON By KEVIN H. MARTIN

All images courtesy of Netflix.

72 • VFXVOICE.COM WINTER 2019

The science fiction subgenre of Cyberpunk really came into its own during the 1980s, but its origins can be found in the science fiction ‘new wave’ movement of the ‘60s, when edgy authors such as Philip K. Dick, J.G. Ballard and Harlan Ellison experimented with their narrative voices to explore social mores as well as how technology would impact likely future dystopias. More than 30 years on from author William Gibson’s trailblazing novel Neuromancer and the Ridley Scott-directed feature film Blade Runner (itself an adaptation of Dick’s novel Do Androids Dream of Electric Sheep?, the Netflix series Altered Carbon delves deep into the Cyberscape, focusing on differences between haves and have-nots in a realm wealthy with both stylistic excesses and urban grit. Based upon a novel by Richard K. Morgan, Altered Carbon takes place centuries hence in a radically-transformed San Francisco known as Bay City. A human’s consciousness in its entirety can be stored in a disk and implanted into different bodies known as ‘sleeves.’ Long-dead soldier-turned-mercenary Takeshi Kovacs (played in the present by Joel Kinnaman, and by Will Yun Lee in flashbacks to a past sleeve-life), is restored to life at the behest of wealthy Laurens Bancroft (James Purefoy), who challenges Kovacs to solve the rich man’s own murder. For Senior Visual Effects Supervisor Everett Burrell, who would oversee principal VFX vendor DNEG TV as well as Milk VFX, Atomic Arts and Pixel Light Effects, working on Altered Carbon harkened back to early in his career, when he focused on prosthetic effects while fabricating alien creatures for the Babylon 5 series. “It’s really kind of funny how much Altered Carbon reminded me in some ways of B5,” he acknowledges. “Both shows offered up a lot of new ideas, each proving to be very exciting in terms of how they explored a unique landscape. My background in practical effects afforded me a deep affection for making in-camera solutions work when that is an option, which is something I still believe in, though a marriage between techniques is often the best solution.” Burrell joined the pre-production effort in May 2016. “Production Designer Carey Meyer (Firefly) had come on in April, so they already had a good handle on aspects of the city, like Bancroft’s tower,” Burrell states. “He and I hit it off right away, and I very much enjoyed how we could riff back and forth about visual concepts. Showrunner Laeta Kalogridis was a very active participant as well while we spent a month exploring various ideas. One focus of our design effort was visualizing the differences between various classes of citizens. The Grounders existed in the lower levels, and the Aerium, high above the clouds, is where the wealthiest lived. Between them was Twilight, who represented the upper-middle section of society.” Although black and white films like Touch of Evil and The Third Man were touch-points for the Altered Carbon design effort, the (replicant) elephant in the room remained Scott’s Blade Runner. “One of the first things we did as a group was go to a really nice theater at DeLuxe to watch it again,” says Burrell. “We wanted to see what they did to make that film work so well. In turn, that made us think about what we could do to achieve a similar level of credible spectacle, but without falling into straight emulation,

More recent efforts in the subgenre were less influential. “We watched the live-action version of Ghost in the Shell,” Burrell reveals, “but Laeta didn’t like that palette, so we steered well clear of the pinkish hues and made a point of muting our greens, especially during the DI. And when I saw Blade Runner 2049, we realized they had put a dam in, just like our dam around San Francisco Bay. I was like, ‘Oh shit, they got here first!’ But as these films all feature extrapolation from present-day concerns, it kind of makes sense we both drew similar conclusions about rising sea levels. The thing we had that 2049 didn’t was a beautiful, sunny, pristine world that wasn’t mired in fog and haze, revealed once we broke through those clouds.”

Burrell acknowledges that there was very much a blue-sky aspect to this early development work, and that when reality set in, the possibilities were scaled back somewhat. “But I have to say, we aimed very high with this, always intending a Game of Thrones level of production with extremely high-quality visuals, making a particular effort to get our CG to a photoreal level. Ironically, some stuff built practically, like the limousine, had such a nice finish that when we created an exact duplicate in CG, our first turntable looked so pristine and perfect that it generated all these ‘It looks totally fake!’ responses. We wound up having to put extra layers of dirt and scratches on our digital model in order for people to believe this candy-apple/new-car/showroom finish.”

The Netflix mandate for 4K delivery was a higher specification than Burrell had dealt with for feature work, but advance planning facilitated the workflow. Encore Vancouver processed digital dailies as ProRes 4444 XQ files, capable of supporting 12-bit imagery, while a Technicolor innovation enabled efficient linkage with overseas facilities. “We talked at length with DNEG TV about the huge amount of data generated by these Alexa 65 cameras,” Burrell reports, “and were able to take advantage of a new feature offered by Deluxe called the [Synapse] Portal. That offered cloud storage at native resolution, which could be 5K or 6K. We would send an EDL to the portal, which, via Espera, was a seamless way to push data across the ocean to our UK vendors.”
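The portal workflow Burrell describes — cut an EDL, then push only the referenced native-resolution media overseas — can be pictured with a short script. This is a hypothetical sketch: Deluxe’s actual Portal and Espera interfaces aren’t public, so the paths, file naming and transfer step below are stand-ins.

```python
import re
from pathlib import Path

# Hypothetical stand-ins -- not Deluxe's real Portal/Espera endpoints.
DAILIES_ROOT = Path("/mnt/dailies/alexa65")    # native-res ProRes 4444 XQ masters
VENDOR_DROPBOX = Path("/mnt/portal/dneg_uk")   # watched folder synced overseas

CLIP_RE = re.compile(r"\*\s*FROM CLIP NAME:\s*(\S+)")

def clips_in_edl(edl_path: Path) -> set[str]:
    """Pull clip names from a CMX3600-style EDL so only cut footage ships."""
    return set(CLIP_RE.findall(edl_path.read_text()))

def queue_for_vendor(edl_path: Path) -> None:
    """Queue every native-resolution source file the cut references."""
    wanted = clips_in_edl(edl_path)
    for src in DAILIES_ROOT.rglob("*.mov"):
        if src.stem in wanted:            # assumes clip name == filename stem
            # A real pipeline would hand off to an accelerated transfer
            # service here; this sketch just records the manifest entry.
            print(f"queue {src} -> {VENDOR_DROPBOX / src.name}")

queue_for_vendor(Path("ep101_turnover_v3.edl"))
```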

OPPOSITE TOP AND BOTTOM: Before and after. Revived mercenary turned sleuth Takeshi Kovacs (Joel Kinnaman) surveys Bay City in the Netflix presentation of Altered Carbon. Showrunner Laeta Kalogridis and Visual Effects Supervisor Everett Burrell selected DNEG TV as principal VFX provider for the series, which has been renewed for a second season. TOP AND BOTTOM: Before and after. A key visual component in Altered Carbon relates to how the rich quite literally occupy the high ground, known as the Aerium, above the clouds, while the have-not ‘Grounders’ scrounge out a living near the ground. Middle-class life exists on the levels in between.


TOP LEFT AND RIGHT: Before and after. Wary of how some digital water solutions look less liquid and more like CG Jell-O, Burrell found DNEG’s Houdini-based sim, augmented by Ludiq’s Chronos, to be both robust and visually credible. BOTTOM LEFT AND RIGHT: Before and after. DNEG used lidar scan data from the full-size set, then employed City Engine to model the numerous structures that brought Bay City to cinematic life. Burrell strove to incorporate the practical imperfections inherent in reality for the CG cityscape.


Final delivery was managed by Encore Hollywood, including finishing in Dolby Vision HDR.

While DNEG took on the lion’s share of VFX shots, Atomic Arts handled a variety of muzzle flashes and bullet hits, and Lola – which had just opened a U.K. facility – took on cosmetic fix-it work, which helped distinguish the skins of various ‘sleeves.’ “Laeta really liked the work Milk VFX did on another U.K. show, and wanted us to find a place for them,” notes Burrell. “Since episode 7 was a kind of standalone episode [it takes place in an earlier time, showing Kovacs in his previous incarnation] and had its own unique look, we figured that Milk could take most of that on and give DNEG a break from the huge bulk of work needed for the rest of the series.” Milk VFX Supervisor Nicolas Hernandez spearheaded Milk’s 70-shot contribution, and Pixel Light Effects completed the list of providers, handling scanning duties on set.

Meyer built the enormous five-block street set inside a Vancouver, B.C. warehouse. “That Bay City street featured massive Calatrava-like structures along with storefronts,” notes Burrell. “We could achieve daylight versions of the location as well as a night look, owing to these amazing [UV translight] backdrops at each end of the street. They could be used frontlit or backlit, and made a world of difference for us. If we’d had to shoot bluescreen with all the smoke and rain being used, VFX would have been dead right out of the gate.”

“Ironically, some stuff built practically, like the limousine, had such a nice finish that when we created an exact duplicate in CG, our first turntable looked so pristine and perfect that it generated all these ‘It looks totally fake!’ responses. We wound up having to put extra layers of dirt and scratches on our digital model in order for people to believe this candy-apple/new-car/showroom finish.” —Everett Burrell, Senior Visual Effects Supervisor

This in-camera solution worked for a large majority of street views, until the camera tilted up, at which point DNEG’s involvement came to the fore. The vendor utilized Esri’s City Engine software to model dozens of structures that made up thousands of Bay City buildings. “DNEG took a lidar scan of the set and used it as part of their foundation for digital buildings extending upward past what Carey had constructed,” Burrell continues. “The imperfections in the full-size build that were acquired via this process helped enormously when achieving convincing detail in the CG, which might otherwise have featured too-perfect lines to the buildings on this street. If you look at Peter Ellenshaw’s matte paintings for Disney up close and in person, you see actual paint strokes amid these blotchy patches of color, and yet on film, that all translates to beautifully-realized environments. Detail is needed, but if the form and function aren’t right, the detail won’t help.”
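City Engine grows geometry from procedural rules rather than hand modeling. The gist of the approach Burrell describes — footprints taken from a scanned base, extruded with deliberate irregularity so edges never read as CAD-perfect — can be sketched in a few lines. This is an illustrative Python stand-in, not DNEG’s actual setup or Esri’s rule syntax; every name and number in it is hypothetical.

```python
import random

def grow_block(footprints, seed=42):
    """Extrude scanned footprints into towers, with jitter standing in for
    the real-world imperfection a lidar-derived base gives for free."""
    rng = random.Random(seed)
    buildings = []
    for poly in footprints:                      # poly: list of (x, y) metres
        floors = rng.randint(8, 120)             # Bay City mixes scales wildly
        height = floors * rng.uniform(3.2, 3.8)  # non-uniform storey heights
        lean = rng.uniform(-0.4, 0.4)            # slight off-plumb lean
        base = [(x + rng.uniform(-0.05, 0.05),   # ~5 cm vertex noise so walls
                 y + rng.uniform(-0.05, 0.05))   # avoid "too-perfect lines"
                for x, y in poly]
        buildings.append({"base": base, "height": height, "lean": lean})
    return buildings

city = grow_block([[(0, 0), (20, 0), (20, 30), (0, 30)]])
```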

TOP AND BOTTOM: Before and after. Production Designer Carey Meyer built a huge stretch of street within a warehouse, which, in addition to VFX set extensions, also exploited UV translight backdrops that could be front- or back-lit to create day and night vistas.






“Most software seems to max out at about 50 feet, and when you go beyond that, CG water tends to look more like Jell-O. ILM has written a better water sim that seems to go beyond everybody else’s, that deals with fine detail and dirty water, a big step up from how CG water used to look...” —Everett Burrell, Senior Visual Effects Supervisor

“Visually, we’ll be able to distinguish each show from what has gone before, because of the varied input of so many different creatives, from showrunners and cinematographers to VFX artists.” —Everett Burrell, Senior Visual Effects Supervisor

TOP AND BOTTOM: Scenes from Altered Carbon, and a full view of Bay City – San Francisco centuries in the future.

Burrell believes that training in the arts is still beneficial in the digital era, perhaps even essential. “Without an education in the traditional arts, you don’t know how to apply the rules or when to break them,” he emphasizes. “I think a lot of digital artists who don’t have that background and/or training – and this isn’t intended as disparagement of ability, just observation – don’t know some of the practical applications for extending things in the digital world. You must know how to mimic the chaos and weirdness that comes from mixing colors in reality, or shooting with a lens that distorts the image lines. I try to bring my education from high school and college to what I need to achieve on set.”

As an example, Burrell cites a most basic tool. “In my bag, I keep a color wheel with me at all times. I use this to show people what kind of result to expect when mixing certain colors. You can get these at any art store, and they are the basis for any painter, but you don’t always see them being utilized.”

Pilot director Miguel Sapochnik, responsible for setting much of the tone for the series, also had a lot of input on the visuals. “Miguel indicated he didn’t want to see just that same old Princess Leia hologram,” Burrell relates, “so at first we experimented with subtle, ghost-like images, then went extreme, introducing a lot of static and weird distortions. The subtler we went, the worse it seemed to look, though for one episode, there was a special kind of high-end hologram, when they walk through a very glossy-looking area. There are three holograms featured in that environment that are very large with a real presence, and it was an interesting experiment, to make these stand out as a kind of Rolex of holograms. In the end, it turned out the Princess Leia version often looked best, because it is both cool and funky, which, when you’re down on street level, worked much better artistically while serving the story.”

With the series set in a futuristic version of the city by the bay, it only figures that water simulations – including one featuring a large craft crashing into the bay – were a significant part of the VFX equation. “With lots of CG water, I see the scale doesn’t look right,” reports Burrell. “Most software seems to max out at about 50 feet, and when you go beyond that, CG water tends to look more like Jell-O. ILM has written a better water sim that seems to go beyond everybody else’s, that deals with fine detail and dirty water, a big step up from when CG water used to look like snot. DNEG’s sim, owing to a Houdini engine, was pretty robust, using [Ludiq’s] Chronos, which lets you sculpt atop the simulation once the mesh is baked out. Even so, it was three or four takes before they got things right. On one take, the water went so high, it went right through the Golden Gate Bridge and would have killed everybody!”
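Burrell’s 50-foot ceiling and the “Jell-O” look have a tidy physical explanation — a back-of-the-envelope aside on our part, not something DNEG cites. Deep-water waves obey a dispersion relation that locks spatial scale to timing, so water scaled up in space without retiming reads as gelatin:

```latex
c=\sqrt{\frac{g\lambda}{2\pi}}, \qquad T=\frac{\lambda}{c}=\sqrt{\frac{2\pi\lambda}{g}}
% c = phase speed, T = period, lambda = wavelength, g = 9.81 m/s^2.
% A 1 m ripple repeats every ~0.8 s; a 100 m swell every ~8 s. Period grows
% only as sqrt(wavelength), so enlarging a sim 100x in space while keeping
% its original timing makes the water wobble like Jell-O.
```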

Another deadly situation for Bay City dwellers takes place in a vertical arena in which weightless combat is waged. Production found a theater-in-the-round at a local university to stand in for the venue. “All of that zero-g battle needed to look as real as possible,” maintains Burrell, “so that ruled out early discussions about greenscreen and digital doubles. While there were a lot of cheats – Joel is sometimes just sitting on a bouncy ball – along with some set extensions and wire removals, getting 99% of that in-camera helped us maintain a visceral quality to the sequence.”

Even prior to being nominated for a VFX Emmy, Altered Carbon had already won a renewal for a second season from Netflix, which intends to continue the adventures of Kovacs, with Anthony Mackie taking over as the character’s newest ‘sleeve’ in a second season slated to air later in 2019. “I think that cyberpunk will remain a very popular subgenre,” Burrell concludes. “Visually, we’ll be able to distinguish each show from what has gone before, because of the varied input of so many different creatives, from showrunners and cinematographers to VFX artists.”

TOP: A map of the San Francisco Bay Area centuries from now, known as Bay City in Altered Carbon. MIDDLE: A “sleeve sack” in Altered Carbon. A human’s consciousness can be stored in a disk and implanted into different bodies known as “sleeves.” BOTTOM: Concept art of Bay City, the Golden Gate Bridge and Bancroft’s Tower in Altered Carbon.





COVER

The Netflix Phenomenon: Industry Reaction – in Quotes

NETFLIX: A ROBUST PROVIDER OF ‘UPPER-TIER’ VFX By CHRIS McGOWAN

TOP LEFT: Andrew Fowler, Netflix Head of Global VFX TOP RIGHT: Stranger Things BOTTOM LEFT AND RIGHT: Altered Carbon OPPOSITE TOP: Lost in Space

One only has to look at the 2018 Emmy Awards to recognize Netflix’s rapid ascension and its importance to the film and TV industries, including those working in the VFX domain. HBO had led the Emmy count for 16 years straight, but this past year Netflix achieved a significant milestone by earning 23 Emmys – the same as HBO.

Netflix only got into streaming media in 2007 and debuted its first exclusive series, Lilyhammer, in 2012. It launched the original series House of Cards and Orange Is the New Black the following year. Since then, the company’s expansion of its original and exclusive programming has positively exploded. Netflix claims more than 130 million subscribers globally. The Economist projected that in 2018 Netflix would spend $12-13 billion on original programming (earlier estimates by others were in the $8 billion range), including an astounding 82 feature films. CFO David Wells estimated last February that Netflix would have some 700 new or existing original series worldwide in 2018, including 80 non-English-language productions. In September alone, it debuted 52 original shows and movies.

The Emmys also reflected Netflix’s growing excellence in VFX. In the Outstanding Special Visual Effects category, Netflix took three of the nominations, with Altered Carbon (“Out of the Past”), Lost in Space (“Danger, Will Robinson”) and Stranger Things (“Chapter Nine: The Gate”). (HBO’s Westworld and Game of Thrones were also nominated, with the latter triumphing.)

The three shows illustrate how Netflix has been applying upper-tier VFX to its series, creating what Milk VFX CEO Will Cohen refers to as “feature television” and others term “cinematic TV.” For Lost in Space, for example, more than 20 vendors from four continents contributed to the high-end VFX, according to Jabbar Raisani, Visual Effects Supervisor for the show’s 2018 first season. Altered Carbon and Stranger Things also had extensive teams of VFX houses working on their impressive effects, and Black Mirror is another Netflix original with standout VFX. Netflix has established itself as “among the biggest providers of visual effects work today,” comments Aymeric Perceval, Mill Film VFX Supervisor.

While the visual effects quality is generally high on its shows, Netflix applies it judiciously. “The plan of course is to keep ‘raising the bar,’ but this objective needs to be executed with sensitivity so that each show doesn’t simply become bigger and more complex than the last for no rhyme or reason,” says Andrew Fowler, Netflix Head of Global VFX. “However,” he adds, “with many filmmakers turning to series, they expect a certain level of finish, and Netflix does all it can to partner with all involved to maintain top-quality work responsibly. Stranger Things is a great example of finding the right level of execution, a ‘film brush’ [business approach] employed to a [television] series budget and schedule.”

Fowler continues, “At Netflix we have an amazing VFX team managing film and series that touch on many levels of scope. From documentaries, international content, indie film, and of course original film and series productions, these content verticals all have the day-to-day support of the Netflix VFX studio group to help improve workflow and add efficiency within the VFX space at Netflix. The intent is to provide a collaborative approach to the creation of VFX content not just for our production partners, but also with our partnered vendors and freelance talent. By creating a smoother experience, the work on the screen has a chance to become the focus, leading to greater creative thinking and a higher-quality output.”

“The plan of course is to keep ‘raising the bar,’ but this objective needs to be executed with sensitivity so that each show doesn’t simply become bigger and more complex than the last for no rhyme or reason.” —Andrew Fowler, Netflix Head of Global VFX


“Netflix is reinventing the way we produce, tell and consume stories, and I believe it will be a great place to push the envelope in terms of visual effects. The streamer provides writers and directors with a vision check to explore bold storytelling and visual ideas that could be a bit too risky for some traditional studios. This was the case for Duncan Jones’ film Mute, which I supervised for Cinesite earlier this year. Jones spent 15 years trying to get the project off the ground, no small feat in an industry that has all but abandoned mid-budget genre pictures. Without Netflix, the film and our VFX work may not have reached an audience.” —Salvador Zalvidea, VFX Supervisor, Cinesite

“Working on Lost in Space has shown us the invaluable respect and knowledge that Netflix has for the post-production process. From planning to execution, we had a close collaboration with their team and felt fully supported, which enabled us to push the quality of VFX production in an episodic show.” —Niklas Jacobson, VFX Supervisor/Cofounder, ILP

“From the bidding process to the final delivery, Cinesite Montreal has worked hand-in-hand with the creatives on Lost in Space, Game Over Man!, and now Murder Mystery to find the ultimate way to enhance each of the projects. We’re fans of the shows and projects, so this pushes us to strive for something new, something epic. Knowing that we are part of a new era in production, distribution and visual effects is an added bonus.” —Marc A. Rousseau, Director, VFX Studio, Cinesite






TOP LEFT: Lost in Space BOTTOM LEFT AND RIGHT: Black Mirror episode “USS Callister” OPPOSITE TOP TWO: Filming A Series of Unfortunate Events OPPOSITE BOTTOM TWO: Maniac and filming Maniac

Two of the things Netflix offers consumers are a vast supply of programming and on-demand viewing. House of Cards was one of the first TV series to have a new season released all at once. This helped increase the streaming service’s popularity, as it satisfied many viewers’ desire to watch episodes as much as they wanted, when they wanted, instead of having to wait a week between episodes or having to record them. Russell Dodgson, Creative Director Television at Framestore, describes Netflix as a “positively disruptive influence on how the world experiences long-form entertainment.” Netflix is astute in using analytics of all kinds “to optimize what kind of content they produce,” comments Method Studios Senior VFX Supervisor Kevin Baillie, who worked on Stranger Things.

“Content providers like Netflix have, without doubt, ushered in a new era of televisual entertainment. It’s part technology-driven: the ability to watch on any screen, anywhere. But Netflix has also committed to a high standard of storytelling and production values that helps push the creative boundaries of the TV series once more. Demand will only grow, now that consumers have had a taste – which is fantastic, as high-caliber televisual VFX is something Framestore prides itself in achieving.” —Michelle Martin, Head of Television, Framestore

“I am personally excited every time I hear of a potential creative collaboration with them, especially since my time VFX Supervising [Black Mirror episode] ‘USS Callister.’ It was a great pleasure to work on a show with such a strong and supported visual ambition.” —Russell Dodgson, Creative Director Television, Framestore

“From a visual effects perspective, they have always been supportive, and with the breadth of content we get to stretch the outer limits of our imaginations and deliver some very exciting shots. They value quality imagery as much as they value storytelling, which means we are constantly challenged to deliver our best. This is a challenge I accept wholeheartedly.” —Chris MacLean, VFX Supervisor, Godless and The Highwaymen

“With many filmmakers turning to series, they expect a certain level of finish, and Netflix does all it can to partner with all involved to maintain top-quality work responsibly. Stranger Things is a great example of finding the right level of execution, a ‘film brush’ [business approach] employed to a [TV] series budget and schedule.” —Andrew Fowler, Netflix Head of Global VFX






TOP LEFT AND RIGHT: Next Gen BOTTOM LEFT: Okja BOTTOM RIGHT: Filming Jessica Jones


This helps the company reach viewers with existing content they’d like, and also produce films and series that will appeal to them. As a result, Netflix is a major backer of unique indie productions and of mid-budget genre pictures. The latter had been somewhat abandoned by much of the movie industry, according to Cinesite VFX Supervisor Salvador Zalvidea. He credits Netflix’s openness to “bold storytelling” with making it possible for the Duncan Jones sci-fi film Mute (which Cinesite worked on) to be produced.

To attract global customers, Netflix is currently making programs in 21 countries, including Brazil, Germany, India and South Korea. With so many original movies and series in production around the world, the coordination of different VFX vendors across the globe is a challenge. “Simply put, it’s evolving, especially for international, where being able to add support at the right level at the right time is dynamic,” says Fowler. “The run book for what Netflix is doing in VFX does not exist in some spaces, although we are of course creating solid guidelines as we go, taking our learnings and applying them moving forward. Already in Asia, Latin America, Europe and even in the USA, our VFX workflow is iterating and improving all the time thanks to the talented local teams we engage with and the exemplary VFX team here at Netflix working across series, film, international and our initiatives division.”

“Netflix is a key driver behind the emerging genre of what is being labeled as ‘feature television.’ The production value bar is being raised like never before and directors are looking for more ambitious film-style VFX work and creative agility. This, coupled with the volume and variety of film and TV content being produced by Netflix, presents an exciting opportunity for VFX studios, such as Milk, to push the boundaries creatively and technically. We’re excited to be a part of this zeitgeist.” —Will Cohen, CEO, Milk VFX

“Netflix releases a diverse range of quality content each year, and it’s clear it has established itself among the biggest providers of visual effects work today. It feels like there are hundreds of projects brewing at any given time, each with a great deal of cool, creative and challenging projects to work on. Netflix aims to impress, knowing that today’s audience has a very discerning eye when it comes to VFX, and doesn’t pull any punches.” —Aymeric Perceval, VFX Supervisor, Mill Film

“What’s so great about working with them is that they look to visual effects studios as creative partners in developing this content. Their approach is more about collaborating with us around both our capacity and our ideas. As a visual effects company, it’s like, ‘Hallelujah!’ – it’s amazing to be able to share information and work together like this up front.” —Kevin Baillie, Creative Director & Sr. Visual Effects Supervisor, Method Studios

“The run book for what Netflix is doing in VFX does not exist in some spaces, although we are of course creating solid guidelines as we go, taking our learnings and applying them moving forward. Already in Asia, Latin America, Europe and even in the USA, our VFX workflow is iterating and improving all the time thanks to the talented local teams we engage with and the exemplary VFX team here at Netflix working across series, film, international and our initiatives division.” —Andrew Fowler, Netflix Head of Global VFX

“From a workload standpoint, taking on a full series requires having the bandwidth to deliver top-quality VFX work come the final episodes. On Stranger Things season 2... we knew delivering the Upside Down would be a huge creative challenge, and our waterfall delivery schedule required us to deliver a large number of shots for each episode while maintaining the quality of the later episodes’ unique content. Our team was extremely crafty with procedural techniques and pipeline builds that allowed us time for creativity, while delivering against our timeline.” —Seth Hill, VFX Supervisor, Method Studios

“What’s been great in our partnership with Netflix over the past year has been the creative trust that they’ve placed in our supervisors across our three studio locations. We have L.A., Atlanta and NYC Netflix projects all in various stages of production and post, ranging from character-driven indie features where we provide environment enhancements and invisible FX work, all the way up to our more stylized, design-driven work on the new season of Stranger Things. Netflix has been on the forefront of future-proofing the visual quality level available to streaming viewers.” —Matt Akey, Director of Production/ Business Development, Crafty Apes

“The content, along with their visual effects needs, have not only been varied in terms of scope but consistently of the highest quality. We have worked on projects from 13 Reasons Why to Marvel’s Iron Fist, and the commitment to create the best work is extraordinarily consistent. In addition, Netflix has shown a great deal of savviness when it comes to visual effects and consistently supports the options for VFX that are best for the show visually, not just those that are best for the bottom line. The end results are projects that are visually compelling and uniquely engaging to the viewer.” —Greg Anderson, Senior VFX Supervisor/ Head of Production-N.Y., FuseFX






Netflix is often praised for being supportive of and working in close collaboration with VFX vendors, and for its willingness to take chances on content. It is described as “artist-forward” by Mr. X’s VFX Supervisor Chris MacLean, who worked on the Netflix series Godless. “Netflix strives to create firm partnerships with talent based upon freedom and responsibility, which are core values of our company,” comments Fowler. “This strategy leads to better professional understanding with more impactful communication, which in turn leads to greater trust and respect up and down the line.”


“Working at tremendous scale with high levels of complexity globally,” Fowler states, “it is imperative to create an environment where partners have their minds set in the right direction, creating their best work, and taking out the friction. Mutual trust and respect lead to better content, plain and simple, plus it makes the experience much more enjoyable!”

Netflix is present in more than 190 countries. According to Fortune, Goldman Sachs projects that Netflix could be spending $22.5 billion per year on content by 2022. A proportionate part of that will go to VFX.


TOP LEFT: Daredevil TOP RIGHT: Filming Ozark with Jason Bateman, middle. BOTTOM: Kiss Me First




VFX VAULT

40 YEARS AGO, A MAN DID FLY... SUPERMAN By IAN FAILES

TOP LEFT: Christopher Reeve as Superman during production on Superman. Creative Supervisor and Director of Special Effects Colin Chilvers is at the far right. (Image courtesy of Michael McClellan.) OPPOSITE TOP: The film’s tagline was ‘You’ll believe a man can fly.’ (Image copyright © 1978 Warner Bros.) OPPOSITE BOTTOM: Superman and Lois Lane (Margot Kidder) take flight. Flying gags were achieved several ways, including via wire work, a special hydraulic arm, and a combination of front and rear projection. (Image copyright © 1978 Warner Bros.)


In the past four decades, there have been several Superman film and television projects, but perhaps none is as fondly remembered as Richard Donner’s 1978 movie starring Christopher Reeve as the caped superhero. Superman made use of the latest in practical, miniature and optical effects of the time to convince audiences that, as the film’s tagline heralded, ‘You’ll believe a man can fly.’ The film was ultimately awarded a Special Achievement Academy Award for Visual Effects (presented to Les Bowie, Colin Chilvers, Denys Coop, Roy Field, Derek Meddings and Zoran Perisic) in recognition of its contribution to VFX.

Creative Supervisor and Director of Special Effects Colin Chilvers was a key member of the team involved in enabling Superman to fly, and in accomplishing several on-set gags, such as the helicopter crash, earthquake scenes and even an unused tornado sequence. He shared some of his most memorable moments from the set with VFX Voice.

TAKING FLIGHT

Finding a convincing way to show Reeve taking to the skies was one of Chilvers’ first challenges. A number of options had already been explored by other filmmakers, including having the actor skydive from a plane with a parachute under his cape. Chilvers even tried a Superman mannequin launched into the air by an air cannon, and a miniature remote-controlled Superman doll. Ultimately, several techniques would be employed, including filming the actor on wires and a mechanical arm, and using bluescreen compositing and front projection.

“We had this huge problem initially,” states Chilvers, “because, well, Superman’s outfit is blue, and how were we going to shoot against a bluescreen? But, very cleverly, [Creative Supervisor of Optical Visual Effects] Roy Field had worked out that we could actually use a different color blue for Chris’s outfit, and that after he pulled the matte, he could then change that blue input in the final color grading – he could change the blue back to the right blue.”

Another artisan Chilvers credits with making Superman fly is Zoran Perisic, who devised the bespoke ‘Zoptic’ optical process for shooting a front-screen projection with two zoom lenses, one on the projector and one on the camera, that were locked together. This enabled zooming in on Reeve to give the feeling that he was flying towards or away from the camera.

With the bluescreen and optical technologies at their disposal, the special effects team could then devise ways to secure Reeve, and others he sometimes carries, such as Lois Lane (Margot Kidder), while acting out the flying scenes. “We built a rig that had a hydraulic gimbal on the end that we put Christopher on,” explains Chilvers. “He had this fiberglass tray that was molded to his body. For the camera, we slung it on a counter-weighted wire so that the camera and the front-screen projector could move from side to side and up and down, which effectively made it look like Chris was moving from side to side and up and down.”
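The trick behind Perisic’s locked zooms can be stated in one line of geometry — our gloss on the idea, not Perisic’s published derivation. With a plate front-projected onto a retroreflective screen, the plate’s size in the captured frame depends only on the ratio of the two focal lengths, while the actor’s size depends on the camera lens alone:

```latex
h_{\mathrm{plate}} \;\propto\; \frac{f_{\mathrm{cam}}}{f_{\mathrm{proj}}},
\qquad
h_{\mathrm{actor}} \;\propto\; \frac{f_{\mathrm{cam}}}{d_{\mathrm{actor}}}
% Zooming both lenses together holds f_cam/f_proj constant, so the projected
% background stays locked in frame while the actor's image grows or shrinks
% with f_cam -- on film, Superman appears to fly toward or away from camera.
```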

“I’m pretty sure that when we put Christopher on wires and took him up in the air, he thought he was really flying. There were some great stuntmen, but Chris was always the best one to do it. He was a pilot himself, so he had a good idea of what an aircraft would do. He was just amazing like that.” —Colin Chilvers, Creative Supervisor/Director of Special Effects



VFX VAULT

40 YEARS AGO, A MAN DID FLY... SUPERMAN By IAN FAILES

TOP LEFT: Christopher Reeve as Superman during production on Superman. Creative Supervisor and Director of Special Effects Colin Chilvers is at the far right. (Image courtesy of Michael McClellan.) OPPOSITE TOP: The film’s tagline was ‘You’ll believe a man can fly.’ (Image copyright © 1978 Warner Bros.) OPPOSITE BOTTOM: Superman and Lois Lane (Margot Kidder) take flight. Flying gags were achieved several ways, including via wire work, a special hydraulic arm, and a combination of front and rear projection. (Image copyright © 1978 Warner Bros.)

86 • VFXVOICE.COM WINTER 2019

In the past four decades, there have been several Superman film and television projects, but perhaps none is as fondly remembered as Richard Donner’s 1978 movie starring Christopher Reeve as the caped superhero. Superman made use of the latest in practical, miniature and optical effects of the time to convince audiences that, as the film’s tagline heralded, ‘You’ll believe a man can fly.’ The film was ultimately awarded a Special Achievement Academy Award for Visual Effects (presented to Les Bowie, Colin Chilvers, Denys Coop, Roy Field, Derek Meddings and Zoran Perisic) in recognition of its contribution to VFX. Creative Supervisor and Director of Special Effects Colin Chilvers was a key member of the team involved in enabling Superman to fly, and in accomplishing several on-set gags, such as the helicopter crash, earthquake scenes and even an unused tornado sequence. He shared some of his most memorable moments from the set with VFX Voice. TAKING FLIGHT

Finding a convincing way to show Reeve taking to the skies was one of Chilvers’ first challenges. A number of options had already been explored by other filmmakers, including having the actor skydive from a plane with a parachute under his cape. Chilvers even tried a Superman mannequin launched into the air by an air cannon, and a miniature remote-controlled Superman doll. Ultimately, several techniques would be employed, including filming the actor on wires and a mechanical arm, and using

bluescreen compositing and front projection. “We had this huge problem initially,” states Chilvers, “because, well, Superman’s outfit is blue, and how were we going to shoot against a bluescreen? But, very cleverly, [Creative Supervisor of Optical Visual Effects] Roy Field had worked out that we could actually use a different color blue for Chris’s outfit, and that after he pulled the matte, he could then change that blue input in the final color grading – he could change the blue back to the right blue.” Another artisan Chilvers credits in making Superman fly is Zoran Perisic, who devised the bespoke ‘Zoptic’ optical process for shooting a front-screen projection with two zoom lenses, one on the projector and one on the camera, that were locked together. This enabled zooming in on Reeve to give the feeling that he was flying towards or away from camera. With the bluescreen and optical technologies at their disposal, the special effects team could then devise ways to secure Reeve, and others whom he sometimes carried, such as Lois Lane (Margot Kidder), while acting out the flying scenes. “We built a rig that had a hydraulic gimbal on the end that we put Christopher on,” explains Chilvers. “He had this fiberglass tray that was molded to his body. For the camera, we slung it on a counter-weighted wire so that the camera and the front-screen projector could move from side to side and up and down, which effectively made it look like Chris was moving from side to side and up and down.” The flying scenes were not all high tech, however.
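For readers who want to see the logic of Field’s fix in modern terms, here is a minimal digital sketch of the same idea. It is emphatically not his optical color-difference process: the matte math below is the textbook bluescreen formula, and the suit color, backing color, thresholds and regrade values are all invented for illustration.

```python
# A toy, modern-day analogue of Roy Field's trick: shoot the suit in an
# "off" blue the keyer largely ignores, pull a standard bluescreen
# color-difference matte, then shift the off-blue back to the intended
# suit blue afterwards. All colors and thresholds are illustrative;
# real keyers add gain, balance and spill controls on top of this.
import numpy as np

def blue_screen_matte(img):
    """Color-difference matte: saturated backing blue (b >> max(r, g))
    keys out; the duller off-blue suit mostly survives."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    backing = np.clip(b - np.maximum(r, g), 0.0, 1.0)
    return 1.0 - backing            # 1.0 = keep (foreground), 0.0 = key out

def regrade_suit(img, off_blue, true_blue, tol=0.2):
    """Stand-in for the final grading pass: pixels near the 'wrong'
    suit blue are shifted back to the 'right' blue."""
    off = np.asarray(off_blue, dtype=float)
    true = np.asarray(true_blue, dtype=float)
    near = np.linalg.norm(img - off, axis=-1, keepdims=True) < tol
    return np.where(near, np.clip(img + (true - off), 0.0, 1.0), img)

# Example: a 2x2 plate with one backing pixel and one off-blue suit pixel.
plate = np.array([[[0.05, 0.05, 0.95], [0.35, 0.40, 0.70]],
                  [[0.80, 0.20, 0.20], [0.30, 0.30, 0.30]]])
alpha = blue_screen_matte(plate)    # the off-blue suit pixel mostly survives
final = regrade_suit(plate, off_blue=(0.35, 0.40, 0.70),
                     true_blue=(0.15, 0.25, 0.85))
```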

“I’m pretty sure that when we put Christopher on wires and took him up in the air, he thought he was really flying. There were some great stuntmen, but Chris was always the best one to do it. He was a pilot himself, so he had a good idea of what an aircraft would do. He was just amazing like that.” —Colin Chilvers, Creative Supervisor/Director of Special Effects

WINTER 2019 VFXVOICE.COM • 87


VFX VAULT

“We were all working so hard, but separately we didn’t necessarily see how it was all being joined together. And suddenly we saw it and we knew, wow, this is gonna work.” —Colin Chilvers, Creative Supervisor/Director of Special Effects

88 • VFXVOICE.COM WINTER 2019

Some on-set shots involved traditional wire work, while others even involved a seesaw-like contraption with counterweights. “Christopher would stand on that,” says Chilvers, “and there’d be a couple of burly guys that, as Christopher bent down to take off, would push down on the other end of the seesaw, and he would go up in the air, and above him there’d be a bar for him to pull himself up on.” The filmmakers had found ways to depict Reeve’s body in flight, but there was still one challenge remaining: how to get Superman’s cape to billow as if flapping in the wind. Nowadays, of course, that kind of thing can be achieved with CG cloth simulation, but during the making of Superman it had to be a practical effect. Having tried a multitude of wind machines, Chilvers’ team noticed that the cape simply wrapped around the actor. A different solution was needed, and that came from Creative Supervisor of Mattes and Composites Les Bowie. “Les came up with this idea of making a remote-controlled rig out of an electric motor, with fishing pole-like rods that would fit on Christopher’s back,” outlines Chilvers. “The rods, which were attached to the end of the cape, would flick up and down. Then, with a bit of wind as well, it really did sell the fact that the cape was flapping in the wind from the speed Superman was flying. Incidentally, Les also came up with the vibrator on the wires holding Christopher. If you could vibrate them enough, you wouldn’t see the wires because of the way we were running the film at 24 frames per second.”

“I’m pretty sure,” Chilvers adds, “that when we put Christopher on wires and took him up in the air, he thought he was really flying. There were some great stuntmen, but Chris was always the best one to do it. He was a pilot himself, so he had a good idea of what an aircraft would do. He was just amazing like that.”
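Bowie’s vibrating-wire gag boils down to motion-blur arithmetic: at 24 frames per second with a 180-degree shutter, each frame integrates roughly 1/48 of a second, so a rapidly oscillating wire smears across many times its own width and its contrast drops below what the film can register. A back-of-envelope sketch (every number below is an assumption for illustration, not a production figure):

```python
# Back-of-envelope: why a fast-vibrating wire vanishes at 24 fps.
# Every figure here is an illustrative assumption, not a number from
# the Superman production.
FPS = 24.0
SHUTTER = 0.5                      # 180-degree shutter
exposure_s = SHUTTER / FPS         # ~0.0208 s integrated per frame

wire_width_mm = 0.5                # thin flying wire (assumed)
vibration_hz = 50.0                # rig frequency (assumed)
sweep_mm = 12.0                    # peak-to-peak travel (assumed)

cycles_per_frame = vibration_hz * exposure_s   # ~1 full oscillation per frame
coverage = wire_width_mm / sweep_mm            # fraction of time any spot is occluded

print(f"exposure: {exposure_s * 1000:.1f} ms, "
      f"oscillations per frame: {cycles_per_frame:.2f}")
print(f"any given point sees the wire ~{coverage:.0%} of the exposure")
# ~4% occupancy dilutes the wire's contrast roughly 25x -- effectively
# invisible against a busy front-projection plate.
```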

FAKE BLADES

One of Superman’s first public displays of his superpowers takes place in the movie when he rescues Lois Lane from a helicopter that has crashed atop a building in Metropolis. A number of set pieces and miniatures were utilized to film both the crash and the subsequent rescue. For the crash, which included rotor blades still spinning dangerously, Chilvers’ team rebuilt a real helicopter airframe, and then rigged it to smash into a glass enclosure and fall on its side towards the edge of the building roof. The spinning blades portion of the effect was achieved with an interesting sleight of hand. “We noticed that with helicopters, the blades are going so fast that you don’t really see two-thirds of the blade,” observes Chilvers. “If your blades were, say, 30 feet long, you really only saw 10 feet of the blade that was coming out from the center. So we tested some blades that were the right width and thickness, but they were only 10 feet wide, 20 feet in total.” This meant the ‘fake’ blades could be spun at the desired speed – matching what might normally have looked like a helicopter taking off – without worrying about the extending blade hitting a piece of the set or any cast or crew. “That was the illusion,” says Chilvers. “I don’t know if anyone would ever notice that they were actually a third of the size that they should be.”
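The two-thirds observation is, at heart, tip-speed arithmetic: the farther out along the blade, the farther each point travels during a single exposure, so the outer span smears into invisibility first. A rough sketch (blade length comes from Chilvers’ example; the rotor speed and shutter are our assumptions):

```python
# Why the outer span of a spinning rotor washes out on film. The 30 ft
# blade is from Chilvers' example; rpm and shutter are assumptions.
import math

FPS, SHUTTER = 24.0, 0.5
exposure_s = SHUTTER / FPS                 # ~1/48 s integrated per frame

rpm = 300.0                                # assumed takeoff-ish rotor speed
omega = rpm / 60.0 * 2.0 * math.pi         # angular speed in rad/s
sweep_deg = math.degrees(omega * exposure_s)
print(f"blade sweeps ~{sweep_deg:.0f} degrees per exposure")

for radius_ft in (10, 20, 30):             # stations along a 30 ft blade
    smear_ft = omega * radius_ft * exposure_s
    print(f"r = {radius_ft} ft: point smears ~{smear_ft:.0f} ft per frame")
# Near the hub the blade barely moves relative to its own chord and stays
# readable; farther out it blurs into a translucent disc -- which is why
# 10 ft stubs spun at full speed read the same as 30 ft blades.
```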

TOP: During the ‘earthquake’ sequence, Superman uses his body to fill out a broken train track. Depending on the angle of the shot, a combination of rear projection and matte painting made the final shots possible. (Image copyright © 1978 Warner Bros.) OPPOSITE TOP: The destruction of Krypton was realized via hydraulics rigged underneath the set in order to make it move, coupled with pneumatic tip tanks to topple pieces of polystyrene and polyurethane that had been covered with front-projection materials to give them a ‘Kryptonian’ glow. (Image copyright © 1978 Warner Bros.) OPPOSITE BOTTOM: Lois Lane dangling from a falling helicopter. A full-sized helicopter mock-up was suspended over the side of the building by cranes for this part of the shot, while earlier scenes on the top of the building involved partial-length spinning rotor blades. (Image copyright © 1978 Warner Bros.)

WINTER 2019 VFXVOICE.COM • 89




VFX VAULT

SOMETIMES THE SIMPLEST EFFECT IS BEST

“I think [the Superman footage] was put together to show Warner Bros. where their money was going, but they [also] showed it to us. [During] one of the iconic shots of Christopher taking off in the ice palace, we all cheered and clapped when he flew by like that. It was like a shot in the arm. It was a rejuvenation for the whole crew that, yes, it was working and we had a great movie coming up.” —Colin Chilvers, Creative Supervisor/Director of Special Effects

TOP AND BOTTOM: The remote-controlled cape-waving rig devised to allow Superman’s cape to billow as he flies. (Image credit: http://supermania78.com)

90 • VFXVOICE.COM WINTER 2019

Chilvers oversaw many gags that made complex effects seem simple. At one point, Superman seeks out villain Lex Luthor’s (Gene Hackman) lair, entering the subterranean space by drilling himself through the street. The special effects requirement here was to make it appear as if the character was spinning at an astounding rate. “The sequence was shot on a stage that had been lifted up about 10 or 12 feet,” details Chilvers. “We had an elevator contraption with Chris on it and a rig hidden by the cape, so that when we spun him around, centrifugal force wouldn’t throw him off. There was a floor that would disintegrate, and we’d be waiting below.” Another simple approach to a complex problem was taken for a shot of Superman pushing a boulder that starts an avalanche after the Hoover Dam has burst, in an effort to stop the flow of water. The entire sequence made use of practical live action and of miniature effects (supervised by Model Effects Director and Creator Derek Meddings), with the boulder push being a moment actually directed by Chilvers. “That was an interesting practical effect because, in fact, it was a foreground miniature, and Christopher was nowhere near that rock when it goes over and tumbles. We just lined up the camera so that it would look like he was pushing it and then tumbled it down. That was a big part of how effects were done in those days.”

A HEAVY EFFECTS WORKLOAD

There were countless other special effects that Chilvers and his department worked on for Superman (and for Superman II, which was largely filmed at the same time). Some of the supervisor’s favorite effects shots ended up being cut from the final product, including a tornado scene. “The tornado we made was about three inches in diameter,” says Chilvers. “We used a round tank with dry ice and some fans blowing from above and fans pushing air around, and then all of a sudden a tornado would ‘zip’ up. We actually shot a lot of footage of it at 120 frames a second. It looked like a tornado. It was amazing. That was, to me, a huge achievement.” This kind of ingenuity was prevalent throughout the shoot, although Chilvers also reflects on how grueling it became to film two movies at once. Luckily, the English crew – which had been shooting at Pinewood Studios throughout 1977 – was emboldened when director Richard Donner somehow secured and screened a print of George Lucas’ Star Wars before it had been released in the U.K. Donner also showed the crew a highly motivational edit of Superman footage that had been shot so far. “I think [the Superman footage] was also put together to show Warner Bros. where their money was going,” says Chilvers, “but they [also] showed it to us. [During] one of the iconic shots of Christopher taking off in the ice palace, we all cheered and clapped when he flew by like that. It was like a shot in the arm. It was a rejuvenation for the whole crew that, yes, it was working and we had a great movie coming up. We were all working so hard, but separately we didn’t necessarily see how it was all being joined together. And suddenly we saw it and we knew, wow, this is gonna work.”
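As an aside, the 120 fps figure for the three-inch tornado tracks the standard miniature rule of thumb: overcrank by roughly the square root of the scale factor so that gravity-driven motion reads at full size. A quick sketch (the implied scale is our own back-calculation, not a figure from the production):

```python
# Rule-of-thumb miniature overcranking: shoot at sqrt(scale) times the
# playback rate so falling and swirling motion reads at full scale.
# The tornado's implied real-world size is our assumption.
shoot_fps, play_fps = 120.0, 24.0
implied_scale = (shoot_fps / play_fps) ** 2    # fps ratio = sqrt(scale)
print(f"{shoot_fps:.0f} fps played at {play_fps:.0f} fps implies "
      f"~{implied_scale:.0f}:1 scale")

funnel_in = 3.0                                # miniature funnel diameter
apparent_ft = funnel_in * implied_scale / 12.0
print(f"a {funnel_in:.0f}-inch funnel behaves like one ~{apparent_ft:.1f} ft across")
# Framing, lensing and the absence of scale references do the rest of
# the work of selling it as a full-sized twister.
```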


[ VES SECTION SPOTLIGHT: LONDON ]

London Calling: Thriving in a VFX Metropolis
By NAOMI GOLDMAN

TOP: Members and guests of the London Section enjoy exclusive film screenings and lively panel discussions. BOTTOM: London Section members and guests spotlight VFX artistry at their gala celebration.

Much of the Visual Effects Society’s growing international presence is due to its network of Sections, whose members lend their expertise and entrepreneurship to benefit visual effects practitioners in their region while advancing the Society and industry worldwide. Founded in 2007, the London Section is thriving with nearly 250 members. The Section is in an exciting period of expansion, mirroring the explosion of its local visual effects industry. In recent years, an impressive string of Academy Award-winning VFX work on Gravity, Interstellar, Ex Machina, The Jungle Book and Blade Runner 2049 has emanated from U.K. VFX houses. VFX companies including MPC, Cinesite, DNEG, Framestore, Milk and Union FX are at the forefront in delivering best-in-class VFX work, and are finding increasing success in acclaimed film and television projects.

“VFX across London is more vibrant than ever, with a large community based in Soho, elevating the local VFX scene,” says Mark Spevick, Co-Chair of the VES London Section and Head of 3D at Escape Studios and Course Leader at Pearson College. “There’s a dynamic mix of collaboration and competition between companies, which makes the kinship so rich,” adds Spevick. “We’re galvanized by these companies and our VES members, who bring diverse expertise and energy to our community of artists and innovators.”

Much has been said about Wardour Street in Soho as the longtime center of the British film industry, with the cinemas of Leicester Square, BAFTA and the BFI in close proximity. “The pub scene is a vibrant center point of the VFX community, and every week you can find our members and colleagues alongside directors, editors and producers,” says Spevick. “Our Section regularly hosts pub nights as a gathering spot to build camaraderie among our members, and introduce partners and prospective members to our organization. They are among our most popular events.”

The VES Section hosts year-round film screenings and panel discussions, due in large part to its strong partnership with MPC, which provides its screening room. London is also part of the VES’s tight-knit global network, and has begun hosting Europe-wide events with its sister Sections, including a summer pub night with the Germany Section in London, and a large-scale educational event with the Germany and France Sections that took place in September: the second “Megabrain MasterClass.” Simulcast from London, DNEG presented “A Managed Approach to AOV Manipulation” and “A JIT Expression Language for Fast Manipulation of VDB Points and Volumes,” Pixomondo in Germany presented “Extend Nuke’s Interface and Functionality Using PySide,” and Scanbox in France presented “Photogrammetry and PBR Surface Scanning.”

“‘Megabrain’ was an exciting opportunity to join forces with the VES Germany and France Sections to create a knowledge-sharing event designed to teach actual techniques to help VFX artists improve their skill-sets,” says Craig Dibble, Co-Chair of the VES London Section and Lead Render Systems Engineer at MPC. “This was a great collaboration that epitomizes the value of the VES as a resource for our members.

“In terms of our membership, I think the quality of the work the U.K. produces benefits from the multi-cultural influence of our local talent,” adds Dibble. “A lot of our members are non-U.K. nationals from around the world, further bolstering the VES Section as a VFX home base for networking and professional development.”

“With the resurgence of virtual and augmented reality,” continues Dibble, “we are aiming to expand our membership in these areas, as well as in gaming. We are also looking at diversifying our future programming, which we hope will include more technical and educational events around the craft and business of VFX.” Concludes Dibble, “The future looks bright, and we are excited to help build the VES’s global community and provide a valuable experience for our peers throughout London.”
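For readers wondering what “extending Nuke’s interface with PySide” looks like in practice, here is a minimal, hypothetical sketch in the spirit of the Pixomondo session. The panel, its labels and its registration ID are all invented, and it assumes a Nuke release that bundles PySide2:

```python
# Hypothetical example of a custom PySide2 panel registered inside Nuke
# (run from Nuke's Script Editor or an init/menu.py). The panel name,
# ID and contents are invented for illustration.
from PySide2 import QtWidgets
import nukescripts

class ShotNotesPanel(QtWidgets.QWidget):
    """A tiny dockable panel: a label plus a free-text notes field."""
    def __init__(self, parent=None):
        super(ShotNotesPanel, self).__init__(parent)
        layout = QtWidgets.QVBoxLayout(self)
        layout.addWidget(QtWidgets.QLabel("Shot notes:"))
        layout.addWidget(QtWidgets.QPlainTextEdit())

# registerWidgetAsPanel takes the widget's class name as a string and
# makes it dockable alongside Nuke's built-in panels.
nukescripts.panels.registerWidgetAsPanel(
    "ShotNotesPanel", "Shot Notes", "com.example.ShotNotesPanel")
```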

WINTER 2019

VFXVOICE.COM • 91


[ VES NEWS ]

Society Celebrates Distinguished 2018 VES Fellows
By NAOMI GOLDMAN

Every fall, the VES carries out its tradition of honoring distinguished visual effects practitioners. In 2018, the VES Board of Directors was privileged to bestow the VES Fellows mark of distinction upon five exemplary professionals. This year’s venerated VES Fellows, conferred with the post-nominal letters “VES,” are Craig Barron, VES; Joyce Cox, VES; Dan Curry, VES; Paul Debevec, VES; and Mike Fink, VES. They were recognized in October at a special reception in Beverly Hills.

The Fellows distinction signifies that the individual has maintained an outstanding reputation and has made exceptional achievements and sustained contributions to the art, science or business of visual effects, as well as enabling members’ careers, promoting community worldwide and providing sustained service to the VES, in ways that have significantly advanced the Society, its membership and its mission for not less than 10 years within the last 20 years.

“Our VES Fellows represent a group of exceptional artists, innovators and professionals who have made significant contributions to the field of visual effects,” says Mike Chambers, VES Board Chair. “We are proud to recognize those who helped shape our shared legacy and continue to inspire future generations of VFX practitioners.”

Craig Barron, VES. Barron has been an innovator in the art of creating cinematic illusions for nearly four decades and has contributed to the visual effects of more than 100 films. He began his career at Industrial Light & Magic, where he worked on classic films including The Empire Strikes Back, Raiders of the Lost Ark and E.T. The Extra-Terrestrial. Barron’s visual effects company, Matte World Digital, conjured environments for such films as Titanic, Zodiac, Alice in Wonderland and Hugo. Barron received an Oscar nomination for his work on Batman Returns and an Oscar for Best Visual Effects for The Curious Case of Benjamin Button. He is the Co-chair of the Science and Technology Council of the Academy of Motion Picture Arts and Sciences, and an Academy Governor representing the Visual Effects branch. He is the Creative Director at Magnopus, a Los Angeles-based new media company.

Joyce Cox, VES. In the mid-1990s, Cox transitioned from producing commercials into the role of VFX producer for feature films. Since then, she has produced thousands of visual effects shots. Her credits include Harry Potter and the Sorcerer’s Stone, X-Men 2, The Dark Knight, Avatar, Men in Black III, and most recently Jon Favreau’s The Jungle Book. Cox has been the recipient of VES Awards for her work on Avatar and The Jungle Book. In addition, she teaches “Producing VFX” at the USC School of Cinematic Arts and is working with UST Global Media Services to create Curó, a web-based application for organizing and simplifying the complex process of managing the finances of VFX.

“Our VES Fellows represent a group of exceptional artists, innovators and professionals who have made significant contributions to the field of visual effects. We are proud to recognize those who helped shape our shared legacy and continue to inspire future generations of VFX practitioners.” —Mike Chambers, VES Board Chair

92 • VFXVOICE.COM WINTER 2019

Dan Curry, VES. A veteran of over 100 feature films and television productions, Curry’s career spans over three decades working with some of the industry’s most influential and respected filmmakers. Among other awards, his work has been recognized with seven Emmys (15 nominations) and a VES Award for Best Broadcast Visual Effects. Curry is a past VES Board member and a past Visual Effects Peer Group Governor of the Academy of Television Arts & Sciences. He is also a member of the DGA, ASC and PGA. Curry is a former Peace Corps Volunteer in Community Development, where he designed and supervised the construction of small dams and bridges in rural Thailand. He subsequently directed a Thai language television series, taught architectural drafting at Khon Kaen University, and did freelance art, architecture and production design for clients ranging from the United States Information Service to the late King Bhumibol Adulyadej of Thailand. Returning to the U.S., he taught Fine Arts, Film and Theatre at the university level prior to entering the entertainment industry.

Paul Debevec, VES. Debevec is a Senior Scientist at Google VR and Adjunct Research Professor of Computer Science in the Viterbi School of Engineering at USC, working with the Vision and Graphics Laboratory at the USC Institute for Creative Technologies. His Ph.D. thesis presented Façade, an image-based modeling and rendering system for creating photoreal architectural models from photographs. He led the creation of virtual cinematography for his 1997 film, The Campanile Movie, whose techniques were used to create virtual backgrounds in The Matrix. Subsequently, he pioneered high-dynamic-range image-based lighting techniques, and led the design of HDR Shop, the first high-dynamic-range image editing program. At USC ICT, Debevec has led the development of a series of Light Stage devices used to create photoreal digital actors in a number of films, including Spider-Man 2, Superman Returns, The Curious Case of Benjamin Button and Avatar. Debevec received ACM SIGGRAPH’s first Significant New Researcher Award. In addition to serving as a longtime VES member, he is Vice President of ACM SIGGRAPH, and a member of the Academy of Motion Picture Arts and Sciences and the Academy’s Science and Technology Council.

Michael Fink, VES. Fink began working in film with The China Syndrome. He was hooked, and worked on films such as Star Trek: The Motion Picture and Blade Runner before becoming a visual effects supervisor on WarGames (BAFTA nomination). He has since worked on Buckaroo Banzai, Batman Returns (Academy Award nomination, BAFTA nomination), Braveheart, Mars Attacks!, X-Men, X-Men 2, The Golden Compass (VES nomination, Academy Award, BAFTA Award), Avatar, Tron: Legacy, The Tree of Life and Life of Pi. Fink directed the first Coca-Cola Polar Bear spot in 1993, which was one of the earliest widely-seen examples of 3D fur on a CG creature. Fink is a founding member of the Visual Effects Society and a former VES Board Member. He is a member of the Executive Committee of the Visual Effects Branch of the Academy of Motion Picture Arts and Sciences. He is currently a Professor at the School of Cinematic Arts at the University of Southern California, and Chair of the Division of Film and Television Production. He holds the George Méliès Endowed Chair in Visual Effects at the USC School of Cinematic Arts.

The reception also honored VFX archivist and longtime Board member Gene Kozicki, who was named recipient of the 2018 VES Founders Award. The Society designated venerated visual effects innovator Jonathan Erland, VES, with a Lifetime VES Membership, and Tippett Studio CEO Jules Roman with an Honorary VES Membership. The 2018 VES Hall of Fame honorees included L.B. Abbott, Richard “Doc” Baily, Saul Bass, Ray Harryhausen, Derek Meddings, Eileen Moran and Gene Roddenberry.

WINTER 2019

VFXVOICE.COM • 93




[ VFX CAREERS ]

A Globe-Trotter’s Guide to the Wide World of VFX

In this edition of VFX Careers, Australian-born, Singapore-based Dayne Cowan, VP of Film at VHQ Media – responsible for managing their feature film divisions in Singapore, Kuala Lumpur and Beijing – recounts his global career journey in visual effects for VFX Voice.

VFX Voice: What turned you to VFX?

Cowan: Right as I was on the brink of changing my degree [from Bachelor of Computer Science at the University of Technology in Sydney, Australia] to something else, I went to a guest lecture by a commercial director for the computer graphics elective option. He had just created a fully CG music video, and I was hooked. The software he had used was available in the computer graphics elective, so I signed up for it immediately! I spoke to the Dean of the Computer Science School and begged her to let me do a sub-major in Design, knowing I would need to combine both visual and technical skills. She was wonderfully supportive and agreed to arrange it. I was the first graduate with a sub-major in Design from my school.

VFX Voice: What led to your first job?

Cowan: Once I had decided to focus on computer graphics, I needed to figure out a way to break into the very small Australian industry. I was fortunate enough to hear a presentation by a former student, Harry Magiros, who had started his own company selling computer graphics hardware and software (Prisms/Silicon Graphics). He said during the lecture that he was looking to employ an assistant, and I leapt at the opportunity, practically sprinting up to the stage to speak to him after it was over.

VFX Voice: What type of VFX had the biggest impact on you?

Cowan: I have a huge love for what we would term “invisible VFX,” which seems like a complete paradox. Almost all films have some VFX, and most of it is not meant to be seen at all. To me, that is the ultimate illusion – being able to trick an audience so thoroughly that they don’t even realize the image has been completely altered. Some of my best work has been in this area.

VFX Voice: Did you have a mentor?

Cowan: Harry was an excellent mentor, especially on the technical side of things. He was an artist before he set up his own company, so he knew what he was talking about.

VFX Voice: What did you learn the first few years on the job?

Cowan: I learned fairly quickly that I wanted to go client-side and become an artist, rather than a support guy. I went to work for some other post-production companies in Australia like Acme Digital and Foxtel, and learned about being an artist and how to work with clients along the way. Many of the people I worked with then now work for great companies like Weta and ILM, and they remain good friends. I also learned that if I really wanted to do film VFX work, I would have to leave Australia. Most of my colleagues left to work in L.A., San Francisco or London. I chose London.

“I see a huge growth in the Asian market. That is why I chose to come here. It is expanding and developing at an unprecedented rate. Although most of the films in this region are still aimed at a local audience, there is an increasing push towards finding films and TV series that can be made locally and sold globally.” —Dayne Cowan, VP of Film, VHQ Media

VFX Voice: How did you stay current for your journey through VFX?

94 • VFXVOICE.COM WINTER 2019

Cowan: We used to visit SIGGRAPH every year in the ‘90s. It was (and still is) the place to go for the latest and greatest developments in our field. Every year seemed to herald some amazing new development in the industry. I knew from the beginning that this constantly evolving VFX industry was only going to accelerate, and I’d have to work very hard to stay abreast of all new developments. Information and reference are so plentiful now, so I think that VFX artists have no excuse these days!

VFX Voice: How did you navigate from one job to the next?

Cowan: I arrived in the U.K. in the late ‘90s with nothing but a “working holiday” visa and no concrete job. I spent four months knocking on the door of every post-production company in town, and tried meeting people in the local pubs of Soho, to see if anyone could put in a word for me. I was freelancing, and it took a lot of time and patience to get established. One small job done well usually led to another, then another. It was all word of mouth. You had to deliver a good result, or that hard-won status could be lost. I used to suffer “delivery anxiety” for the first few weeks at every new company! Eventually I got established enough that I was offered some full-time employment. Initially, I did a lot of work for companies like Framestore (CFC), the BBC, Jim Henson’s Creature Shop, ITV, and eventually Double Negative, which was in its earliest days. I started full-time with DNEG when we had about 50 artists, and stayed with them most of my time in the U.K. I loved the team there (still do), and gradually worked my way up from a junior FX artist, to lighting, to team supervisor, to CG Supervisor, eventually becoming one of the Heads of the CG department.

VFX Voice: What was your biggest career challenge?

Cowan: Some of my biggest challenges came from having to adapt all of my years of working on Hollywood films to the Asian market. The technology and approach to the work are similar, but you really need to adapt to a very different culture and mindset. It’s not better or worse, just different. It began with a short period when I worked in India, before coming to Singapore. Bollywood is a pretty chaotic industry. China is a much more structured and organized environment. There is a level of fluidity that took some time to adapt to, but I love the challenge. It keeps me inspired.

VFX Voice: Where do you see VFX headed?

Cowan: I see a huge growth in the Asian market. That is why I chose to come here. It is expanding and developing at an unprecedented rate. Although most of the films in this region are still aimed at a local audience, there is an increasing push towards finding films and TV series that can be made locally and sold globally, particularly with online streaming services such as Netflix, iFlix, HOOQ, and so on. The way that media is being consumed by audiences is changing. VFX is changing with it, and we are always under pressure to deliver higher quality, faster and cheaper. On top of this, we are usually expected to be able to change the creative direction at the drop of a hat, so I think the process of generating high-volume VFX needs to become more flexible and adaptable.

VFX Voice: What advice would you give to career professionals who might be looking to advance their VFX careers?

Cowan: If you are looking to break into the industry, be persistent. Not annoyingly persistent, but stick at it. If you want it badly enough, you will get there. Do a short show reel of your absolute BEST work only, and put that right up at the start of your reel. I’ve reviewed thousands of reels over the years. The best always have short, superb, quality work right at the beginning. VFX professionals do not have time to sit through 4-5 minute-long show reels, and you have to grab their attention very, very quickly. For those that are already working in VFX and want to advance, my suggestion is to tell someone above you where you want to go. Don’t sit back and wait for a promotion, or wait for someone to make you a supervisor. We are not mind readers, and although we may be considering you for a higher or better role, if you come forward and show your enthusiasm for that position, this makes a huge impression. A great drive and attitude will set you apart. Show that you are keen, forward thinking, able to solve problems and take on responsibility. Think laterally: “If this were my own company, what would I be doing to take it forward?” Don’t just be a worker, be a pro-active leader. And then come and see me!

Dayne Cowan has over 24 years of experience in the film and television post-production industry, and has worked for many of the largest companies in the U.K., Australia and Asia. He has worked closely with most of the major Hollywood, Chinese and Indian film production studios. His extensive film credits include Batman Begins, Harry Potter and the Order of the Phoenix, Harry Potter and the Half-Blood Prince, and the Academy Award-winning film The Reader. Since moving to Asia, his film credits include Forever Young, The Legend of the Naga Pearls, Once Upon a Time, and a creative collaboration on Wolf Totem with French director Jean-Jacques Annaud. He is a current Board Member of the Visual Effects Society, former Chair of the VES London Section, and has been involved for many years with various VES committees.

WINTER 2019

VFXVOICE.COM • 95




[ FINAL FRAME ]

Mary Poppins’ Oscar-Winning Effects – 55 Years Ago

The visual effects for Robert Stevenson’s 1964 Mary Poppins came together via a unique collaboration of on-set physical effects, matte painting, process photography, in-camera effects, traveling mattes and optical printing. This enabled the lead character to fly, and also to interact at various times with 2D-animated characters. Several scenes in the film made use of the sodium vapor process, in which actors were filmed in front of a white screen lit with sodium vapor lights. The technique was invented by Petro Vlahos, who, with Ub Iwerks and Wadsworth E. Pohl, was awarded an Oscar statuette in 1965 for the conception and perfection of techniques for Color Traveling Matte Composite Cinematography. Mary Poppins would go on to be recognized with the 1964 Academy Award for Special Visual Effects, awarded to Peter Ellenshaw, Eustace Lycett and Hamilton Luske. Photo courtesy of Walt Disney

96 • VFXVOICE.COM WINTER 2019



