VFX Voice - Summer 2019 Issue


MOCAP SUITS • ALADDIN ALIVE • THE UMBRELLA ACADEMY VFX • BLACK MIRROR: DARK REFLECTIONS • THE DC TV UNIVERSE • THE ORVILLE: SPACE WARS • PROFILES: CHRIS MELEDANDRI, JONATHAN NOLAN, D.B. WEISS & DAVID BENIOFF AND ROGER CORMAN

GAME OF THRONES RULES

VFXVOICE.COM SUMMER 2019




[ EXECUTIVE NOTE ]

Welcome to the Summer issue of VFX Voice! In this issue, we take a closer look at a few major luminaries and our esteemed honorees recognized earlier this year at the 17th Annual VES Awards – Lifetime Achievement Award recipient Chris Meledandri, Visionary Award recipient Jonathan Nolan and Creative Excellence Award recipients David Benioff and D.B. Weiss. We also sit down with legendary producer and director Roger Corman, ‘The Pope of Pop Culture.’

In the exploding field of television visual effects, we go inside Black Mirror, The Umbrella Academy, The Orville, and juggernaut Game of Thrones as the transformational series comes to an end. We take a look at Aladdin and the world of Immersive Cinema, and preview Cool Summer VFX Films and TV. We herald the much-anticipated opening of Star Wars: Galaxy’s Edge lands, and shine a light on cutting-edge trends, including mocap suits and advances in VR. And continuing our review of VES worldwide, we spotlight the VES Washington Section, one of our beacons of VFX (along with Vancouver) in the Pacific Northwest.

We’re continuing to bring you exclusive stories between issues that are only available online at VFXVoice.com. You can also get VFX Voice and VES updates by following us on Twitter at @VFXSociety. Thank you for your continued support of VFX Voice, the definitive authority on all things VFX.

Mike Chambers, Chair, VES Board of Directors

Eric Roth, VES Executive Director

2 • VFXVOICE.COM SUMMER 2019



[ CONTENTS ]

FEATURES
8  VFX TRENDS: MOCAP SUITS – Exploring the many options in suits and head-cams.
14 FILM: IMMERSIVE CINEMA – Cutting-edge projection and sound boost big-screen theaters.
18 TV: ONE-TAKES – How visual effects help make TV long shots possible.
24 TV: THE UMBRELLA ACADEMY – Netflix series marries unique superheroes and super VFX.
30 FILM: ALADDIN – Disney’s 2D animated classic gets the live-action treatment.
34 FILM AND TV: COOL SUMMER VFX – A breakdown of Summer’s top VFX films and TV.
38 PROFILE: CHRIS MELEDANDRI – Recipient of the 2019 VES Lifetime Achievement Award.
42 PROFILE: JONATHAN NOLAN – Recipient of the 2019 VES Visionary Award.
48 COVER: GAME OF THRONES – Behind the milestone HBO series’ stunning effects.
56 PROFILE: D.B. WEISS AND DAVID BENIOFF – Recipients of the 2019 VES Creative Excellence Award.
58 LEGEND: ROGER CORMAN – Saluting the storied career of the trailblazing producer/director.
64 TV: DC TV – Inside the VFX factory powering the DC Television Universe.
70 TV: BLACK MIRROR – Visual effects drive the dark tales in the Netflix sci-fi series.
76 MUSIC VIDEOS: STATE OF PLAY – Studios and artists answer the demand for captivating VFX.
82 THEME PARKS: STAR WARS: GALAXY’S EDGE – Epic effects plunge park-goers into deep Star Wars experiences.
86 TV: THE ORVILLE – Effects fuel marathon space battle for the Fox sci-fi comedy drama.
90 VR/AR/MR TRENDS: VIRTUAL ADVANCES – Evolving products and experiences broaden mainstream usage.

DEPARTMENTS
2  EXECUTIVE NOTE
94 VES NEWS


96 FINAL FRAME: RAY HARRYHAUSEN

ON THE COVER: The Night King from Game of Thrones. (Image courtesy of HBO)



SUMMER 2019 • VOL. 3, NO. 3

MULTIPLE WINNER OF THE 2018 FOLIO: EDDIE & OZZIE AWARDS

VFXVOICE

Visit us online at vfxvoice.com

PUBLISHER: Jim McCullaugh, publisher@vfxvoice.com
EDITOR: Ed Ochs, editor@vfxvoice.com
CREATIVE: Alpanian Design Group, alan@alpanian.com
ADVERTISING: advertising@vfxvoice.com; Maria Lopez, mmlopezmarketing@earthlink.net
MEDIA: media@vfxvoice.com
CIRCULATION: circulation@vfxvoice.com

CONTRIBUTING WRITERS: Ian Failes, Naomi Goldman, Trevor Hogg, Kevin H. Martin, Chris McGowan

ADVISORY COMMITTEE: Rob Bredow; Mike Chambers; Neil Corbould; Debbie Denise; Paul Franklin; David Johnson; Jim Morris, VES; Dennis Muren, ASC, VES; Sam Nicholson, ASC; Eric Roth

VISUAL EFFECTS SOCIETY
Eric Roth, Executive Director

VES BOARD OF DIRECTORS
OFFICERS: Mike Chambers, Chair; Jeffrey A. Okun, VES, 1st Vice Chair; Lisa Cooke, 2nd Vice Chair; Brooke Lyndon-Stanford, Treasurer; Rita Cahill, Secretary
DIRECTORS: Brooke Breton; Kathryn Brillhart; Colin Campbell; Bob Coleman; Dayne Cowan; Kim Davidson; Rose Duignan; Richard Edlund, VES; Bryan Grill; Dennis Hoffman; Pam Hogarth; Jeff Kleiser; Suresh Kondareddy; Kim Lavery, VES; Tim McGovern; Emma Clifton Perry; Scott Ross; Jim Rygiel; Tim Sassoon; Lisa Sepp-Wilson; Katie Stetson; David Tanaka; Richard Winn Taylor II, VES; Cat Thelia; Joe Weidenbach
ALTERNATES: Andrew Bly, Gavin Graham, Charlie Iturriaga, Andres Martinez, Dan Schrecker

Tom Atkin, Founder
Allen Battino, VES Logo Design

Visual Effects Society
5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411
Phone: (818) 981-7861
visualeffectssociety.com

VES STAFF: Nancy Ward, Program & Development Director; Chris McKittrick, Director of Operations; Ben Schneider, Director of Membership Services; Jeff Casper, Manager of Media & Graphics; Colleen Kelly, Office Manager; Callie C. Miller, Global Coordinator; Jennifer Cabrera, Administrative Assistant; P.J. Schumacher, Controller; Naomi Goldman, Public Relations

Follow us on social media.

VFX Voice is published quarterly by the Visual Effects Society.
Subscriptions: U.S. $50; Canada/Mexico $60; all other countries $70 a year. See vfxvoice.com
Advertising: Rate card upon request from publisher@vfxvoice.com or advertising@vfxvoice.com
Comments: Write us at comments@vfxvoice.com
Postmaster: Send address change to VES, 5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411.
Copyright © 2019 The Visual Effects Society. Printed in the U.S.A.




VFX TRENDS

WHAT MOCAP SUIT SUITS YOU? By IAN FAILES

If you’re looking to bring to life a CG human, character or creature with the aid of any kind of performance capture, there’s now a bevy of options at your disposal. Among those are many different motion-capture suits – ranging from optical to inertial systems, as well as ‘faux mocap’ tracking suits – and facial-capture set-ups. VFX Voice asked several mocap vendors and visual effects studios about their various suit and head-cam offerings.

OPTICAL SYSTEMS

In optical motion capture, infrared cameras pick up the light reflected back from retro-reflective markers on the suits (in a ‘passive’ system). ‘Active’ systems allow cameras to sync up to strobing markers on the suits. Each camera sees the marker from a 2D perspective, and when all of the 2D views are reconstructed together, a 3D marker position in space can be calculated.

Vicon motion-capture suits include a hat, gloves, overshoes and markers that can be glued or taped onto the suits – there are facial markers, too. “The suit has been designed to be comfy to wear, capable of dealing with stunt work and covered in Velcro so we can attach markers to it,” outlines Vicon VFX product manager Tim Doubleday. “The reflective markers are essentially a molded base covered in reflective scotch tape. They can vary in sizes and are small enough to go on fingers.”

Vicon’s facial-capture solution is called Cara, a helmet that consists of four cameras synchronized together to enable high-resolution capture using tracking points on the face. “The Cara system can be synced to the Vicon body system to enable seamless integration of face and body capture across multiple actors,” notes Doubleday. “Recently, we also released Cara Lite, a modular version of the Cara system, which can work in either single or stereo mode and supports third-party helmets and software from the likes of Faceware or Dynamixyz.”

Meanwhile, OptiTrack also offers a full range of motion-capture and tracking solutions, including a new suit. “The new suits were designed to do what the older suits do – only better,” says OptiTrack chief strategy officer Brian Nilles. “They are now antimicrobial and more breathable than before, offering a better fit and more flexibility, which provides performers with exceptional freedom of movement and comfort over long recording sessions. Our popular X-base markers adhere much better to the new suits, making them nearly impossible to knock off during performance capture, and it allows the performers and the mocap technicians to focus on the performance rather than the tech.”

Also providing optical systems is Fox VFX Lab, which was formerly Technoprops. They’ve made significant innovations in virtual cameras and simul-cams, something Technoprops founder Glenn Derry (now team leader, Fox VFX Lab and VP, Visual Effects, at Fox Feature Films) helped pioneer on Avatar. The Fox VFX Lab suits, gloves and marker patches are made by 3 x 3, with the markers themselves from MoCap Solutions. The shoes are Nike with mocap fabric sewn on. “The mocap suits with the gray fabric and colors make it easy to pick out which actor you are looking at when using reference video to make performance selects,” says Derry. “If everyone is wearing solid black, the editor’s job is more difficult. Plus, who doesn’t like festive colors?”

The head rigs from Fox VFX Lab are designed and built in-house. “The original version of our head rigs was designed and manufactured during the production of Avatar, though at the time the camera was singular and standard definition,” describes Derry. “The head rigs have been continuously improved upon feature-wise for the last 13 years. We use them every day on productions ourselves under the most demanding conditions in the business.”

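The multi-camera reconstruction behind optical capture can be sketched in a few lines. Real systems solve a calibrated bundle across dozens of cameras; purely as an illustration of the core idea, the snippet below estimates a marker's 3D position as the midpoint of the shortest segment between two camera rays (the function names and the two-ray simplification are ours, not any vendor's solver):

```python
def triangulate(o1, d1, o2, d2):
    """Estimate a 3D marker position as the midpoint of the shortest
    segment between two camera rays, each given as origin o + t * direction d."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    w0 = [x - y for x, y in zip(o1, o2)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b              # approaches 0 when the rays are parallel
    t = (b * e - c * d) / denom        # parameter of closest point on ray 1
    s = (a * e - b * d) / denom        # parameter of closest point on ray 2
    p1 = [o + t * u for o, u in zip(o1, d1)]
    p2 = [o + s * u for o, u in zip(o2, d2)]
    return [(x + y) / 2 for x, y in zip(p1, p2)]
```

With more cameras, a least-squares version of the same idea averages out marker-center noise; that is the sense in which "all of the 2D files are reconstructed together."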

INERTIAL SYSTEMS

Inertial motion-capture systems use accelerometers, gyroscopes and magnetometers, all within a contained cable system that tends to zip into some kind of lycra suit. No calibrated volume is required.

Xsens offers suit, sensor/tracker and software solutions. MVN Link and MVN Awinda are Xsens’ main offerings, with the suits using 17 sensors embedded in or wirelessly strapped to the body of the performer. “The MVN Link is a full-body, camera-less mocap suit that is connected to a wireless data link and used for high-dynamic movements, like fighting scenes and fast maneuvers,” explains Xsens product manager Hein Beute. “MVN Awinda is the fully wireless version with wireless sensors built into it. It has greater data collection capabilities and can also be used to capture multiple subjects at once.

“The defining characteristic that sets our solution apart from others is the magnetic immunity,” says Beute. “It is the only system that provides magnetic immunity to this level, because of the sophisticated sensor fusion algorithms which took us many years to develop. We can also do multi-level motion capture, which is hard for most other inertial motion-capture systems.”

Perception Neuron also offers a full-body wireless motion-capture system using inertial measurement unit technology. Their Neuron suit comes with a network of straps that house the inertial sensors, known as ‘neurons.’ Full hand and finger tracking is part of the set-up for the 2.0 option. “The system includes the Axis Neuron software, which allows for up to five actors at a time to be recorded

OPPOSITE TOP LEFT: Performers wearing Vicon’s motion-capture suits act out a scene in a capture volume.
OPPOSITE TOP RIGHT: Motion-capture performers stage a fight scene in Vicon’s motion-capture suits and Cara helmets.
OPPOSITE BOTTOM: A performer stands in a T-pose to calibrate the Vicon gear.
TOP: OptiTrack’s newly released motion-capture suit.
BOTTOM: Close-up on OptiTrack’s suit fabric and optical markers.






Mocap Websites

TOP LEFT: Two performers captured together in Fox VFX Lab’s capture volume. TOP RIGHT: Fox VFX Lab’s motion-capture suit and head rig. BOTTOM: Xsens’ MVN Link (right) and MVN Awinda (left) suits.

Weta Digital and ILM Suit Themselves

Dynamixyz www.dynamixyz.com

Perception Neuron www.neuronmocap.com

Faceware www.facewaretech.com

Rokoko www.rokoko.com

Fox VFX Lab www.foxvfxlab.com

Vicon www.vicon.com

Optitrack www.optitrack.com

Xsens www.xsens.com

or streamed live, and all of the hardware including sensors and straps,” says Perception Neuron’s chief motion-capture technologist, Daniel Cuadra. “Perception Neuron offers accessibility to all levels of users from professionals to beginners, and is also used by many educational institutions due to its versatility and simple set-up,” states Cuadra. “Perception Neuron is also the only motion-capture solution to include full-finger tracking at no extra cost.”

One of the newer entrants in the inertial space is Rokoko, which makes a wireless motion-capture body suit currently pitched at indie developers and smaller studios that might not have had access to mocap previously. The Smartsuit Pro contains 19 sensors sewn into a sports textile and mesh suit, with straps positioned wherever there are sensors. Via the gyro, accelerometer and compass tech in the sensors, the system resolves the capture data onto a body model on a suit-based hub, which transmits over Wi-Fi to a computer or smart device. “We wanted it to be this personal mocap system,” notes Rokoko founder and CEO Jakob Balslev, “one that you could almost write your own name on, and have there to use anytime. Above all, we wanted something that was intuitive to everybody, something that one person would be able to set up, one person could operate and get started.”
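The gyro/accelerometer fusion these inertial suits rely on can be illustrated with the textbook complementary filter: integrate the gyroscope for smooth short-term motion, and gently pull toward the accelerometer's gravity reference to cancel drift. This is a minimal single-axis sketch of the general principle only; the proprietary fusion in Xsens, Neuron or Rokoko hardware is far more sophisticated (and adds magnetometer handling):

```python
import math

def complementary_filter(gyro_rates, accels, dt, alpha=0.98):
    """Estimate pitch (radians) over time by fusing two imperfect sensors.
    gyro_rates: pitch angular-velocity samples (rad/s); smooth but drifts
    accels: (ax, az) samples; the gravity direction gives an absolute,
            drift-free but noisy pitch reference
    alpha: how strongly to trust the integrated gyro vs. the accelerometer
    """
    pitch = 0.0
    for rate, (ax, az) in zip(gyro_rates, accels):
        accel_pitch = math.atan2(ax, az)      # absolute reference from gravity
        gyro_pitch = pitch + rate * dt        # dead-reckoned update
        pitch = alpha * gyro_pitch + (1 - alpha) * accel_pitch
    return pitch
```

Because alpha is close to 1, fast motion comes almost entirely from the gyro, while the small accelerometer term slowly bleeds out accumulated drift.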

Weta Digital, which has a long history of taking motion-captured actors through to final CG characters – from Gollum in the Lord of the Rings trilogy, to the primates of the Planet of the Apes films, to the characters of Alita: Battle Angel – has now crafted its own custom motion-capture suit. “On Alita, in particular,” outlines Weta Digital Motion Capture Supervisor Dejan Momcilovic, “we built a custom suit with embedded marker strands that did not allow any marker placement changes, but proved to be very practical, as it was much faster for the performer to get suited up. Head, hands and feet were detachable, so actress Rosa Salazar, as Alita, could have breaks with minimal fuss, which was great given the summer conditions in Texas. The head-cam used was the first stereo cam rig for us, and has since been adopted as our standard rig.”

Momcilovic adds that the suits also fit in with an established workflow at the studio for capturing data. Having their own solution gives Weta Digital more consistency and speed of set-up, he says. “Basically, we prepare the suit, hand it off to costumes, and the performers then arrive dressed and ready to go. We only need to attach the body pack and the battery. With our custom-built helmets, we are basically removing the need for additional padding, minimizing the weight and providing better performer comfort.”

ILM, too, has its own custom suit, one that is aimed at providing tracking data of performer movement with low impact. Dubbed IMocap, the suit was first used for the pirate characters in Pirates of the Caribbean: Dead Man’s Chest. “The goal was to get close to optical motion-capture quality, but with a small footprint on set, and able to handle difficult location shooting conditions,” remarks ILM Chief Creative Officer John Knoll.

The suits once had banded tracking dots on them. Using two high-definition video cameras, the position of particular joints could be triangulated with software to provide a fast matchmove of the actor. Since then, the suit has evolved and found use in many productions, including Solo: A Star Wars Story, A Quiet Place, Black Panther, Avengers: Infinity War and Aquaman.

IMocap’s markers and dots on the suit have evolved over time, as has the use of custom ILM tools for match-moving and crafting digital characters. “Our most recent version of the IMocap suit, developed for Rogue One: A Star Wars Story, replaces what were colored dots with unique symbols,” describes Senior ILM R&D Engineer Kevin Wooley. “The aim of the symbols is to once again provide more visual distinction and to make it easier to track patterns through blur by allowing the tracker to see different colors in different channels.”
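Tracking a printed pattern from frame to frame, as IMocap-style suits require, is commonly built on template matching, and normalized cross-correlation is the standard building block. The sketch below is a generic illustration with toy data, not ILM's proprietary tracker:

```python
import math

def ncc(patch, tmpl):
    """Normalized cross-correlation between two equal-sized 2D patches."""
    n = len(tmpl) * len(tmpl[0])
    mp = sum(map(sum, patch)) / n
    mt = sum(map(sum, tmpl)) / n
    num = var_p = var_t = 0.0
    for r in range(len(tmpl)):
        for c in range(len(tmpl[0])):
            dp, dt = patch[r][c] - mp, tmpl[r][c] - mt
            num += dp * dt
            var_p += dp * dp
            var_t += dt * dt
    if var_p == 0 or var_t == 0:       # flat patch: no pattern to correlate
        return 0.0
    return num / math.sqrt(var_p * var_t)

def find_template(image, tmpl):
    """Return (row, col) of the best-matching window in a 2D image."""
    th, tw = len(tmpl), len(tmpl[0])
    best_score, best_rc = -2.0, (0, 0)
    for r in range(len(image) - th + 1):
        for c in range(len(image[0]) - tw + 1):
            window = [row[c:c + tw] for row in image[r:r + th]]
            score = ncc(window, tmpl)
            if score > best_score:
                best_score, best_rc = score, (r, c)
    return best_rc
```

Because the score is normalized, it tolerates brightness changes; distinct symbols per suit region (as in the Rogue One version) keep neighboring matches from being confused with one another.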

FACIAL CAPTURE

Two of the main players focusing on facial-capture hardware (and which also have software solutions) are Faceware and Dynamixyz.

Head-mounted camera systems from Faceware are designed to be used on set, in voice-over or ADR booths and on motion-capture stages, without the need for markers. Faceware also provides a supported software pipeline for getting the data into and out of content creation tools. Faceware’s main offering is the ProHD Headcam, a fiberglass helmet available in three sizes and fitted with different anodized aluminum boom arms so the user can cover any number of capture situations. The camera is a micro Full HD camera – which also includes an onboard mini light box – and is powered by a separate capture belt for wireless transmission. “The real-time full


TOP LEFT: Motion-capture suits with active markers as used on War for the Planet of the Apes. They were intended to be filmed in the harshest of conditions.
TOP RIGHT: Rosa Salazar in Weta Digital’s new motion-capture suit during the filming of Alita: Battle Angel.
BOTTOM LEFT: Bill Nighy’s Davy Jones character from Pirates of the Caribbean: Dead Man’s Chest in an early version of ILM’s IMocap suit. (Image © 2017 Walt Disney Pictures)
BOTTOM RIGHT: Actor Alan Tudyk performs the role of robot K-2SO in an IMocap suit for Rogue One: A Star Wars Story. (Image © 2015 Lucasfilm Ltd.)






resolution and full frame-rate transmission allow for the highest-quality real-time tracking for live character facial animation, from real-time events to previs,” says Faceware Vice President of Business Development Peter Busch. “The signal can also be recorded for further enhanced tracking and animation.”

Dynamixyz’s facial-capture pipeline involves markerless capture, with video recording and head-mounted camera (HMC) software part of the system. The company provides a custom HMC designed to capture faces (“mostly human,” says Dynamixyz CEO Gaspard Breton, “although we would really like to test it on a pet face one day for fun”). The HMC can be either single-view, with one camera facing the actor, or multi-view, with two cameras, one on each side, leaving the field of view unobstructed.

“Our main software Performer enables you to analyze data and re-target them on any 3D model and any rig,” states Breton. “Once the system is trained, it can produce massive amounts of frames throughout batch processing, with minimal rework for amazing results. Our system is also compatible with any body motion-capture system.”

TOP LEFT: Rokoko’s suit contains 19 sensors that are ‘zipped’ into linings inside.
TOP MIDDLE: Dynamixyz’s head-mounted camera.
TOP RIGHT: Kite & Lightning’s Cory Strassburger wears an Xsens suit as part of his Bebylon real-time project.
MIDDLE LEFT: Faceware’s PRO Headcam.
BOTTOM LEFT: The Perception Neuron suit is made up of a network of straps with inertial measurement unit sensors.
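Re-targeting solved facial data onto "any 3D model and any rig" typically ends in a linear blendshape step: the tracker produces expression weights, and each weight scales a set of per-vertex deltas on the character mesh. Here is a minimal sketch of that final step, with hypothetical names (generic blendshape math, not Performer's actual solver):

```python
def apply_blendshapes(neutral, shapes, weights):
    """Deform a mesh by adding weighted per-vertex blendshape deltas.
    neutral: list of (x, y, z) vertices for the rest pose
    shapes:  {shape_name: list of (dx, dy, dz) deltas, one per vertex}
    weights: {shape_name: solved weight from the facial tracker, usually 0..1}
    """
    out = [list(v) for v in neutral]
    for name, w in weights.items():
        for i, (dx, dy, dz) in enumerate(shapes[name]):
            out[i][0] += w * dx
            out[i][1] += w * dy
            out[i][2] += w * dz
    return [tuple(v) for v in out]
```

Because the deltas belong to the target character rather than the actor, the same solved weights can drive any rig that exposes a comparable set of expressions, which is what makes this kind of re-targeting model-agnostic.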




FILM

NEXT-GENERATION CINEMA BREAKS OUT By CHRIS McGOWAN

TOP LEFT: The IMAX 15-perf/65mm film camera is the highest-resolution camera in the world, according to IMAX. The camera is capable of capturing footage at 18K resolution, and has been used by IMAX documentary and Hollywood filmmakers, including Christopher Nolan and, more recently, Damien Chazelle for the moon sequence of First Man. (Photo © IMAX)
TOP RIGHT: The IMAX with Laser system offers greater contrast, brightness, sharpness and an expanded color gamut. It also includes an upgraded 12-channel sound system with new side and ceiling speakers for next-level sound immersion. (Photo © IMAX)
BOTTOM: IMAX recently launched its next-generation IMAX with Laser projection system, the largest R&D project in the company’s history. The new optical engine for the system features an open-frame design that eliminates the prism that was standard in projectors for years. (Photo © IMAX)

14 • VFXVOICE.COM SUMMER 2019

Nowadays, affordable big-screen 4K TVs can display a multitude of films and series in high resolution and with superb sound via streaming, cable and satellite TV (with Blu-ray discs filling in catalog gaps), yet the 8K writing is on the wall (Samsung released an 85-inch 8K television at the end of 2018). With a high-resolution smart TV at home and a seemingly endless supply of programming, why fight traffic? The answer may lie in “Immersive Cinema,” which is on the rise. Laser projection, improvements in sound technology, large-format screens and cameras, HFR (High Frame Rates), 4D moving seats and multi-sensory stimuli are making movies more immersive. Among those leading the charge are IMAX, Dolby, THX, RealD, Red 8K, Ang Lee, the 4DX format and Douglas Trumbull, VES. 3D in theaters is an immersive factor that has grown tremendously worldwide (99,639 digital 3D screens at the end of 2017, according to IHS Markit), but has lost some steam in the U.S. since its 2010 peak. Meanwhile, PLF (Premium Large Format) theaters – luxury, big-screen venues with cutting-edge sound and projection – rose by 21.3% in 2017 to surpass 3,100 screens globally, according to IHS Markit. The AMC chain has invested heavily in IMAX and Dolby Cinema locations (Dolby’s proprietary technologies for Dolby Vision and Dolby Atmos improve image and sound quality), and has its own AMC Prime PLF locations. Cinemark has the XD line of PLF theaters, over 200 of which have been approved by THX’s Certified Cinema program. RealD, which sells the RealD 3D system, has launched its own LUXE line of PLF theaters. And CJ 4DPLEX offers ScreenX, with 270-degree visuals. IMAX is the leader in PLF, though the company does not include itself in that category. “We’ve definitely seen a shift towards more premium cinema offerings over the last few years and that has helped to fuel our network expansion, particularly on the international front. 
IMAX now has more than 1,443 theaters open across 79 countries with another 635 in backlog,” comments IMAX’s Chief Technology Officer, Brian Bonnick. “At IMAX, our mission for 50 years has been to change the way

people experience movies. We look at every aspect of the cinema experience, from how the film content is shot and remastered in post-production using our cameras and DMR process, to the design of the theater and our proprietary projection and sound technology powering the experience. By taking this holistic approach, we can optimize the audience experience so that when the lights go down, you feel like you are part of the movie.” Earlier this year, IMAX launched its next-generation “IMAX with Laser” projection system, which represents the largest R&D project in the company’s history. “The system incorporates a radical new optical engine featuring an open frame design, eliminating the prism that has been used in projectors for the last 25 years,” says Bonnick. “IMAX with Laser delivers substantially greater contrast, brightness, sharpness and an expanded color gamut. The system also comes with an upgraded 12-channel sound system, adding new side and ceiling speakers to take sound immersion to the next level.” IMAX’s profile has also shifted. “While IMAX started out playing documentary films, over the last 15 years our business has shifted to predominantly focus on blockbuster event films. We now play approximately 40 blockbuster films a year – including Hollywood films as well as local-language films in certain parts of the world such as China and India,” says Bonnick. “We partner with all of the major studios and look to program the biggest and most-anticipated films that will excite our fans and take advantage of our scale and scope.” In addition, IMAX is changing the way many studio movies are shot. A major shift occurred with The Dark Knight (2008), which featured 28 minutes filmed using IMAX, the first time that a major feature film was partially shot using IMAX cameras. 
Bonnick adds, “In 2018 we had several films that featured IMAX differentiation, or what we like to call IMAX DNA, meaning it was filmed with our cameras.” This included Avengers: Infinity War (filmed entirely with IMAX cameras), First Man (lunar sequences filmed with IMAX cameras) and Aquaman (approximately 90 minutes of footage specially formatted for IMAX). “For 2019, we expect to continue this trend,” says Bonnick. “IMAX DNA” titles this year include Captain Marvel (select scenes specially formatted), Avengers 4 (entirely shot with IMAX cameras) and The Lion King (specially formatted), with additional titles to be announced. Bonnick adds, “The filmmakers we work with, such as Christopher Nolan, the Russo Brothers and J.J. Abrams, are designing their films for the big-screen experience,

TOP: There are around 600 4DX theaters worldwide featuring motion seats and a variety of multi-sensory effects (scents, wind, mist, fog) that are integrated with each film. (Image © CJ 4DPLEX) BOTTOM: The 4DX experience includes the use of 3D glasses for 3D films. The not-too-distant future may also include 4DX VR. (Image © CJ 4DPLEX)

“The filmmakers we work with, such as Christopher Nolan, the Russo Brothers and J.J. Abrams, are designing their films for the big-screen experience, and we are working hard to help deliver them the tools to achieve their creative visions.” —Brian Bonnick, Chief Technology Officer, IMAX

SUMMER 2019 VFXVOICE.COM • 15


FILM

TOP: 4DX seats, clustered in sets of four, can heave (up/down), pitch (forward and backward tilt) and roll (left/right), as well as shake, vibrate and tickle, to create various possibilities of movement. (Photo courtesy of CJ 4DPLEX) BOTTOM: South Korea’s CJ 4DPLEX seeks to increase immersion with 4DX theaters. (Photo courtesy of CJ 4DPLEX)

16 • VFXVOICE.COM SUMMER 2019

and we are working hard to help deliver them the tools to achieve their creative visions.” Universal’s First Man includes a dramatic use of IMAX in its finale. “Director Damien Chazelle shot the moon sequences of the film using our 15 perf/65mm film cameras. During those sequences, the audience in an IMAX theater saw a dramatic increase in clarity as well as the image expand to fill the entire screen, which helped to immerse the audience in that pivotal sequence of the film,” says Bonnick. The lunar scenes of First Man illustrate how IMAX can impact VFX artists. “There is extra work to do because you get such incredible detail in an IMAX frame,” says Paul Lambert, VFX Supervisor, DNEG. “In First Man, we actually transition from 16mm, when we’re inside the capsule, to [IMAX] when Neil opens the portal door. You’ve been accustomed for the past two hours to be in this shaky, gritty documentary-style world and suddenly you’re in this clean, pristine silent environment. “You get to see the incredible detail that the IMAX frame gives you, and with that you also have to inspect the frame very well because basically you’re seeing everything. Once you get the scanned frames back, which are so incredibly huge, you start to see lines in there. We had to meticulously go through each frame, take out additional footsteps, stray lines, anything that seems to be man-made.” Lambert adds, “I think watching a bigger format is far more immersive than traditional cinema, and there are more and more productions choosing to use [larger formats].” He notes that working with IMAX and larger formats can be a challenge in terms of computation power and storage. “Frames are getting bigger and bigger. Which means it is more firepower, more storage to be able to deal with this imagery. You need faster computers as well to be able to do the effects on this. 
Just when you think things are starting to get to a good speed, suddenly you’ll get a bigger format to deal with, and then you’re chasing again to try to get things to [run] quicker. It’s a constant cycle.” HFR also may upgrade the film-going experience. Peter Jackson and Ang Lee boldly employed HFRs in The Hobbit series (48 frames per second) and Billy Lynn’s Long Halftime Walk (120fps), respectively. James Cameron will reportedly be filming his Avatar sequels with high frame rates. And Douglas Trumbull, VES has developed the Magi system, which can shoot and project films in 4K 3D at 120fps (or other frame rates). South Korea’s CJ 4DPLEX seeks to increase immersion with 4DX theaters, which add seat motion, wind, snow, mist and smells to blockbuster movies. “In Jurassic Park, when little Tim notices the ripples in the glass of water, the chairs subtly emit vibrations, alerting the audience of impending danger,” notes Yohan Song, CJ 4DPLEX Senior Manager of Strategic Initiatives. And, he adds, “In Doctor Strange, when the Ancient One takes Stephen Strange on a psychedelic foray into the astral plane, the 4DX motion and wind effects synchronize to the scene in play, creating a visceral experience.” “4DX was launched with the desire to bring moviegoers back to the cinema in the age of streaming by providing an immersive

“4DX was launched with the desire to bring moviegoers back to the cinema in the age of streaming by providing an immersive experience that can’t be replicated elsewhere.” —Yohan Song, Senior Manager of Strategic Initiatives, CJ 4DPLEX

experience that can’t be replicated elsewhere,” Song adds. CJ 4DPLEX is a subsidiary of CJ CGV, the fifth-largest cinema exhibition firm in the world. “The idea derived from 4D theme park rides. 4DX harnessed the technology and developed it further to provide a more immersive experience” that was complementary to films. The first 4DX theater bowed in 2009, the 100th debuted in 2014 and the 600th auditorium was slated to launch last December, according to Song. 4DX movies have received strong support from both the “studio and exhibition” sides, as “4DX provides a new revenue stream that benefits the industry as a whole,” says Song. At major flagship sites such as Regal L.A. Live, “the 4DX upcharge is $8 (USD) from the regular ticket price. However, regardless of the price difference, 4DX is capable of generating far more revenue compared to a regular 2D auditorium.” Song adds that the 4DX version of Avengers: Infinity War was “the highest-grossing film from all Disney/Marvel films we’ve screened during our nine years of operation. Jurassic World: Fallen Kingdom was the highest-grossing film in 4DX history, which earned more than $32 million at the box office.” Are filmmakers adding visual elements specifically designed for 4DX screenings? “Not yet,” says Song. “This is our ultimate goal, to collaborate with studios and producers starting from the creative phase to production and post-production to better reflect the endless possibilities that the technology can offer.” CJ 4DPLEX’s ScreenX is a “270-degree panoramic experience that extends the center image of the film to the left and right, all the way up to the end of the auditorium, providing a sense of being wrapped or being inside a movie,” says Song. 
“We started showcasing Hollywood films last year, and in 2018 alone we released eight Hollywood films: Black Panther, Rampage, The Meg, Ant-Man and the Wasp, The Nun, Bohemian Rhapsody, Fantastic Beasts: The Crimes of Grindelwald and Aquaman.” CJ 4DPLEX brought its two immersive formats together last year in “4DX with ScreenX” theaters. IMAX’s Bonnick feels that the push towards immersive cinema is good for everyone. “Consumers want a reason to get off their couch, and it is important that we, as an industry, are consistently innovating and improving the cinema experience to deliver that reason. We believe people are social by nature, and there is something magical about that communal movie-going experience and being immersed in the film that you just can’t recreate at home. We’ve seen cinema chains adding new technology and improved seating and amenities, and I think that is all in an effort to raise the bar, which is a good thing for everyone involved.”
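Lambert's point about "more firepower, more storage" is easy to quantify: uncompressed frame size scales with pixel count, so each step up in format multiplies everything downstream. A rough, illustrative calculation (uncompressed 16-bit half-float RGB; real pipelines vary in bit depth and use compressed EXRs, so treat the numbers as order-of-magnitude only):

```python
# Rough storage arithmetic behind "bigger frames" in large-format VFX:
# uncompressed bytes per frame = width * height * channels * bytes_per_channel.

def frame_gib(width, height, channels=3, bytes_per_channel=2):
    """Uncompressed frame size in GiB (16-bit half-float RGB by default)."""
    return width * height * channels * bytes_per_channel / 2**30

for name, (w, h) in {
    "4K DCI (4096x2160)": (4096, 2160),
    "8K   (8192x4320)": (8192, 4320),
}.items():
    per_frame = frame_gib(w, h)
    per_minute = per_frame * 24 * 60  # one minute of frames at 24fps
    print(f"{name}: {per_frame:.3f} GiB/frame, {per_minute:.0f} GiB/min")
```

Doubling linear resolution quadruples pixel count, which is why "just when you think things are starting to get to a good speed" a bigger format resets the chase for faster machines and more disk.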

TOP: The Regal Cinemas L.A. LIVE 14 theater complex in Los Angeles is a 4DX flagship location. BOTTOM: 4DX environmental effects.

SUMMER 2019 VFXVOICE.COM • 17



TV

TV’S ONE-TAKES: A CHANCE TO SHOWCASE SEAMLESS EFFECTS By IAN FAILES

TOP LEFT: The Bolt Cinebot rig used to film the oner in Marvel’s Agents of S.H.I.E.L.D.’s “S.O.S. Part 2” episode. TOP RIGHT: A frame from the one-take in “S.O.S. Part 2,” which sees the character Daisy take on an Inhuman that can replicate herself. BOTTOM LEFT: Multiple versions of the same actress had to appear in the final frame.

18 • VFXVOICE.COM SUMMER 2019

Some say we are in the golden age of television – it may also be true we are in the golden age of the ‘one-take,’ those long, uninterrupted shots that are now so common on the small screen, but which still seem magical to viewers. Also known as ‘oners’ or ‘long takes,’ these kinds of shots tend to follow a character on the move or give a lengthy view of the action. Shows such as Daredevil, True Detective, Mr. Robot, The Marvelous Mrs. Maisel and The Haunting of Hill House, for example, have all used oners to illustrate specific story points. Oners take a great deal of skill to pull off in terms of planning, on-set choreography, stunts, cinematography, editing and, increasingly, visual effects to stitch takes together or add necessary elements into the frames. VFX Voice looks at several episodes from recent series that have used the oner, and how visual effects helped make the shots possible.

THE ART OF REPLICATION

Soon to begin its sixth season, Marvel’s Agents of S.H.I.E.L.D. on ABC regularly features long takes. One such shot occurred in Season 2’s “S.O.S. Part 2” episode, when the character Daisy (Chloe Bennet) takes on the Inhuman Alisha (Alicia Vela-Bailey), who replicates herself into multiple ‘twinned’ versions. “The traditional approach to this kind of thing would have been to have the stunt guys choreograph their fight sequence, and then you would have, say, photo-doubles, and you would do it in cuts,” suggests Agents of S.H.I.E.L.D. Visual Effects Supervisor Mark Kolpack. “Billy Gierhart, who was the director of this episode, thought, ‘Well, we could do a oner here.’ Then Garry Brown was the second unit director who I shot this with to make it a one-take.” The fight scene was not actually filmed as a one-take, but instead programmed as a motion-control move with separate passes. This employed Camera Control’s Bolt Cinebot, a high-speed rig that allowed the team to factor speed ramps into the shots and repeat the moves, since they would involve multiple takes of the same actress. During the shoot, several actors were present: Bennet, Vela-Bailey and four photo-doubles. There were moments during the fight when the photo-doubles could be utilized directly, such as if they were knocked down. Other times, replicas of Vela-Bailey needed to be visibly present in the scene, which is why separate passes using the Bolt were necessary. “We did something like 75 takes that were hooked up into one move broken out into four different sections, all playing as one,” states Kolpack. “We shot everything at 120 frames per second so we could build in the speed ramps.” Pixomondo composited the takes – which were mostly acquired on set but also included one or two separate passes on bluescreen – together for the final oner to appear as if the camera is moving around the fight. “It was a very cool shot because we were able to repeat Alicia in all these additional passes and get her in the frame, all choreographed perfectly,” says Kolpack. “It was all about selling the idea of, ‘here’s a girl who can split five times.’”

HALLWAY HEROICS

The Netflix miniseries Maniac, created by Patrick Somerville and directed by Cary Joji Fukunaga, incorporates a dynamic oner in Episode 9, “Utangatta,” in which the characters Annie Landsberg (Emma Stone) and Owen Milgrim (Jonah Hill) – connected during a pharmaceutical trial – are involved in a hallway shootout with several agents. “The brief was to get something feeling as exciting, dynamic and slick as possible, with limited time and resources,” outlines Maniac Visual Effects Supervisor Ilia Mokhtareizadeh. “VFX was fully involved at every stage, working closely with Cary, the actors, stunt team and special effects. Due to the nature of the shot, we were required to travel through the same environment over and over again, so we had to ensure that we maintained the damage for continuity.” Interestingly, the oner did not make use of any stitching of takes. “Cary wanted the scene to feel natural, and the free-flow camera work on the Steadicam did not really provide us with any ideal stitch points,” notes Mokhtareizadeh. “We spent extra time in rehearsal knowing that we wanted to get everything in a true one-take shot. In the edit we discussed a potential stitch halfway through the shot and decided against it as we preferred the flow of the unadulterated original take.”
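Kolpack's choice to shoot at 120 frames per second so the team "could build in the speed ramps" comes down to simple frame arithmetic: overcranked footage gives editorial a deep pool of source frames, and each 24fps output frame simply advances through that pool at a chosen playback speed. A simplified sketch of the mapping (illustrative, not any vendor's retiming tool; assumes constant speed per output frame):

```python
# Why overcrank for speed ramps: at 120fps source / 24fps output, a
# playback speed of 5 source-frames-per-output-frame is real time, while
# a speed of 1 is 5x slow motion. An output frame just samples the
# source at the accumulated position.

def speed_ramp(n_source_frames, speeds):
    """speeds: playback speed per output frame, in source frames advanced."""
    frames, pos = [], 0.0
    for s in speeds:
        frames.append(min(int(round(pos)), n_source_frames - 1))
        pos += s
    return frames

# Ramp from real time (5 source frames per output frame) down to
# 5x slow motion (1 per output frame) over eight output frames.
ramp = [5, 5, 4, 3, 2, 1, 1, 1]
print(speed_ramp(1000, ramp))  # → [0, 5, 10, 14, 17, 19, 20, 21]
```

Because every output frame lands on a real captured frame, the ramp needs no frame interpolation, which is exactly the flexibility shooting at 120fps buys.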

TOP: In Maniac’s “Utangatta” episode, a long take told the story of Annie Landsberg (Emma Stone) and Owen Milgrim (Jonah Hill) in a shootout against several hostile agents. MIDDLE: The shot follows Annie and Owen around a hallway. It was achieved with no stitches. BOTTOM LEFT: The oner was captured via a Steadicam following the action, with beats heavily planned and choreographed. BOTTOM RIGHT: Visual effects added in muzzle flashes, bullet hits and blood spurts.

SUMMER 2019 VFXVOICE.COM • 19



TOP: In Maniac’s “Utangatta” episode, a long take told the story of Annie Landsberg (Emma Stone) and Owen Milgrim (Jonah Hill) in a shootout against several hostile agents. MIDDLE: The shot follows Annie and Owen around a hallway. It was achieved with no stitches. BOTTOM LEFT: The oner was captured via a Steadicam following the action, with beats heavily planned and choreographed. BOTTOM RIGHT: Visual effects added in muzzle flashes, bullet hits and blood spurts.



TV

TOP: Escape at Dannemora director Ben Stiller.

Visual effects were used, however, to add in muzzle flashes (including for a mini-gun), bullet and blood hits, and some other effects. For that reason, a workflow had to be established to un-distort and re-distort the anamorphic lens photography. “Photogrammetry and set diagrams were used to recreate the environment in 3D, allowing us to run a single match move across the entire shot,” explains John Kilshaw, a Creative Director at Zoic Studios, the studio behind the work. Blood hits were handled via Zoic’s library of elements, while larger sprays were generated in Houdini. “Cary provided great reference for the stylized impact he wanted the mini-gun to cause, and we were able to match that exactly,” adds Kilshaw. “We referenced period footage to ensure that all other muzzle flashes and impacts were accurate to all the other weapons employed.”

Amid the chaos, Mokhtareizadeh was cast as one of the agents during the shoot-out for the one-take. “I’d been semi-jokingly badgering Cary about letting me be in the show for a while,” he says. “We were on set out in Roosevelt Island the evening before the oner when my wish was finally granted. I Uber-ed back to Silvercup Studios while everyone was setting up the next scene, got fitted for a suit and rushed back to continue supervising. The next day was pretty fun. I needed the haircut and shave badly! John from Zoic covered the supervision for me as I spent the day being shot by Jonah Hill.”
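The round trip Kilshaw describes – undistorting the anamorphic plate so tracking and compositing happen in an undistorted space, then re-distorting the result to match the photography – can be sketched with a toy one-parameter radial model. This is purely an illustration: a real anamorphic solve uses a far richer, per-lens calibrated model.

```python
import numpy as np

# Hypothetical one-parameter radial distortion, normalized coordinates.
K1 = -0.08

def distort(p):
    # distorted = undistorted * (1 + k1 * r^2)
    r2 = np.sum(p * p, axis=-1, keepdims=True)
    return p * (1.0 + K1 * r2)

def undistort(p, iters=10):
    # Invert the model by fixed-point iteration: solve q*(1+k1*|q|^2) = p.
    q = np.array(p, dtype=float)
    for _ in range(iters):
        r2 = np.sum(q * q, axis=-1, keepdims=True)
        q = p / (1.0 + K1 * r2)
    return q

pts = np.array([[0.5, 0.25], [-0.3, 0.6]])
undist = undistort(pts)      # track and comp in undistorted space...
roundtrip = distort(undist)  # ...then re-distort to match the plate
assert np.allclose(roundtrip, pts, atol=1e-9)
```

The key property is that the round trip is lossless to within numerical precision, which is what lets a single match move run across the whole shot before the lens character is reapplied.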

SEAMLESS SMUGGLING

A much-lauded oner from recent years appears in Better Call Saul’s Season 2 “Fifi” episode, which follows the action at a US/Mexico border checkpoint. The one-take lasts 4 minutes 22 seconds, totaling 6,300 frames. It required 1,308 compositing layers made up of matte paintings, 3D and 2D elements, animation and significant rotoscoping. To shoot the scene – which goes through a line of trucks, over the border fence, into an inspection bay, and finishes on the back of a hero truck – a Steadicam operator began on foot before being transferred to a Titan crane, going back on foot, and then joining a golf cart as the move continued before ending once again on foot. “The enormity of the shot wasn’t revealed until we got into post-production,” says Visual Effects Supervisor William Powloski, “where they wanted to marry different takes that weren’t designed to be put together. And then having to deal with light changing outside over the course of the day. That’s probably the most invisible part of the shot – replacing lighting or keeping lighting from an earlier take and applying it to a second take.

“The shot,” adds Powloski, “was one of those where the producers of the show wanted the audience to say at the end, ‘Wow, where was the cut? I didn’t see any cut in this!’ And then they’d go back and re-watch it. That was very important to the producers – that the visual effects could hold up with somebody trying to analyze and figure out how it’s being done. I think that planning would have made it feel too choreographed, and I think that they were trying to avoid that.”
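The lighting hand-off Powloski mentions – grading a later take toward the light of an earlier one – can be roughed out as per-channel statistics matching. This is a hypothetical, global illustration of the idea only; production work would grade locally with tracked mattes rather than across the whole frame.

```python
import numpy as np

def match_stats(src, ref, eps=1e-6):
    # Shift/scale each channel of `src` so its mean and standard deviation
    # match `ref` - a crude global stand-in for matching one take's
    # lighting to another's.
    out = np.empty_like(src, dtype=float)
    for c in range(src.shape[-1]):
        s = src[..., c].astype(float)
        r = ref[..., c].astype(float)
        out[..., c] = (s - s.mean()) / (s.std() + eps) * r.std() + r.mean()
    return out

rng = np.random.default_rng(0)
take_a = rng.uniform(0.2, 0.8, (4, 4, 3))  # "earlier take" lighting
take_b = 0.6 * take_a + 0.1                # same framing, light has shifted
graded = match_stats(take_b, take_a)       # pull take B back toward take A
assert np.allclose(graded, take_a, atol=1e-4)
```

Because the simulated lighting change here is a simple gain and lift, the statistics match recovers the earlier take almost exactly; real daylight drift is messier and needs per-region treatment.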

BREAKOUT WITH NO BREAKS

“[Director] Cary [Joji Fukunaga] wanted the scene to feel natural, and the free-flow camera work on the Steadicam did not really provide us with any ideal stitch points. We spent extra time in rehearsal knowing that we wanted to get everything in a true one-take shot. In the edit we discussed a potential stitch halfway through the shot and decided against it as we preferred the flow of the unadulterated original take.” —Ilia Mokhtareizadeh, Visual Effects Supervisor, Maniac


Showtime’s Escape at Dannemora, created and written by Brett Johnson and Michael Tolkin, and directed by Ben Stiller, tells the true story of a daring jail escape at the Clinton Correctional Facility in 2015. ‘Part 5’ showcases – in one nine-minute one-take – prisoner David Sweat’s (Paul Dano) trial run at the breakout, with the camera following him through walls, along tunnels and even crawling through pipes, before he emerges from a manhole outside the facility.

“Doing it as a oner for Ben was completely about storytelling,” says Visual Effects Consultant Jake Braver, who had previous experience with oners on the film Birdman. “It wasn’t about showing off. It was about taking this journey with the character the whole way, and realizing how hard it was and how tough it was, and how much ingenuity was required to get out. The idea of not using cuts was Ben’s way of saying, ‘Sweat didn’t cheat, so we’re not going to cheat.’ But it became very clear that we were going to have to cheat because it was physically impossible to shoot it all in one place.”

Ultimately, four different places – a mix of real locations and stages – would be used to film the one-take, which became known as ‘Sweat’s run.’ Previs, led by Proof’s George Antzoulides, helped plan out how it would be shot and where edits and stitches could work. Shooting the takes would be complicated enough; they involved careful collaboration between departments, and specific rigs and actions allowing for cameras to go through walls or for camera hand-offs. The visual effects team from Phosphene would then take the original photography and incorporate seamless stitches between multiple takes, as well as add in CG pieces of the

TOP LEFT: The oner for Better Call Saul’s “Fifi” episode was filmed at an airport standing in for the US/ Mexico border checkpoint. BOTTOM LEFT: As the camera roamed around the scene, it would pick up elements that needed to be removed from the plate, such as the airport control tower.

TOP RIGHT: The line of trucks was extended and extraneous buildings removed. BOTTOM RIGHT: The final shot. Many elements had to be tracked, rotoscoped and digitally replaced throughout the entire scene.






“Getting underway on the most challenging sequence [‘Sweat’s Run’] this early, while still shooting, gave us a great opportunity to ease into the process with the director. [Director] Ben [Stiller] saw a few stitches work seamlessly and that led to some pretty fun, creative conversations.” —Djuna Wahlrab, Visual Effects Supervisor, Escape at Dannemora

Image of Marvel’s Agents of S.H.I.E.L.D. © copyright 2015 ABC. Maniac images © copyright 2018 Netflix. Escape at Dannemora images © copyright 2018 Showtime. Better Call Saul images © copyright 2016 AMC/Sony Pictures Television.


underground environment, pipes and wall coverings, while sometimes removing safety gear and other elements from the plates.

Among the most complex stitches was one involving Sweat climbing down a wall after leaving his cell and making his way along a series of catwalks. “There were limitations of the stage where we shot this,” describes Visual Effects Supervisor Djuna Wahlrab, “in that we couldn’t actually build any higher than three floors of the catwalks and the cells, and we needed a basement. That meant that we actually had to duplicate that section of the catwalk elsewhere in the stage and stitch it so that we could have the full length of Sweat going from the third floor all the way into the basement. Finding a place where we could stitch on Sweat and have him not leave the frame was a huge challenge.

“The problem was,” continues Wahlrab, “that we were on a fairly loose handheld camera. The camera was on a cable coming down the side alongside Sweat, starting on his face and then coming down to his feet. It’s during that shift downwards, getting underneath Sweat, that the stitch takes place. Being able to keep the camera moving and Sweat moving, and also contend with the shifting perspective of a handheld, positionally moving camera, was the real challenge.”

The planned stitches were all intended to be as invisible as this one, with no obvious seams. “It wasn’t a case of trying to hide anything in the shadows,” notes Visual Effects Producer Matthew Griffin. “Or trying to have moments where it feels too dark, that would make the audience say, ‘Oh, there’s the stitch.’ We didn’t want any of those moments. So it was so important to keep everything lit, to keep everything in frame, to keep everything moving, so you did have that sense of fluidity.”

The climb-down stitch was only the first of many to come, with Wahlrab noting that Phosphene Lead Artist Tim Van Horn was able to get started early on these initial stitch shots and provide some confidence to the filmmakers that the stitches would convincingly sell ‘Sweat’s Run.’ “Getting underway on the most challenging sequence this early, while still shooting, gave us a great opportunity to ease into the process with the director. Ben saw a few stitches work seamlessly and that led to some pretty fun, creative conversations.”

Braver recalls, too, that Stiller “was really fond of saying, in relation to the stitches, ‘I don’t want the audience to know what they’ve gone through until they’ve gone through it and Sweat takes a breath of fresh air at the end. Then people can go, ‘Wait, did they just do that?!’ You were just with this obsessive guy on his obsessive journey to free himself.”

TOP: Original plate for one section of the Escape at Dannemora one-take shot. MIDDLE: Phosphene’s CG extensions. BOTTOM: The final shot, which made it appear as if the section filmed was an extra-long tunnel.



FILM

SUPER MISFITS: THE VFX OF THE UMBRELLA ACADEMY By IAN FAILES

All images copyright © Netflix 2019. TOP LEFT: Ken Hall in a capture suit films a scene with Ellen Page as Vanya. TOP RIGHT: Weta Digital tracked the performance and replaced it with their CG Pogo. BOTTOM: The final rendered shot.


By now, we’re all used to superheroes on the big and small screens, but The Umbrella Academy – the Netflix series based on the Dark Horse comics and developed by Steve Blackman and Jeremy Slater – follows seven children, all born simultaneously and each seemingly possessed of mysterious superpowers, who are thrust together into a dysfunctional family. Still, with superpowers come super visual effects, and The Umbrella Academy has many, including the aftermath of a major future apocalypse, spatial and time-travel jumps, the destruction of the moon, and a fully CG character named Pogo, a chimpanzee who can speak. VFX Voice asked the show’s Production Visual Effects Supervisor, Everett Burrell, and Visual Effects Supervisor Chris White from Weta Digital – which created Pogo – about these significant sequences and characters.

THE PATH TO POGO

In the series, Pogo is an assistant to Sir Reginald Hargreeves (played by Colm Feore), a billionaire who has adopted the seven children into his ‘Umbrella Academy.’ To create an advanced chimpanzee, Burrell knew he and his team would need to craft a creature that could talk, wear a suit and interact in scenes with live-action actors. That initially led him to consider a ‘man-in-suit’ option for shooting scenes, perhaps to be augmented with a CG head replacement.

However, Burrell then consulted Joseph Conmy, Senior Vice President of Visual Effects at 20th Century Fox, about his work with Weta Digital on the Planet of the Apes films, where Weta had very successfully realized the primates entirely in CG. It was a daunting task for such a crucial character to be created on the shorter schedules typical of TV, but Burrell pushed the production to go fully digital with Pogo. “Netflix liked the idea of pushing the envelope on doing an all-CG character for television, especially for a show like this where it had to be very subtle and very supporting as a character. He wasn’t a dragon, flying around shooting fire. He had to simply be in the room and have conversations. It was a few months of terror, but once we saw the first test from Weta Digital, I think everybody calmed down.”

On set, actor Ken Hall stood in for Pogo wearing a gray tracking marker suit. Two 4K cameras off to the side of the main Alexa 65 camera acted as witness-cams to provide environment and body reference for Weta Digital. “Then we would pull Ken out and do clean plates, and then I would do HDRI capture with a gray ball,” says Burrell. “We also had a really great stand-in Pogo head with hair punched and painted so the director of photography could light it specifically for the shots.”

The voice of Pogo was provided by Adam Godley, who was also filmed performing the lines so that his expressions and emotions could be applied to the CG character. Weta Digital keyframed the final animation, but also used its own motion-capture stage to re-create Pogo’s actions based on the performances of Hall and Godley.

Pogo unfortunately meets an untimely end at the hands of Vanya (Ellen Page). Weta Digital had a significant hand in how the death scene occurred. “We were able to send the production some previs that we had done showing how Pogo gets picked up and thrown against these antlers,” says White. “We also did some tests at Weta where we took some fabric from different shirts and practical blood and showed how the blood would come out and how it would form. We even had some antlers that were from another show that we pushed through.”

For Pogo’s suit itself, Weta Digital relied on a physically-based cloth model. This model replicated the technique of spinning and weaving to make textiles the same way they are made in the real world. “It’s a new technique developed where we could go down to the individual thread, but it didn’t have to be hand modeled,” notes White. “You could give it different fabric and weave models and it does all the correct lighting, but it does it procedurally within the system itself.

“The other thing with the clothing,” adds White, “was that originally we had thought of fitting this clothing to Pogo, but then we talked to Everett about it and he said, ‘Well, the public doesn’t know about him, so it’s not like he’s going to go out and get fitted or anything like that, so it’s okay for his jacket to not fit exactly right.’ You’ll see in certain scenes that it doesn’t hang like a well-fitted jacket and his pants are a little bit off.”

TOP RIGHT: Plate photography for a scene showing the apocalypse environment made use of partial sets and greenscreens. MIDDLE RIGHT: The final apocalypse shot extended the carnage with extra buildings, fire and smoke. BOTTOM LEFT: Number Five investigates the destruction in this original plate. BOTTOM RIGHT: Final shot with visual effects augmentation.






Pogo Progression

How Weta Digital made the stunning CG chimpanzee for The Umbrella Academy.

“Netflix liked the idea of pushing the envelope on doing an all-CG character for television, especially for a show like this where it had to be very subtle and very supporting as a character. He wasn’t a dragon, flying around shooting fire. He had to simply be in the room and have conversations. It was a few months of terror, but once we saw the first test from Weta Digital, I think everybody calmed down.” —Everett Burrell, Production Visual Effects Supervisor

BRINGING THE APOCALYPSE

Before we actually find out just what causes the apocalypse, the viewer sees glimpses of it from the future, courtesy of Number Five (Aidan Gallagher), who is able to travel through space and time. Only a small portion of the destroyed apocalyptic environment was built, says Burrell. “There was a pile of rubble, the façade of the Umbrella Academy home and maybe a couple other buildings. The way it was designed was like an amphitheater of rubble, so when you were at the bottom of the amphitheater and you looked around, the rubble covered the skyline. But then occasionally we’d want to go high and wide to reveal the apocalypse.” That extension of the destruction was handled by Spin VFX, which took Lidar and photographic reference of the set and built extra derelict structures and general mayhem. “We also had great concept art from production designer Mark Worthington,” adds Burrell. “It really set the tone. Every time we had an incoming director, we gave them this packet which we called the ‘director’s welcome pack,’ and it had all the concept art, and all the R&D and tests that we had done that had been approved.”

1. Capturing the actors: Ken Hall performed Pogo scenes in a gray tracking suit on set, while actor Adam Godley (inset) delivered the character’s voice in a separate capture session.

2. CG chimpanzee: Taking advantage of its previous work on the Planet of the Apes films, Weta Digital crafted a digital Pogo, also tailoring a CG suit with new software.

JUMPING WITH NUMBER FIVE

TOP LEFT: The Umbrella Academy used invisible effects to add scope to the series. Here is the original plate for a city scene. TOP RIGHT: The final shot replaced the existing background with a fleshed-out city.


In order to show us the apocalypse, Number Five jumps forward in time. He can also do much smaller spatial jumps. For these, the visual effects team started with some early R&D. “In one of our camera tests,” recalls Burrell, “Aidan was in a hallway, and he was just kind of goofing off in between takes. I asked him, ‘Can you do a little hop in the air? Can you do a little jump?’ And he did a little jump, and then I said, ‘Well, that’s actually kind of neat, and it will help motivate you when you teleport, so why don’t you run towards camera, jump in the air, and then just walk out of frame?’” Spin took that footage and iterated on several kinds of spatial jump effects, all the way from heavy distortion to subtler images. The final effect – dubbed ‘jelly vision’ – appeared, describes Burrell, “as if you’re pushing your hand through a jelly membrane, just for a few seconds, and then it pops. It’s really, really subtle, but you get a little bit of texture, you get a little bit of striations, almost like the universe is bending as he does his spatial jumps.” Time-jumps by Number Five were more elaborate in nature,

3. Final render Weta Digital’s proprietary path tracer, Manuka, was used to render Pogo, who had to interact with live-action sets and live-action actors in the scenes.

TOP RIGHT: Scenes of Luther (played by Tom Hopper) on the moon first required a greenscreen shoot. MIDDLE RIGHT: CG elements were then added into the frame. BOTTOM RIGHT: A final moon composite.

SUMMER 2019 VFXVOICE.COM • 27


FILM

Pogo Progression: How Weta Digital made the stunning CG chimpanzee for The Umbrella Academy.

“Netflix liked the idea of pushing the envelope on doing an all-CG character for television, especially for a show like this where it had to be very subtle and very supporting as a character. He wasn’t a dragon, flying around shooting fire. He had to simply be in the room and have conversations. It was a few months of terror, but once we saw the first test from Weta Digital, I think everybody calmed down.” —Everett Burrell, Production Visual Effects Supervisor

BRINGING THE APOCALYPSE

Before we actually find out just what causes the apocalypse, the viewer sees glimpses of it from the future, courtesy of Number Five (Aidan Gallagher), who is able to travel through space and time. Only a small portion of the destroyed apocalyptic environment was built, says Burrell. “There was a pile of rubble, the façade of the Umbrella Academy home and maybe a couple other buildings. The way it was designed was like an amphitheater of rubble, so when you were at the bottom of the amphitheater and you looked around, the rubble covered the skyline. But then occasionally we’d want to go high and wide to reveal the apocalypse.” That extension of the destruction was handled by Spin VFX, which took Lidar and photographic reference of the set and built extra derelict structures and general mayhem. “We also had great concept art from production designer Mark Worthington,” adds Burrell. “It really set the tone. Every time we had an incoming director, we gave them this packet which we called the ‘director’s welcome pack,’ and it had all the concept art, and all the R&D and tests that we had done that had been approved.”

1. Capturing the actors Ken Hall performed Pogo scenes in a gray tracking suit on set, while actor Adam Godley (inset) delivered the character’s voice in a separate capture session.

2. CG chimpanzee Taking advantage of its previous work on the Planet of the Apes films, Weta Digital crafted a digital Pogo, also tailoring a CG suit with new software.

JUMPING WITH NUMBER FIVE

TOP LEFT: The Umbrella Academy used invisible effects to add scope to the series. Here is the original plate for a city scene. TOP RIGHT: The final shot replaced the existing background with a fleshed-out city.


In order to show us the apocalypse, Number Five jumps forward in time. He can also do much smaller spatial jumps. For these, the visual effects team started with some early R&D. “In one of our camera tests,” recalls Burrell, “Aidan was in a hallway, and he was just kind of goofing off in between takes. I asked him, ‘Can you do a little hop in the air? Can you do a little jump?’ And he did a little jump, and then I said, ‘Well, that’s actually kind of neat, and it will help motivate you when you teleport, so why don’t you run towards camera, jump in the air, and then just walk out of frame?’” Spin took that footage and iterated on several kinds of spatial jump effects, all the way from heavy distortion to subtler images. The final effect – dubbed ‘jelly vision’ – appeared, describes Burrell, “as if you’re pushing your hand through a jelly membrane, just for a few seconds, and then it pops. It’s really, really subtle, but you get a little bit of texture, you get a little bit of striations, almost like the universe is bending as he does his spatial jumps.” Time-jumps by Number Five were more elaborate in nature,

3. Final render Weta Digital’s proprietary path tracer, Manuka, was used to render Pogo, who had to interact with live-action sets and live-action actors in the scenes.

TOP RIGHT: Scenes of Luther (played by Tom Hopper) on the moon first required a greenscreen shoot. MIDDLE RIGHT: CG elements were then added into the frame. BOTTOM RIGHT: A final moon composite.




“In one of our camera tests, [actor] Aidan [Gallagher] was in a hallway, and he was just kind of goofing off in between takes. I asked him, ‘Can you do a little hop in the air? Can you do a little jump?’ And he did a little jump, and then I said, ‘Well, that’s actually kind of neat, and it will help motivate you when you teleport, so why don’t you run towards camera, jump in the air, and then just walk out of frame?’” —Everett Burrell, Production Visual Effects Supervisor
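Both of the transitions Burrell describes for Number Five – the subtle ‘jelly vision’ membrane warp and the portal switching off like an old television set – are, at their core, image-resampling tricks. A toy NumPy sketch, purely illustrative (not Spin VFX’s actual pipeline; the function names `jelly_warp` and `tv_off` are hypothetical), gives a feel for how such effects work:

```python
import numpy as np

def jelly_warp(frame, cx, cy, radius, amount):
    """Subtle radial 'membrane' bulge centered at (cx, cy).

    frame is an (H, W) or (H, W, C) array. Pixels inside `radius` are
    resampled slightly toward the center, which reads as a bulge outward.
    """
    h, w = frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    dx, dy = xs - cx, ys - cy
    falloff = np.clip(1.0 - np.hypot(dx, dy) / radius, 0.0, 1.0) ** 2
    scale = 1.0 - amount * falloff          # sample nearer the center -> bulge
    sx = np.clip(cx + dx * scale, 0, w - 1).astype(int)
    sy = np.clip(cy + dy * scale, 0, h - 1).astype(int)
    return frame[sy, sx]                    # nearest-neighbor resample

def tv_off(frame, t):
    """Old-TV shutdown: squash the picture to a horizontal line (t < 0.5),
    then shrink that residual line toward a point (t >= 0.5). t runs 0 -> 1."""
    h, w = frame.shape[:2]
    out = np.zeros_like(frame)
    if t < 0.5:
        s = 1.0 - 2.0 * t                   # remaining vertical scale
        nh = max(1, int(h * s))
        rows = (np.arange(nh) * (h - 1) / max(nh - 1, 1)).astype(int)
        y0 = (h - nh) // 2
        out[y0:y0 + nh] = frame[rows]       # squashed band, centered
    else:
        s = 2.0 - 2.0 * t                   # remaining horizontal scale
        nw = max(1, int(w * s))
        cols = (np.arange(nw) * (w - 1) / max(nw - 1, 1)).astype(int)
        x0 = (w - nw) // 2
        out[h // 2, x0:x0 + nw] = frame[h // 2][cols]  # shrinking line
    return out
```

A production version would add temporal easing, filtered sampling and a glow pass, but the underlying idea – displacing where each output pixel samples from – is the same.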

since they open up a portal that appears more like a displaced and distorted gateway. In addition, it was decided to have the portals ‘turn off’ like an old black-and-white television set. This was realized by having the portal “sucked into one little point with a little residual, horizontal lens there,” outlines Burrell.

MOON CRUSHING

In the final episode of the season, the cause of the apocalypse is uncovered – the suppressed powers of Vanya are finally unleashed, and her fiery abilities burn a hole in the moon, which begins to break up and rain down on the Earth. Here, Burrell relied on several VFX vendors to deliver shots of the chaos, with matte paintings and digital environments shared between Spin, Folks VFX, Deluxe and Monsters Aliens Robots Zombies (MARZ). “We also brought in Method Studios to do all the really big destruction of the moon and of the city,” says Burrell. “They were heavily involved in all the effects animation and the R&D of blowing up the moon and the beam that Vanya shoots out from her chest. That was very elaborate, and I think we only had six weeks to make all that happen, but Method really came through, destroying the moon. That stuff looked great.”

UNEXPECTED EFFECTS

TOP LEFT: Some scenes were shot in winter when leaves had fallen off the trees. TOP RIGHT: Visual effects replaced the leafless trees with lush green ones to match other plate photography.


Among the complex challenges of CG characters, time-traveling effects and apocalyptic environments, The Umbrella Academy also required some largely invisible VFX when locations that had seemed undesirable during mid-winter filming turned out to be ideal once spring arrived. “The locations that Steve Blackman had originally hated in winter became these beautiful, green, lush tree areas,” notes Burrell. “So we had to go back to the first five or six episodes and add trees and leaves to every shot that did not have them in the wintertime. We went back to every location, and we shot with our 4K cameras those same angles once the trees had bloomed. I gave those to Folks VFX, and they basically comp’d them in, and when we couldn’t do that, we’d do some CG trees.

“We were very lucky that we had the time and the patience to go back and get all those angles,” adds Burrell. “I’m almost prouder of that than Pogo because no one will ever know.”



FILM

CREATING VISUAL MAGIC FOR THE LIVE-ACTION ALADDIN By KEVIN H. MARTIN

Images courtesy of Disney Enterprises. TOP: Aladdin (Mena Massoud) in Disney’s live-action Aladdin, directed by Guy Ritchie. BOTTOM LEFT: Aladdin meets the CG larger-than-life Genie (played by Will Smith). Genie was realized by the digital character and FX simulation teams at ILM. BOTTOM RIGHT: Aladdin and Jasmine (played by Naomi Scott) flee from palace guards on the streets of Agrabah, the thriving desert city featured in the film. (Photo: Daniel Smith)


Disney’s wildly successful and critically-acclaimed 1992 animated feature Aladdin, co-directed by Ron Clements and John Musker, and starring Robin Williams as Genie, was merely one of many animated dramatizations of the classic story produced since the 1926 feature The Adventures of Prince Achmed. The Disney adaptation drew on character and story elements from 1940’s live-action The Thief of Bagdad (itself a remake of 1924’s Douglas Fairbanks silent-era spectacular), a visual extravaganza that featured pioneering bluescreen work in a color film by Special Effects Director Larry Butler, which earned the film an Oscar for its effects. The success of the Disney feature led to a pair of direct-to-video sequels and a TV series, plus a Broadway adaptation. After some gestation, a live-action adaptation was announced, with Guy Ritchie signed to direct. A veteran of the Ritchie-directed Sherlock Holmes films, VFX Supervisor Chas Jarrett was entrusted with both technical and creative control of the project, which included a major contribution from Industrial Light & Magic, as well as work from Hybride Technologies, Base VFX, Magiclab, Host VFX and One of Us. DNEG provided stereo conversion for the 3D release, while Ncam and Nvizage handled on-set tracking and virtual camera functions. Jarrett, an Oscar nominee for Poseidon and BAFTA nominee

for Charlie and the Chocolate Factory, who won a VES Award for the first Sherlock Holmes feature, began his career at MPC. After a decade there, he became an independent supervisor, and in recent years, he headed up VFX efforts on both Pan and Logan. “Our story is very faithful to Disney’s original animated feature,” Jarrett readily acknowledges. “There were many inspirations we could take from it, but it was never our intention to lean too closely towards such an iconic film [for the live-action film]. It was important for everyone on the project that we stood on our own feet.” Towards that end, pre-production commenced with what Jarrett characterizes as a blank slate. Production designer Gemma Jackson, an Emmy winner for John Adams and Game of Thrones, who had recently collaborated with Ritchie on his King Arthur: Legend of the Sword, put her team in the art department to work immediately by sourcing volumes of reference material and beginning visualization for all aspects of Agrabah, the thriving desert city featured in the film. “At the same time, we began developing storyboards, animatics and previs with an internal VFX team of around 35 people, who ultimately produced around 40 minutes of animatics and previs for the film,” Jarrett continues. “This was mostly used to establish the more choreographed musical sequences in the film. We also brought on board a large team of traditional animators who were instrumental in developing character ideas and performances.” Proof Inc. also contributed to both previsualization and postvis efforts. Another major issue during prep revolved around vendor selection. “Choosing the right team for a project like this is crucial,” he emphasizes, “in part, because it’s such a smorgasbord of creative and technical challenges. So my first port of call was to ILM and two of their VFX supervisors, Mike Mulholland and David Seager. 
Mike was the Overall ILM Supervisor while David joined me on set as our 2nd Unit VFX Supervisor, and then headed back to Vancouver to manage the team there. Also on board early was Animation Supervisor Tim Harrington, joined later by Steve Aplin – two fantastically creative collaborators.” Other supervisors included ILM London’s Mark Bakowski, Daniele Bigi, Animation Supervisor Mike Beaulieu and Jeff Capogreco. Given Guy Ritchie’s trademark gritty approach to filmmaking, it was no surprise that this fairytale would be one with an edge. “Guy was clear from the outset that the film had to take place in a viably real world that felt tangible and authentic,” states Jarrett. “For us, this meant that while there’s a strong fantasy element to the story, the world needed to feel grounded with environments and characters that were plausible. Guy is very open to trying new technical methodologies in his films, and we certainly pushed those boundaries for this project, but working in real sets and locations as much as possible was always his preference. Where we used digital sets and extensions we took great care to base our work on scans and plates of real places to ensure that we stayed grounded in ‘reality’ – which meant the environments were inspired by real locations. So Giles Harding, my on-set supervisor, oversaw LIDAR and photogrammetry scanning of locations in

“[Director] Guy [Ritchie] was never inclined to dwell on ‘magical’ effects for long, so they tend to be quick and percussive.” —Chas Jarrett, VFX Supervisor

TOP: Aladdin’s sidekick Abu, who was in the original feature animation, was entirely digital in the live-action version, and based on a Capuchin monkey. MIDDLE: Aladdin, the street rat with a heart of gold, and Genie. (Photo: Daniel Smith) BOTTOM: Aladdin stands by as Genie is about to make his appearance from a mini tornado of blue vapor.






TOP: An extensive backlot set served as the backstreets of Agrabah and the main parade ground in front of the palace gates. BOTTOM: Marwan Kenzari is the powerful sorcerer Jafar. (Photo: Daniel Smith) OPPOSITE TOP AND BOTTOM: Real and CG environments were inspired by desert locations in Morocco and Jordan. Aerial photography and scans enhanced the scope and depth of sequences.


Morocco and Jordan, as well as a hugely detailed capture of all our amazing sets.” Rather than remaining stagebound for those sets, production also availed itself of the studio backlot. “It is always my personal preference to shoot exterior scenes on exterior sets with real daylight and sun, and we were fortunate to be able to build an extensive backlot set for much of our principal photography,” says Jarrett. “Principal photography took place mostly on our enormous backlot set which served as the backstreets of Agrabah and the main parade ground in front of the palace gates, which we digitally augmented where needed to give greater depth to the environments. But a consequence of shooting at a studio just outside London is that you’re constantly at the mercy of the ‘great’ British weather. For this reason, some sets were built on interior stages to give us more reliable weather cover. In these cases, we created digital extensions and skies to offer the shots more depth. As with all the VFX on Aladdin, we were very careful to use textures and color palettes that supported the set and stayed true to Gemma’s designs. “Obviously,” continues Jarrett, “the bigger, wider views of Agrabah were created by our environments team at ILM, who used scans and photography from the sets and from real locations as a basis for their incredible work. We were extremely fortunate to be able to shoot some sequences on location in Jordan. The wealth of film and photographic reference we gathered in Jordan [during] our time there was pivotal in establishing the desert look for the film. We used drones and helicopters [through Helicopter Film Services and UAV drone pilot Alan Perrin] to photograph and scan some amazing environments which greatly enhanced the scope of those sequences. I also undertook a number of aerial photography shoots in Morocco, Jordan, Namibia and Svalbard to give these shots a sense of scope and realism. 
In many cases, this aerial footage also provided invaluable reference for our environment and lighting teams.”

No trek through this mythological landscape would be complete without a magic carpet ride. A six-axis hydraulic platform built by Special Effects Floor Supervisor David Holt’s team provided an articulated basis for practically-shot live-action elements. “In some instances, we had pre-filmed backgrounds shot with either wirecams on the backlot set or from helicopters on location,” Jarrett reports. “The hydraulic rig was programmed to move using Flair so we could synchronize it with our motion-control camera. In other cases, we could operate the carpet rig live using a hand-operated input device while we shot the actors from a Technocrane. In general, the process for combining backgrounds with foregrounds required a great deal of digital manipulation.” For the characters of Carpet, Abu, Iago, Rajah and Genie, Jarrett readily acknowledges that significant VFX involvement was needed to realize their full potential on screen, along with multiple iterations to settle on behavior that read correctly for the camera. “The characters should lean towards naturalism rather than caricature,” he states. “A good example is [the monkey] sidekick Abu, who in the original feature animation is almost human-like in his ability to understand and communicate. We knew from the outset this character would be entirely digital throughout the film, and that he should be visually based on a Capuchin monkey – but it wasn’t clear to us yet how ‘human’ or ‘monkey’ we should go with his performance.” A wide range of animation styles and cycles were attempted, ranging from cartoonish to anthropomorphized to natural. “We tried a variety just to see what clicked,” Jarrett elaborates, “and it became apparent very quickly that a ‘real’ looking monkey who didn’t behave like a real monkey didn’t feel right at all. If we had too much ‘human-ness’ in the performance, it just didn’t feel credible. 
So we scoured the archives for filmed material of Capuchins and found reference clips of behavior that matched the needs of our scenes, and used that as the basis for his performance.”

With the Genie (played by Will Smith), performance capture was a given, but the manner in which he manifested while emerging from his bottle was always intended as a post-production effect. “Genie’s materialization was always planned to be achieved by our digital character and FX simulation teams at ILM, primarily because the look of these effects needed to be carefully designed and controlled in a way we couldn’t achieve with practical effects on set,” Jarrett declares. “Guy was a strong advocate of finding a ‘language’ for Genie’s magic that would be consistent and recognizable throughout the film. So we developed FX simulations which took color cues from Genie’s costumes and jewelry and combined them in different ways to create some really interesting effects. Guy was never inclined to dwell on ‘magical’ effects for long, so they tend to be quick and percussive.” In evaluating the work of the multitudes involved in creating visual magic for this live-action version of Aladdin, Jarrett notes, “Every type of VFX work is represented within the film – from character animation, performance capture, set extensions, digital environments, and FX simulations for fur, fire, water, lava, cloth, muscles and skin – to a plethora of Genie magic.”





FILM AND TV

villain Mysterio (Jake Gyllenhaal) threatens both his vacation plans and the fate of the world. Former Sony Pictures Imageworks Visual Effects Executive Producer Shauna Bryan oversaw the CGI, with the help of ILM (Visual Effects Supervisor Julian Foddy), Framestore (Visual Effects Supervisor Alexis Wajsbrot), Luma Pictures (Visual Effects Supervisor Brendan Seals), Scanline VFX (Visual Effects Supervisor Michele Stocco) and Rising Sun Pictures.

COOL SUMMER VFX FILMS AND TV By CHRIS McGOWAN

TOP LEFT: The Lion King (Photo © Walt Disney Pictures) TOP RIGHT: Once Upon a Time in Hollywood (Photo © Sony Pictures) BOTTOM: Fast & Furious Presents: Hobbs & Shaw (Photo © Universal Pictures)

In the summertime, VFX films will be diverse and push the envelope in all directions. The tapestry of superhero visual effects will broaden with Spider-Man: Far From Home and The New Mutants. The Lion King will no doubt take photorealistic computer animation to a new level, as it brings the African savanna to life. Abominable, Spies in Disguise and The Angry Birds Movie 2 all feature 3D computer animation. Artemis Fowl blends the earthly with the unearthly, as does Stranger Things: Season 3, with new visions from the Upside Down. An interesting mix of horror tales – Midsommar, Scary Stories to Tell in the Dark and It: Chapter Two – offer extraordinary VFX fulfilling an array of missions. Fast & Furious Presents: Hobbs and Shaw will showcase turbo-powered visual effects in the action realm, and so should (to a lesser extent) the action-suspense dramas Once Upon a Time in Hollywood, Angel Has Fallen and sci-fi edged Boss Level, as well as the comedies Stuber, Where’d You Go, Bernadette? and The Kitchen. “Invisible effects” that are part of many such films should also be on subtle, but important display in the film adaptation of Downton Abbey, set in Yorkshire a hundred years ago. VFX Voice has compiled a list of some of the top VFX-laden productions due this summer, from July through September, along with the visual effects supervisors and VFX houses involved, when possible. This is not intended to be a complete listing. Release dates are subject to change. Stranger Things, Season 3 (Netflix) Release date: U.K., July 4; U.S., July 4 Matt and Ross Duffer’s hit supernatural sci-fi series returns with Millie Bobby Brown, David Harbor and Finn Wolfhard. Paul Graff is Overall Visual Effects Supervisor, Christina Graff is Senior Visual Effects Producer, Chloe Lipp is a Visual Effects Coordinator and Ravindra Tamhankar is a Visual Effects Producer. Atomic Studios, now part of Deluxe Entertainment, worked on the visual effects last season with other VFX houses. 
Sister firm Method Studios is involved this time around. Crafty Apes is also on board for VFX.

Spider-Man: Far From Home (Marvel Studios/Sony Pictures)
Release date: U.K., July 5; U.S., July 5
Peter Parker (Tom Holland) wants to summer in Europe, but


Stuber (20th Century Fox)
Release date: U.K., July 12; U.S., July 12
Kumail Nanjiani (Silicon Valley) portrays Stu, a mild-mannered Uber driver who picks up a detective (Dave Bautista) who is chasing a sadistic terrorist. Stu must survive the ride and help his passenger, all while maintaining his Uber rating. Sean Thigpen is the Visual Effects Supervisor, and Crafty Apes adds VFX to the action.

The Lion King (Walt Disney Studios)
Release date: U.K., July 19; U.S., July 19
Following his successful live-action/CGI adaptation of The Jungle Book, Jon Favreau takes on the photorealistic computer-animated Lion King remake with award-winning Visual Effects Supervisor Rob Legato. MPC (Visual Effects Supervisor Elliot Newman) conjured up the simulations of the lions, their animal peers and African landscapes.

Once Upon a Time in Hollywood (Sony Pictures)
Release date: U.K., August 14; U.S., July 26
Quentin Tarantino wrote and directed this tale about struggling TV actor Rick Dalton (Leonardo DiCaprio) and his stunt double/buddy Cliff Booth (Brad Pitt), who are trying to reboot their careers in 1969 Hollywood. Unfortunately, it is the very moment when the Manson Family is wreaking bloody havoc. Luma Pictures (Visual Effects Producer Michael Perdew) has the task of bringing authenticity to the period film.

The New Mutants (20th Century Fox)
Release date: U.K., August 2; U.S., August 2
Five young mutants learn about their abilities and fight to survive in a secret facility where they are being held prisoner. Oliver Dumont and Craig Wentworth are the VFX Supervisors for this latest film featuring the X-Men. DNEG (VFX Producer Anton Agerbo), Method Studios (Visual Effects Producer Sheena Johnson), MPC (VFX Supervisor Bryan Litson) and Zero VFX (Visual Effects Supervisor Dan Cayer) focused on conveying the action and the heroes’ emerging powers.
Fast & Furious Presents: Hobbs & Shaw (Universal Pictures)
Release date: U.K., August 2; U.S., August 2
In David Leitch’s high-octane Fast & Furious spinoff, Luke Hobbs (Dwayne Johnson) and Deckard Shaw (Jason Statham) team up to combat the cyber-genetically enhanced terrorist Brixton (Idris Elba). DNEG (Overall Visual Effects Supervisor

TOP: Spies in Disguise (Photo © 20th Century Fox)
MIDDLE: Midsommar (Photo © A24)
BOTTOM: Scary Stories to Tell in the Dark (Photo © Lionsgate)



FILM AND TV

villain Mysterio (Jake Gyllenhaal) threatens both his vacation plans and the fate of the world. Former Sony Pictures Imageworks Visual Effects Executive Producer Shauna Bryan oversaw the CGI, with the help of ILM (Visual Effects Supervisor Julian Foddy), Framestore (Visual Effects Supervisor Alexis Wajsbrot), Luma Pictures (Visual Effects Supervisor Brendan Seals), Scanline VFX (Visual Effects Supervisor Michele Stocco) and Rising Sun Pictures.

COOL SUMMER VFX FILMS AND TV
By CHRIS McGOWAN


Angel Has Fallen (Lionsgate)
Release date: U.K., August 23; U.S., August 23
A Secret Service agent (Gerard Butler) has been framed for an assassination attempt on the president (Morgan Freeman) and must evade his colleagues while racing to uncover the real terrorist threat. Digital District and Worldwide FX are the chief VFX studios, and Marc Massicotte is the Visual Effects Supervisor.

Dan Glass and Visual Effects Supervisors Michael Brazelton and Stuart Lashley) and Framestore (VFX Supervisor Kyle McCulloch) turn the VFX up to maximum velocity.

Midsommar (A24)
Release date: U.K., August 9; U.S., August 9
A traveling couple discovers that Sweden is home to a strange pagan cult and its lethal midsummer festival in a rural village. Writer-director Ari Aster (Hereditary) has described this as an “apocalyptic breakup movie” and “Scandinavian folk horror.” Its cast boasts rising young stars such as Florence Pugh (Outlaw King) and Will Poulter (The Maze Runner). Gergely Takács is the Visual Effects Supervisor.

TOP LEFT: Where’d You Go, Bernadette? (Photo © Annapurna Pictures)
TOP RIGHT: Artemis Fowl (Photo © Walt Disney Pictures)
BOTTOM: The Angry Birds Movie 2 (Photo © Sony Pictures)

Scary Stories to Tell in the Dark (Lionsgate)
Release date: U.K., TBD; U.S., August 9
Guillermo del Toro produced and contributed the story for this André Øvredal horror film based on the Alvin Schwartz children’s book series. Matt Glover (Visual Effects Supervisor) and Greg Sigurdson (Visual Effects Producer) coordinated the VFX.

Where’d You Go, Bernadette? (Annapurna Pictures)
Release date: U.K., TBD; U.S., August 9
In this Richard Linklater comedy-drama, an unhappy mother (Cate Blanchett) suddenly disappears, and her 15-year-old daughter (Emma Nelson) must find out what happened to her. Raoul Bolognini is the Visual Effects Producer, Robert Grasmere is the Visual Effects Supervisor and Joseph Payo is the Visual Effects Coordinator. Savage Visual Effects and makevfx are both on the case.

Artemis Fowl (Walt Disney Pictures)
Release date: U.K., August 9; U.S., August 9
In this fantasy-adventure directed by Kenneth Branagh, Artemis Fowl II (Ferdia Shaw) seeks to save his father and restore the family fortune by kidnapping a fairy for ransom. Charley Henley is the Visual Effects Supervisor. Framestore (On-Set Visual Effects Supervisor Robert Duncan), MPC, Nvizage and Visual Skies are the VFX studios charged with bringing the fairy reality to life.

Boss Level (Entertainment Studios)
Release date: U.K., TBD; U.S., August 16
In Joe Carnahan’s sci-fi thriller, Frank Grillo portrays an army special forces veteran trapped in a time loop that always ends in his death. Makana Sylva is the Visual Effects Supervisor, while Brian Reiss is the Visual Effects Producer for Encore VFX.

The Angry Birds Movie 2 (Sony Pictures)
Release date: U.K., October 4; U.S., August 16
This 3D computer-animated sequel features the voices of Peter Dinklage, Jason Sudeikis and Bill Hader, along with Nicki Minaj. R. Stirling Duguid is the Visual Effects Supervisor for Sony Imageworks.


It: Chapter Two (Warner Bros. Pictures)
Release date: U.K., September 6; U.S., September 6
The Losers Club members, now adults, reconvene 27 years after the events depicted in the first It horror film, based on the Stephen King novel. James McAvoy and Jessica Chastain lead the cast. Nicholas Brooks is the Visual Effects Supervisor for the sequel, as he was for the original It, and Method Studios (Visual Effects Supervisor Josh Simmonds) is on board to help realize the scary elements.

Downton Abbey (Focus Features)
Release date: U.K., September 13; U.S., September 20
The hit epic U.K. series comes to the big screen, with many original cast members reprising their roles. Framestore Visual Effects Producer Ken Dailey faced the challenge of period visual effects.

Spies in Disguise (20th Century Fox)
Release date: U.K., September 13; U.S., September 13
This 3D computer-animated spy comedy from 20th Century Fox Animation and Chernin Entertainment is directed by Nick Bruno and Troy Quane, and includes Will Smith, Tom Holland and Rashida Jones among the voice talent. Brian Jason Tran is Effects Technical Director for Blue Sky Studios, which worked on the project, and Elvira Pinkhas is Effects Supervisor.

The Kitchen (Warner Bros. Pictures)
Release date: U.K., August 9; U.S., September 20
When three Irish mobsters are arrested by the FBI, their wives decide to run the family business themselves. The crime comedy-drama stars Elisabeth Moss, Tiffany Haddish and Melissa McCarthy. Method Studios (Senior Visual Effects Coordinator Ann-Marie Blommaert), Alkemy X (Visual Effects Supervisor Gabriel Regentin) and Shade VFX (Visual Effects Coordinator Nigel Cyril) are keeping the ’70s Hell’s Kitchen period piece real.

TOP: Angel Has Fallen (Photo © Lionsgate)
MIDDLE: It: Chapter Two (Photo © Warner Bros. Pictures)
BOTTOM: Spider-Man: Far From Home (Photo © Marvel Studios/Sony Pictures)

Abominable (Universal Pictures)
Release date: U.K., October 11; U.S., September 27
DreamWorks Animation and Pearl Studio produced this 3D computer-animated adventure. A mischievous group of friends encounters a young Yeti and sets off on a long journey to the Himalayas to reunite the magical creature with his family. Max Bruce and Shaun Collaco are the CG Supervisors for DreamWorks Animation.





PROFILE

ORIGIN STORY

2019 VES Lifetime Achievement Award Winner

CHRIS MELEDANDRI: INNOVATIVE ANIMATOR

By NAOMI GOLDMAN

Images courtesy of Universal Pictures and Illumination Entertainment
TOP LEFT: Chris Meledandri, Founder and CEO, Illumination Entertainment (Photo: Alex Berliner)
TOP RIGHT: Chris Meledandri receives the VES Lifetime Achievement Award from longtime collaborator Steve Carell, the voice of supervillain Gru in the Despicable Me franchise.


Sparked at a young age by cinematic marvels on the big screen, Chris Meledandri has taken that sense of wonder and brought animation into the lives of audiences worldwide, creating unforgettable characters that have ingrained themselves in the pop culture zeitgeist. Thanks to Meledandri’s dynamic leadership and creative approach to storytelling, the Oscar®-nominated producer, founder and CEO of Illumination has built one of the pre-eminent brands in family entertainment by developing comedic films that strike a balance between the subversive and the emotionally engaging, and appeal to all ages on a global scale.

For his enormous contributions to the advancement and ever-increasing success of mainstream animated entertainment over the last 20 years, Meledandri was recently honored with the VES Lifetime Achievement Award at the 17th Annual VES Awards. Upon receiving the award from longtime collaborator Steve Carell – whom Meledandri first tapped 15 years ago for Horton Hears a Who – Meledandri mused on the sobering experience: “I want to thank the VES for waking me up from my blissful dream-world, where I am forever 25 and have a full head of hair!”

Since founding Illumination in 2008, Chris Meledandri has become one of the most successful creators of original film franchises. In just a decade, Illumination produced two of the five highest-grossing animated films of all time, as well as the highest-grossing animated franchise in history: Despicable Me. Collectively, the company’s nine films – which include last year’s seventh highest-grossing film, Dr. Seuss’ The Grinch, as well as Despicable Me, The Secret Life of Pets and Sing – have grossed over $6 billion globally, with 2016’s Minions and 2017’s Despicable Me 3 both crossing the $1 billion mark.
In recognition of Meledandri’s ability to outperform titles such as Deadpool, Iron Man and Star Wars, Deadline wrote in 2017 that their “Most Valuable Blockbuster [rankings] might have to be coined the Meledandri Tournament.” Today, Illumination’s characters can be found worldwide in theme parks, consumer goods, social-media memes and games. Illumination’s 2013 mobile game, “Minion Rush,” has since been downloaded more than 800 million times, making it the sixth most-installed mobile game in history.

Looking back at what sparked his initial interest in filmed entertainment, Meledandri fondly recalls growing up in New York City with parents who loved cinema, and the transfixing films that helped chart his future course. His parents, Roland Meledandri, a noted men’s clothing designer, and Risha Meledandri, an activist, gallerist and poet, cultivated a movie-going household that relished auteurs like Scorsese, Kubrick and Fellini. He notes that it was only later, through his own children, that he was exposed to the wide world of animation.

“My parents didn’t particularly believe in babysitters, and they took me to see Easy Rider when I was just nine years old! Then a few months later, I entered a cavernous dark space and joined with many others as we stared at a giant screen, watching flickering lights illuminate the imagery of 2001: A Space Odyssey. We were no longer in the Ziegfeld Theatre. Kubrick had transported us into the realm of his imagination and enabled us to suspend disbelief in a manner not quite previously possible. That afternoon, while flying through the star gate, my sense of wonder was ignited, and I have been chasing that feeling ever since.” To this day, he marvels at the significance of what the team behind 2001 created and the impact it had on modern cinema.

In high school, Meledandri became interested in theater. “My mother told me that a producer provided the stage on which creative people come together to tell a story. I took her literally and immediately started constructing sets for plays, both at school and in off-Broadway theaters.

“Now, 40 years later, the nature of the stage has changed, but I am still providing the creative space, the stories, the opportunity and support for extremely gifted people to come together to create movies that bring wonder into the lives of our audiences.”

ROOTS AND WINGS

Meledandri points to some of the mentors who influenced him at critical junctures. At Dartmouth College, he studied with film historian David Thomson. “Movies had long been the window through which I learned about life, but David cemented my love of cinema and my desire to make it my life’s work.

“In the early ’80s, when movies were still being scheduled on stripboards and cut on flatbeds, I spent five years with Producer Daniel Melnick, who made films ranging from Altered States to Footloose. Soon after starting, I was dispatched to bring Dan a script for a lunch meeting at the famed Ma Maison restaurant in Los Angeles. As I approached my new boss, I spotted the larger-than-life figure of Orson Welles. In that moment, I realized that I was in the land where the line between real life and cinematic magic was often blurred.”

Meledandri describes the founding of Illumination as “a dream bouncing around my head,” and he gives tremendous recognition to the extended team of actors, musicians, writers, designers, artists and technical designers who have been integral to his journey. “The single most important moment in my career was asking the exceptional Janet Healy to come work with me. She has

TOP, MIDDLE AND BOTTOM: Despicable Me (2010)






been my producing partner and invaluable to the evolution of Illumination. She is a pioneer in our business – a founding member of the VES – and the finest producer I have known.”

RIDING THE ROLLERCOASTER

Meledandri went out on his own to produce movies when he was 25, and quips, “Boy, did I make some stinkers!” In 1998, when he founded Fox’s animation division, he oversaw the costly film Titan A.E., which was deemed a massive failure, losing $100 million. But he cites it as one of the transformational learning experiences of his career, and he values every opportunity to integrate lessons learned into his overall creative and operational approach.

“A few years earlier, in 1993, after executive producing Cool Runnings at Dawn Steel Pictures at Disney, I was working at Fox when a young director showed me a few sequences in which cockroaches performed Busby Berkeley musical numbers, and I was spellbound.

“I learned that the sequences were animated at a studio called Blue Sky, where they had gifted animators and an extraordinary proprietary renderer. I funded the completion of the brilliant short Bunny and asked [Blue Sky co-founder] Chris Wedge to direct Ice Age. Making that first movie, I discovered that there is no greater experience for a producer than to be surrounded by brilliant artists and technical geniuses along with writers, musicians, production whizzes and visionary directors as we endeavor to tell stories.

“I left Fox [where he served as the founding president of 20th Century Fox Animation] with an idea for a company where people with creative aspirations would collaborate in telling stories using the universal language of visual storytelling. We would support people based on their talent and ideas, not based on their track records. So here we are, 12 years, 1,000 people and nine films later, with a company where every film has been directed by somebody

TOP: Despicable Me 2 (2013) MIDDLE AND BOTTOM: Despicable Me 3 (2017)

40 • VFXVOICE.COM SUMMER 2019

who started with us, never having previously directed an animated feature film.”

ON PASSION AND INSPIRATION

“Every day we all join in a noble mission,” concludes Meledandri. “We contribute our creativity and craft to bring joy into the lives of audiences. That joy has the potential to unite people of all ages and cultures in a shared experience of a story that explores universal truths and has a visual expression that invites us to discover ideas, experiences, and emotions in ways never before imagined. That is wonder. That is what we all aspire to do together. And while it can be experienced on many different screens, none of them are as awesome as that majestic silver screen surrounded by darkness where we all began this journey.

“No matter how challenging making a movie can be, every time I see life breathed into a new character or enter a newly created environment, or I am thrust into the middle of what promises to be a dynamic action, I feel the chills I felt watching 2001: A Space Odyssey, and I am reminded of Orson Welles’ famous line when he first visited a studio and exclaimed: ‘This is the biggest electric train set a boy ever had!’”

TOP LEFT AND RIGHT, MIDDLE: The Secret Life of Pets 1 and 2 (2016 and 2019) BOTTOM LEFT AND RIGHT: Sing (2016)

“Here we are, 12 years, 1,000 people and nine films later, with a company where every film has been directed by somebody who started with us, never having previously directed an animated feature film.” —Chris Meledandri





PROFILE

“The only way to do that,” adds Nolan, “is that you have to get your hands dirty, and figure out how you’re going to make that work in front of a camera, and after the fact. And so, from the very, very beginning, that VFX challenge goes hand-in-hand with the writing, and with direction, and working with actors and music and everything else.”

“The challenge for us was, ‘Okay, well, it better look pretty damn good. If all these people are paying all this money to experience a theme park, it better look pretty great.’” —Jonathan Nolan

WRITING AND VISUAL EFFECTS

2019 VES Visionary Award Winner

JONATHAN NOLAN: MAKING THINGS REALER THAN REAL

By IAN FAILES

Upon receiving the VES Visionary Award at the 17th Annual VES Awards earlier this year, writer/producer/director Jonathan Nolan spoke of growing up idolizing pioneering filmmakers who innovated in visual effects. The Westworld co-creator declared that “VFX is movies.” That comment certainly endeared him to the VFX-strong audience at the Awards, but as Nolan continued, it was clear he revered the entire visual effects process, its history and the artists involved in it. “It’s an extraordinary privilege for me to be honored by this group – this is the club that I always wanted to join,” he said.

Nolan is certainly part of that club, having been involved with some of the most significant visual effects films in recent years, writing (with his brother Christopher Nolan) on The Dark Knight, The Dark Knight Rises and Interstellar, and as a creator of the television shows Person of Interest and Westworld. His credits also include Memento and The Prestige. HBO’s Westworld, in particular, which Nolan created with his wife, Lisa Joy, via their Kilter Films production company, arrived at a time when the bar had been set incredibly high for visual effects in television. The show has been recognized with both Emmy and VES Award accolades, and the showrunner is deeply embedded in the visual effects process.

EARLY BEGINNINGS

TOP LEFT: Westworld co-creator Jonathan Nolan on location. (Photo courtesy of HBO) TOP RIGHT: Nolan on the set of Westworld with actress Thandie Newton, who plays host Maeve Millay. (Photo courtesy of HBO)


Nolan’s exposure to the world of visual effects began early, as he and his brother made their own films. “We were trying to figure out, how could we emulate the filmmaking that we were seeing at the movie theater?” Nolan recounts. “Star Wars was obviously the single biggest influence for us. We were fascinated by unmasking the magic trick of those films, and trying to figure out how they did what they did.”

It turned out, too, that the movies Nolan wanted to make would be the sorts of films where visual effects would have to play a large part. “The kind of films and the kind of stories that [Chris and I] wanted to tell really didn’t map onto the physical world as it exists,” he says. “As much respect as I have for the more naturalistic kind of filmmaking, and the amazing stories that can be told right here on Earth, for me, I was always more interested in storytelling that created new realities.

Nolan is the first to acknowledge that, as a writer on the Dark Knight films and Interstellar, he was able to put almost anything on the page and then leave it to others to turn his words into imagery for the screen. “One great benefit as a writer on those projects is not really being responsible in any way for, or on the line for, what you’d write,” he says. “It was Chris and [Visual Effects Supervisor] Paul Franklin’s problem to figure out how to make it work.”

A key observation Nolan made from watching his brother’s films come together was the combination of different filmmaking and effects methods to produce the final result. “Those films are meticulously, beautifully made, and they use all the different available techniques,” says Nolan. “You have miniatures work, and you have full computer graphics environments, and you have projection, and you have a lot of techniques that we then adapted into Westworld where we could.”

Indeed, on-set projection was something Nolan saw being used on Interstellar that he then became particularly fond of as a filmmaking and visual effects tool. “I remember being on the set of Interstellar and climbing into a spaceship. Production designer Nathan Crowley’s team built this beautiful interior for the spaceship, all practical, you climbed inside it, you’re in there with the actors. Then when it was time for takes, through-the-window projection starts and space appears. The pièce de résistance was the special effects team shaking the entire platform to give it movement. The thing I loved about it is that, for the actors, it requires no imagination on their part. They’re in space, you’re all there together.

“I think that’s one of the most exciting things that’s happening right now – the clawing back of reality as much as possible from the greenscreen era that we’ve been in, and having those assets ready ahead of schedule so you can feed them directly to the camera. That allows everyone on set to know exactly the story that you’re telling.”

Projection was so influential, in fact, that Nolan specifically incorporated it as a story device in Westworld – for the central map – which was done as a live projection on set. “There were a lot of people watching, imagining that it’s computer graphics – and it is – but it’s beautifully prepared assets that are then live-projected onto the set,” explains Nolan. “That was actually quite an interesting challenge, because we wanted topography on that map, so that required working with a vendor to hand-carve and create the physical shape of the map and then working very closely with the vendors to supply them with the assets.”

“The map also lifted out of the ground and turned, and we had to marry that to the movement,” says Nolan. “It was an enormous technical challenge, but came off beautifully, really beautifully. So

TOP: Westworld is one of the few television series shot on film. (Photo courtesy of HBO) BOTTOM: Jonathan Nolan with Westworld Visual Effects Supervisor Jay Worth. (Image courtesy of HBO)






“As much respect as I have for the more naturalistic kind of filmmaking, and the amazing stories that can be told right here on Earth, for me, I was always more interested in storytelling that created new realities.” —Jonathan Nolan

to be able to stand on a set, and instead of putting down a little green box for people to look at, they would look at a live rendering – the illusion is essentially complete on set.”

Audiences can expect to see that approach taken further in Westworld’s third season, according to Nolan, who is looking to “take that technology off of just the map and use it to extend sets in more sophisticated ways that had been impossible to do up to this point. That’s an area in which we’re very excited about the possibilities and, without breaking confidences, we’ve been very excited at some things that other people are doing right now, and we’re trying to see what we can do about them in Season 3.”

GOING OLD-SCHOOL

TOP LEFT: The original plate for an aerial shot of the Westworld control center called ‘The Mesa.’ (Photo courtesy of HBO) TOP RIGHT: The final shot took the plate photography and added in the sprawling control center. (Photo courtesy of HBO) BOTTOM LEFT: Live-action plate of a train arrival scene in Westworld. (Photo courtesy of HBO) BOTTOM RIGHT: Greenscreen elements of the plate were replaced with train and environment extensions. (Photo courtesy of HBO)


One reason Nolan was drawn to the live-projection technique, of course, is that it fit neatly into the type of world he imagined for Westworld’s characters to inhabit. “One of the ironies of Westworld,” comments Nolan, “was creating this artificial world, but one that you have to imagine that people would be willing to spend tens of thousands of dollars a day to experience.

“Our take was, it had to feel more real than real,” he continues. “There can’t be any sense that it’s an illusion, because that’s what the guests are paying a lot of money for. They want a tactile, lived-in, beautiful reality that they can experience that may feel more real than where they come from. So the challenge for us was, ‘Okay, well,

it better look pretty damn good. If all these people are paying all this money to experience a theme park, it better look pretty great.’”

To make it look ‘pretty damn good,’ Nolan and his collaborators chose to film as much as they could for real, then use visual effects to flesh out what couldn’t be shot. For example, a common approach in the show has been to take advantage of existing architecture to shoot scenes in, at least to acquire one particular direction of action. “Then,” notes Nolan, “we’d take the actors and re-build, practically, a piece of that set in the opposite direction. So you’d have basically both sides of the scene of reality – they’re married with a little bit of movie magic. Inevitably, with those sequences you’d have a couple of ‘stitching’ moments. You’d have an angle where you’ve got to extend the architecture or extend the landscape.

“Our goal from the very beginning was to try to find a hybrid approach to any of these challenges that honored what every system could do best. If you have the time and energy to recreate a part of your set in both places, to marry a piece of architecture to a location that you loved, if you do all that, then you’d know that you have an amazing team that can, in the moments in between, literally marry those two places together.”

VFX COLLABORATION

Those architectural or landscape extensions, and many other

“[The Dark Knight films and Interstellar] are meticulously, beautifully made, and they use all the different available techniques. You have miniatures work, and you have full computer graphics environments, and you have projection, and you have a lot of techniques that we then adapted into Westworld where we could.” —Jonathan Nolan

TOP LEFT: Actor Jeffrey Wright as Bernard Lowe in Westworld. (Photo courtesy of HBO) TOP RIGHT: Live projection was an aspect of filmmaking Nolan had seen used on other films, including Interstellar. On Westworld it was used to help bring the central map to life. (Photo courtesy of HBO) BOTTOM LEFT: A Super Bowl spot, in which synthetic bulls rampage, incorporated multi-pass plate photography. (Photo courtesy of HBO) BOTTOM RIGHT: A robotic endoskeleton was then added to the bulls relatively late in the process. (Photo courtesy of HBO)






visual effects duties on Westworld, are overseen by Visual Effects Supervisor Jay Worth, with whom Nolan also worked on Person of Interest. Worth was critical, says Nolan, in delivering one incredibly complicated shot that was used for a scene of rampaging bulls in a Westworld Super Bowl spot.

“It was a beautiful sequence,” Nolan outlines. “We used a robotically-controlled dolly with a robotically-controlled head so we could repeat nodal camera moves. We did these dolly moves over and over and over again as these live animals are running through the frame. It was an enormous amount of work for Jay and his team to marry all those things and elements together seamlessly.

“Then toward the end of the creative process, we said, ‘God, wouldn’t it look fantastic if one of those bulls was just opened up a little bit to reveal the sort of robotic structure underneath?’ That was a decision that happened relatively late in the process, but working with DNEG – they did an extraordinary job.”

Important Looking Pirates had already established that robotic endoskeleton look in other shots, so DNEG – working around the clock to pull off the bull shots – took those ideas and delivered the VFX in time for the commercial. Nolan admits it was a tough scene, but, as he noted at the VES Awards, “Jay and I have made 123 episodes of television together and he has never, not a single time, complained within earshot of me about our process.

“We try to plan very carefully to know exactly what we’re doing,” says Nolan, “but sometimes we get a sequence back and you go, ‘There’s just this one little extra thing...’ And if you’ve been careful and you have the right assets and you’re working with the right people, we go just a little bit further to make something that looks great even better.”

TOP: Live-action plate for the ‘Door’ scene in Season 2 of Westworld. (Photo courtesy of HBO) MIDDLE: The CG elements. (Photo courtesy of HBO) BOTTOM LEFT: Final shot revealing the gateway to the ‘Sublime.’ (Photo courtesy of HBO) BOTTOM RIGHT: Jonathan Nolan (right) at the 17th Annual VES Awards with Westworld star Evan Rachel Wood, who presented him with the VES Visionary Award. (Photo by Phil McCarten)




COVER

HOW GAME OF THRONES CHANGED TV VFX FOREVER

By IAN FAILES

All Game of Thrones images copyright HBO. TOP LEFT: Visual Effects Supervisor Joe Bauer (left) with Visual Effects Producer Steve Kullback scouting in Bardenas Reales, Spain for “The Spoils of War” episode. (Image courtesy of Steve Kullback) TOP RIGHT: On the set of “Battle of the Bastards” in Saintfield, Northern Ireland, with (from left) Kristofer Hivju, Fabian Wagner, Joe Bauer and Steve Kullback. (Image courtesy of Steve Kullback) BOTTOM: Daenerys rides Drogon in “The Spoils of War.”


Visual Effects Supervisor Joe Bauer has a very honest answer when asked what it was like working on the multiple award-winning and now stunningly completed HBO series, Game of Thrones. “It was an ongoing panic,” admits Bauer. “You know the dreams you used to have, where you dreamed you were on the school bus and you looked down and you’re in your underwear? It felt like that.”

Similarly, Visual Effects Producer Steve Kullback – the other half of the crack VFX team that shepherded the groundbreaking work in the David Benioff and D.B. Weiss show – recalls the monumental task that lay before them each season, including this past final eighth season. “When we look back on what’s been done,” states Kullback, “obviously we’re all enormously proud. It’s an amazing team of insane overachievers that have just pushed and pushed to deliver on a vision that Dan and David had.”

WHY GAME OF THRONES WAS DIFFERENT

There’s no shortage right now of feature-film-quality visual effects work in the world of television. So what do Kullback and Bauer think lies behind the success of Game of Thrones’ visual effects, both in terms of accolades (including multiple Emmys and VES Awards) and helping to bring large audiences to the show?

One aspect Kullback highlights is their own take on shooting everything that can possibly be shot photographically. “It gives us not only an insane number of elements that go into our shots,” he says, “but also insane reference to inform those parts that have to be CG. I think that has probably single-handedly upped the bar, and it’s something we hadn’t been accustomed to even striving for before, in television and a lot of times in features, too.”

Indeed, this final season’s visual effects were certainly ‘upped,’ with Bauer suggesting that “any one of our complex shots would have been the highlight of any previous season. Most of our shots were complex shots this year, with more than two and sometimes eight layers of photographed elements.

“The reason we went so photography-heavy,” adds Bauer, “was the concern of the post-production time, in that with a feature you’ll have much more time to develop your assets and the sim work – all the stuff that you would otherwise shoot. We didn’t have

the time to do that, and I wanted to go into post with photography that covered all the bases. Then the CG was all about holding it together. We’ve stuck to that because we really liked the way it looks. The thing is, as the shots get more complex, you end up needing to shoot more elements if you’re going to follow the same philosophy.”

As the VFX work did become more complex season over season, Kullback (who joined the show in Season 2) and Bauer (who started in Season 3) found themselves investing more time in the planning stages. They also learned they could delegate more via a large team of virtual and previs supervisors, and an army of on-set effects supervisors working with different units, including motion control. “Once upon a time we’d be the people running the little dragon heads on sticks around the set,” says Bauer.

The duo says their approach to the visual effects work took a major leap in Season 3 for a moment in which Daenerys orders one of her dragons to kill the slaver Kraznys. “It is the first time that a dragon roasted somebody on camera,” says Bauer. “And that was

“[Showrunnners] Dan [Weiss] and David [Benioff] wanted to max out and go for the gusto. While we were not doing anything we hadn’t really done before, we were doing so much of it, and so much of it is so complex that it was really frightening. Ultimately their support and participation has been the linchpin to us being able to do our jobs, because they give us an enormous amount of creative freedom, and the freedom to execute it in the way that we think is the right way to do it.” —Steve Kullback, Visual Effects Producer

TOP: New ways for actress Emilia Clarke to ride Drogon were developed, including pre-animating a hydraulic buck for convincing motion. MIDDLE: The dragons of Game of Thrones grew in size as each season was released. BOTTOM: Ice Dragon. Things turn dangerous when the Night King is able to commandeer one of Daenerys’ dragons.

SUMMER 2019 VFXVOICE.COM • 49


COVER

HOW GAME OF THRONES CHANGED TV VFX FOREVER By IAN FAILES

All Game of Thrones images copyright HBO. TOP LEFT: Visual Effects Supervisor Joe Bauer (left) with Visual Effects Producer Steve Kullback scouting in Bardenas Reales, Spain for “The Spoils of War” episode. (Image courtesy of Steve Kullback) TOP RIGHT: On the set of “Battle of the Bastards” in Saintfield, Northern Ireland, with (from left) Kristofer Hivju, Fabian Wagner, Joe Bauer and Steve Kullback. (Image courtesy of Steve Kullback) BOTTOM: Daenerys rides Drogon in “The Spoils of War.”

48 • VFXVOICE.COM SUMMER 2019

Visual Effects Supervisor Joe Bauer has a very honest answer when asked what it was like working on the multiple award-winning and now stunningly completed HBO series, Game of Thrones. “It was an ongoing panic,” admits Bauer. “You know the dreams you used to have, where you dreamed you were on the school bus and you looked down and you’re in your underwear? It felt like that.” Similarly, Visual Effects Producer Steve Kullback – the other half of the crack VFX team that shepherded the groundbreaking work in the David Benioff and D.B. Weiss show – recalls the monumental task that lay before them each season, including this past final eighth season. “When we look back on what’s been done,” states Kullback, “obviously we’re all enormously proud. It’s an amazing team of insane overachievers that have just pushed and pushed to deliver on a vision that Dan and David had.” WHY GAME OF THRONES WAS DIFFERENT

There’s no shortage right now of feature-film-quality visual effects work in the world of television. So what do Kullback and Bauer think lies behind the success of Game of Thrones’ visual effects, both in terms of accolades (including multiple Emmys and VES Awards) and in helping to bring large audiences to the show? One aspect Kullback highlights is their insistence on shooting everything that can possibly be shot photographically. “It gives us not only an insane number of elements that go into our shots,” he says, “but also insane reference to inform those parts that have to be CG. I think that has probably single-handedly upped the bar, and it’s something we hadn’t been accustomed to even striving for before, in television and a lot of times in features, too.” Indeed, this final season’s visual effects were certainly ‘upped,’ with Bauer suggesting that “any one of our complex shots would have been the highlight of any previous season. Most of our shots were complex shots this year, with more than two and sometimes eight layers of photographed elements.” “The reason we went so photography-heavy,” adds Bauer, “was the concern of the post-production time, in that with a feature you’ll have much more time to develop your assets and the sim work – all the stuff that you would otherwise shoot. We didn’t have

the time to do that, and I wanted to go into post with photography that covered all the bases. Then the CG was all about holding it together. We’ve stuck to that because we really liked the way it looks. The thing is, as the shots get more complex, you end up needing to shoot more elements if you’re going to follow the same philosophy.” As the VFX work became more complex season over season, Kullback (who joined the show in Season 2) and Bauer (who started in Season 3) found themselves investing more time in the planning stages. They also learned they could delegate more via a large team of virtual and previs supervisors, and an army of on-set effects supervisors working with different units, including motion control. “Once upon a time we’d be the people running the little dragon heads on sticks around the set,” says Bauer. The duo says their approach to the visual effects work took a major leap in Season 3 for a moment in which Daenerys orders one of her dragons to kill the slaver Kraznys. “It is the first time that a dragon roasted somebody on camera,” says Bauer. “And that was

“[Showrunners] Dan [Weiss] and David [Benioff] wanted to max out and go for the gusto. While we were not doing anything we hadn’t really done before, we were doing so much of it, and so much of it is so complex that it was really frightening. Ultimately their support and participation has been the linchpin to us being able to do our jobs, because they give us an enormous amount of creative freedom, and the freedom to execute it in the way that we think is the right way to do it.” —Steve Kullback, Visual Effects Producer

TOP: New ways for actress Emilia Clarke to ride Drogon were developed, including pre-animating a hydraulic buck for convincing motion. MIDDLE: The dragons of Game of Thrones grew in size as each season was released. BOTTOM: Ice Dragon. Things turn dangerous when the Night King is able to commandeer one of Daenerys’ dragons.

SUMMER 2019 VFXVOICE.COM • 49



“Any one of our complex shots would have been the highlight of any previous season. Most of our shots were complex shots this year, with more than two and sometimes eight layers of photographed elements.” —Joe Bauer, Visual Effects Supervisor

TOP LEFT: The dragons make their mark in the frozen lake battle. TOP RIGHT: Real photography, CG imagery and effects simulations helped bring part of The Wall down. BOTTOM LEFT: A wight bear attack was one of the highlights of Season 7. BOTTOM RIGHT: To accomplish skeletal wight effects, visual effects artists took live-action actors with makeup and prosthetics and removed parts of the bodies and faces digitally.

50 • VFXVOICE.COM SUMMER 2019

the first time we made the argument to production to set up an on-set fire stunt rather than doing CG fire.” Other game-changing VFX moments identified by Bauer and Kullback included the fighting pit sequence in Season 5, where Daenerys climbs onto the back of her dragon, and other sequences involving Wun Wun the giant in battle that had to be shot with motion control (“These were the early steps that led us to the rather intensive machine of techvis and preparation and motion control,” says Kullback). That close reliance on planning and on photographed elements was often also in concert with the special and practical effects teams on the show, so much so that a very large miniature was built and exploded for the final season. “I would dare say that we’re the only production that would have supported that approach for this

event,” argues Bauer. “If practical effects teams have ever been concerned that their jobs were going to be diminished with CG they should have worked on Game of Thrones because our guys were taxed to the max. Special Effects Supervisor Sam Conway and his team were hand in glove through almost all of this.” MORE COMPLEXITY, MORE PLANNING

Over the years, Game of Thrones’ visual effects workforce has been spread out over several countries. But before shots even got into post-production, another of the show’s points of difference was the close involvement of virtual supervisors on set from The Third Floor, which tackled previs, techvis and virtual production. “We had so much material that needed to be shot and produced

TOP LEFT: The city of Braavos, one of the many environments required to be built for Game of Thrones. TOP RIGHT: A previs frame for a scene from the loot train attack in ‘The Spoils of War.’ BOTTOM: Techvis helped work out shooting locations, camera positions and other logistical elements to pull off complicated scenes.

SUMMER 2019 VFXVOICE.COM • 51




COVER

TOP LEFT: Practical fire elements regularly made up dragon fire scenes in the show, with CG elements timed to coincide with these large effects moments. TOP RIGHT: The fighting pit sequence employed two motion-control camera cranes to film action and operate a real flamethrower that was then tied to the CG dragon. BOTTOM: A scene from the “Second Siege of Meereen,” which required extensive digital environments.

in a relatively short period of time that there was no other way to do this than to be so well planned out,” says Kullback. “So that when you’re walking on set you know exactly where the camera’s going to go, exactly where the hundred tiles of Unsullied are going to need to be and moved around for a particular shot.” Bauer says that his experience on other productions was that film crews were traditionally not so accepting of previs or techvis. But the visual effects supervisor adds that, on Game of Thrones, “the work has gotten so complicated, and the timeframe for achieving it was so tight, that the camera and grip crews would seek out the information from us. They’d ask, ‘Do you have a diagram for where the dolly track goes in relation to the greenscreen and where the greenscreen goes?’ Everyone understood the advantage of having this stuff worked out ahead of time instead of trying to solve it on the day, which would have been disastrous.”

That kind of planning came in particularly handy during the filming of Season 5’s fighting pit scene, which took place in Osuna, Spain. The final imagery required a dragon breathing fire within the pit directly onto adversaries. The shots were planned as dual motion-control scenes, one rig for filming and one rig holding an actual flame thrower. Recalls Kullback: “Our director, David Nutter, you could see his eyes roll back in his head because he had done some motion control previously and just remembered it being cumbersome and problematic and finicky, and we were having two setups! But because we had planned it out in a rather elaborate scheme and had a very detailed playbook, thankfully – because it was the most frightening day of my life – it went off without a hitch.” “That morning,” continues Bauer, “I remember I woke up, looked in the mirror and my top lip had swollen up as a manifestation of how nervous I was, because people’s lives were at stake. It really was thanks to the stunt team, too, which is overseen by Rowley Irlam. They’re top notch and obviously no one got hurt and everyone is professional.” The presence of the virtual supervisors became particularly important for the even more heightened action seen in Season 8, when simulcam techniques became a major part of the mix. “Now we could show the CG character performing in the set for shots where we needed to do a fairly complicated move,” says Kullback.

TOP LEFT: The “Battle of the Bastards” Winterfell sequence required the coordination of stunts, practical effects and many digital elements, including fully CG horses and riders. TOP RIGHT: Daenerys arrives on her dragon during the loot train attack. BOTTOM: Devastation during the loot train attack. Game of Thrones not only pioneered the incorporation of large amounts of photographed plates, but also feature-film-quality CG characters.

BIG, BIG SEQUENCES

Having navigated the brutal results of this final season – which Bauer describes as “quadruple” the work of the largest sequences of previous seasons, such as the ‘Battle of the Bastards’ or ‘Hardhome’ – the core visual effects team is particularly proud of the fact that the show’s creators trusted them with going big. “Dan and David wanted to max out and go for the gusto,” says Kullback. “While we were not doing anything we hadn’t really done before, we were doing so much of it, and so much of it is so complex that it was really frightening. Ultimately, their support and participation has been the linchpin to us being able to do our jobs, because they give us an enormous amount of creative freedom, and the freedom to execute it in the way that we think is the right way to do it.”

52 • VFXVOICE.COM SUMMER 2019

SUMMER 2019 VFXVOICE.COM • 53




COVER

VFX Vendors on Being Part of Thrones

VFX Voice asked several key contributors to Game of Thrones to reflect on their involvement in the groundbreaking visual effects for the show.

“It was a fantastic experience to work for Seasons 2 to 7 on Game of Thrones and being part of virtually re-inventing VFX for TV. Steve, Joe and Rainer (for Season 2) always pushed the limits of what used to be possible for TV and they were able to set new standards for the whole industry.” —Heiko Burkardsmaier, VFX Executive Producer/Head of Business & Legal Affairs, Mackevision

“The Third Floor has had a unique scope of work as an ‘information hub’ on Game of Thrones, where we’ve synthesized information from multiple departments, built it into the previs and then used that to help develop essential on-set data and shooting solutions.” —Michelle Blok, Previs Supervisor, The Third Floor

“Working on a show of this scale and recognition is an amazing experience because you feel you become a part of the entire movement that the audience is so excited about. Much of our work revolved around Winterfell, which makes it all the more exciting for us, since we got to be a part of realizing one of the show’s most significant locations.” —Seth Hill, Visual Effects Supervisor, Atomic Fiction/Method Studios

“I’m very grateful that we could raise and accompany the dragons for such a long time. From babies to grown-ups we could go on adventures with them – from learning to blast fire, flying, hunting, diving, and fighting to intimate moments. Being on this fantastic journey with Joe and Steve over all these years – a rarity in our current industry – makes us very proud!” —Sven Martin, Visual Effects Supervisor, Pixomondo

“Making the journey from the claustrophobic hand-to-hand fighting scenes in ‘Hardhome’ to the vast mayhem in the frozen lake scenes, with its army of thousands of wights, extensive dragon destruction simulations, full CG scenic environments, and the final ice dragon climax, pushed us beyond our limits to grow both as a company and also personally as professionals.” —David Ramos, Visual Effects Supervisor, El Ranchito

“The evolution of Daenerys riding Drogon was the most interesting to me. In Season 6, we began to pre-animate those riding sequences in the ‘Meereen’ attack. That gave us some pretty dynamic flying shots that were converted to moco camera moves. They were played back on set when filming Emilia Clarke sitting on a simple buck that approximated Drogon’s back. In Season 7, we went a step further in the Ice Lake.” —Derek Spears, Visual Effects Supervisor, Rhythm & Hues

“We knew that there was nothing like it on television – and there still isn’t. Game of Thrones was the first series to aim for the same quality of visual effects as feature films, but on an episodic schedule. The show formed Rodeo FX as a company, and it is a defining project for many people here.” —Matthew Rouleau, Visual Effects Supervisor, Rodeo FX

“The most gratifying aspect of Game of Thrones was the personal and professional satisfaction achieved by the team when they overcame creative and technical hurdles. For us, it’s built a sense of camaraderie and pride and cemented friendships that will last far beyond when the final episode fades to black.” —Thomas Schelesny, Visual Effects Supervisor, Image Engine

“When I was asked if I’d like to bring a large, fiery, zombie polar bear to life, it must have been a good half-second before I said ‘Yes.’ When the episode aired, my phone went crazy. I have never worked on a show that has had a reaction like that, and I apparently earned a huge amount of street cred with nephews and nieces.” —Wayne Stables, Visual Effects Supervisor, Weta Digital

“Not only was it a thrill to work on a show of that caliber, but the ‘Battle of the Bastards’ has become one of the most iconic sequences of the whole series. It was hugely challenging as our CG characters had to blend seamlessly into the plate photography, oftentimes very close to camera and very inspectable.” —Glenn Melenhorst, Visual Effects Supervisor, Iloura

“Contributing to Game of Thrones is truly a rare experience that I will never forget. Not only are you on a mission to delight audiences and show them something they’ve never seen before, but you are also entering a continuum of storytelling and filmmaking that needs to be studied and honored.” —Ryan Tudhope, Visual Effects Supervisor, Atomic Fiction/Method Studios

“Over the years the show has played a big part of mine and SSVFX’s life delivering over 1,600 shots. The passion and enthusiasm from HBO, the showrunners and all departments from day one have been the intoxicating key factors that, in my opinion, have led to the super successful series it has become.” —Ed Bruce, Visual Effects Supervisor, SSVFX

“Daenerys, Drogon and the Dothraki battling the Lannister army was probably the most hotly-anticipated sequence I’ve ever had the pleasure – and pressure – of working on. This series redefines what is possible in VFX for the small screen with every new season.” —Josh Simmonds, Visual Effects Supervisor, Iloura

54 • VFXVOICE.COM SUMMER 2019

SUMMER 2019 VFXVOICE.COM • 55




PROFILE

2019 VES Creative Excellence Award Winner

D.B. WEISS AND DAVID BENIOFF: SCHOLARLY PYROMANCERS OF GAME OF THRONES By JIM McCULLAUGH

TOP LEFT: D.B. Weiss and David Benioff receive the VES Award for Creative Excellence at the 17th Annual VES Awards.

56 • VFXVOICE.COM SUMMER 2019

When D.B. Weiss and David Benioff accepted the VES Award for Creative Excellence at the 17th Annual VES Awards in February, they readily acknowledged all the VFX practitioners who contributed to the show. “When George R.R. Martin wrote the books on which Game of Thrones is based,” commented Benioff, “he did so to escape the restraints that film and TV imposed on the imagination. When we first met him, he told us point blank it was meant to be unfilmable. But because of Visual Effects Producer Steve Kullback and Visual Effects Supervisor Joe Bauer, we filmed it.” He continued, “Steve and Joe have consistently and stubbornly refused to recognize limitations. They have never been about ‘if.’ They have only been about ‘how.’ They and their team made wights, dragons, and dire wolves feel as real as the actors. Working with these gentlemen has been straight joy from the first to the last. This award rightly belongs to them. We write about dragons flying around burning shit – and they do it all.”

It’s more than fair to say that Daniel Brett Weiss and David Benioff changed the nature of TV and visual effects when they became co-creators and showrunners of HBO’s Game of Thrones. Not only is Game of Thrones, which debuted in April 2011, a smash hit worldwide, but its final 2019 season has been one of the biggest events in TV history. Consider: the series has received 308 awards from various industry groups and cinematic institutions and has been nominated 506 times. It has become the gold standard of TV (and maybe even cinema) special effects.

Weiss hails from Chicago. He earned a degree from Wesleyan University and went on to Trinity College Dublin, where he earned a Master of Philosophy in Irish Literature, writing his thesis on James Joyce’s Finnegans Wake. After that he earned a Master of Fine Arts in creative writing at the Iowa Writers’ Workshop. It was in Dublin in 1995 that Weiss first met David Benioff.
David Benioff hails from New York City. He attended Dartmouth College and went on to Dublin’s Trinity College, where he, too, studied Irish literature, writing his thesis on Samuel Beckett. He then attended the Creative Writing Program at the University of California, Irvine, where he earned a Master of Fine Arts in Creative Writing.

After their academics, both Weiss and Benioff tried their hands at professional writing. Benioff’s first published novel was The 25th Hour, which caught the attention of Tobey Maguire. Benioff adapted it into a screenplay for a movie by director Spike Lee. After that he published a short story collection called When the Nines Roll Over. Next up for Benioff was a screenplay he wrote for the film Troy. He then wrote screenplays for the films Stay and The Kite Runner, and co-wrote X-Men Origins: Wolverine. Meanwhile, D.B. Weiss’s trajectory after school was to work as a personal assistant to Eagle Glenn Frey while later reuniting with Benioff in Santa Monica, California in 1998 where they co-wrote and produced several scripts. Weiss wrote a novel in 2003 called Lucky Wander Boy. The duo really took off in 2011 when they intersected with author George R.R. Martin and HBO to adapt Martin’s book A Song of Ice and Fire. In addition to writing most of the episodes of the series, each has directed an episode. They co-directed the series finale. The future bodes well for the pair. The duo has been linked to another HBO series called Confederate. Disney has also announced that the pair will write and produce a series of Star Wars films. Benioff and Weiss have become two of Hollywood’s most formidable writers, producers and screenwriters. As Weiss and Benioff reflect back on their careers to date, they appear to have heeded the words of George R.R. Martin from his book A Dance with Dragons: “If you want to conquer the world, you best have dragons.”

TOP LEFT: D.B. Weiss (left) and David Benioff on the set of Season 5. (Photo courtesy of HBO. Photo: Macall B. Polay) TOP RIGHT: David Benioff (left) and D.B. Weiss on the set. (Photo courtesy of HBO. Photo: Helen Sloan) MIDDLE: David Benioff (left) and D.B. Weiss discuss a shot during Season 5. (Photo courtesy of HBO. Photo by Helen Sloan) BOTTOM: Actress Emilia Clark, aka Daenerys Targaryen, enjoys a light moment with D.B. Weiss (middle) and David Benioff on the set during Season 7.

SUMMER 2019 VFXVOICE.COM • 57


PROFILE

2019 VES Creative Excellence Award Winner

D.B. WEISS AND DAVID BENIOFF: SCHOLARLY PYROMANCERS OF GAME OF THRONES By JIM McCULLAUGH

TOP LEFT: D.B. Weiss and David Benioff receive the VES Award for Creative Excellence at the 17th Annual VES Awards.

56 • VFXVOICE.COM SUMMER 2019

When D.B. Weiss and David Benioff accepted the VES Award for Creative Excellence at the 17th Annual VES Awards in February, they readily acknowledged the contributions of all the VFX practitioners who worked on the show. “When George R.R. Martin wrote the books on which Game of Thrones is based,” commented Benioff, “he did so to escape the restraints that film and TV imposed on the imagination. When we first met him, he told us point blank it was meant to be unfilmable. But because of Visual Effects Producer Steve Kullback and Visual Effects Supervisor Joe Bauer, we filmed it.” He continued, “Steve and Joe have consistently and stubbornly refused to recognize limitations. They have never been about ‘if.’ They have only been about ‘how.’ They and their team made wights, dragons, and direwolves feel as real as the actors. Working with these gentlemen has been straight joy from the first to the last. This award rightly belongs to them. We write about dragons flying around burning shit – and they do it all.”

It’s more than fair to say that Daniel Brett Weiss and David Benioff changed the nature of TV and visual effects when they became co-creators and showrunners for HBO’s Game of Thrones. Not only is Game of Thrones, which debuted in April 2011, a smash hit worldwide, but its final 2019 season has been one of the biggest events in TV history. Consider: It has received 308 awards from various industry groups and cinematic institutions and has been nominated 506 times. It has become the gold standard of TV (and maybe even cinema) visual effects.

D.B. Weiss hails from Chicago. He earned a degree from Wesleyan University before going on to Trinity College in Dublin, Ireland, where he earned a Master of Philosophy in Irish literature, writing his thesis on James Joyce’s Finnegans Wake. After that he earned a Master of Fine Arts in creative writing at the Iowa Writers’ Workshop. It was in Dublin in 1995 that Weiss first met David Benioff.
David Benioff hails from New York City. He attended Dartmouth College and later went on to Dublin’s Trinity College, where he, too, studied Irish literature, writing his thesis on Samuel Beckett. Later, he attended the Creative Writing Program at the University of California, Irvine, where he earned a Master of Fine Arts in Creative Writing.

After their studies, both Weiss and Benioff tried their hands at professional writing. Benioff’s first published novel was The 25th Hour, which caught the attention of Tobey Maguire. Benioff adapted it into a screenplay for a movie by director Spike Lee. After that he published a short story collection called When the Nines Roll Over. Next up for Benioff was a screenplay he wrote for the film Troy. He then wrote screenplays for the films Stay and The Kite Runner, and co-wrote X-Men Origins: Wolverine. Meanwhile, D.B. Weiss worked after school as a personal assistant to the Eagles’ Glenn Frey before reuniting with Benioff in Santa Monica, California, in 1998, where they co-wrote and produced several scripts. Weiss wrote a novel in 2003 called Lucky Wander Boy.

The duo really took off in 2011 when they joined with author George R.R. Martin and HBO to adapt Martin’s book series A Song of Ice and Fire. In addition to writing most of the episodes of the series, each has directed an episode. They co-directed the series finale.

The future bodes well for the pair. The duo has been linked to another HBO series called Confederate. Disney has also announced that the pair will write and produce a series of Star Wars films. Benioff and Weiss have become two of Hollywood’s most formidable writers, producers and showrunners. As Weiss and Benioff reflect back on their careers to date, they appear to have heeded the words of George R.R. Martin from his book A Dance with Dragons: “If you want to conquer the world, you best have dragons.”

TOP LEFT: D.B. Weiss (left) and David Benioff on the set of Season 5. (Photo courtesy of HBO. Photo: Macall B. Polay) TOP RIGHT: David Benioff (left) and D.B. Weiss on the set. (Photo courtesy of HBO. Photo: Helen Sloan) MIDDLE: David Benioff (left) and D.B. Weiss discuss a shot during Season 5. (Photo courtesy of HBO. Photo: Helen Sloan) BOTTOM: Actress Emilia Clarke, who plays Daenerys Targaryen, enjoys a light moment with D.B. Weiss (middle) and David Benioff on the set during Season 7.

SUMMER 2019 VFXVOICE.COM • 57


LEGEND

ROGER CORMAN: THE GODFATHER OF INDEPENDENT FILM By NAOMI GOLDMAN

Legendary filmmaker Roger Corman is one of the most prolific producers and directors in the history of cinematic entertainment. A trailblazer in independent film, his creative work and output of drive-in classics have earned him monikers such as “The Pope of Pop Culture” and “The King of the B Movies.” At 92 years old, with a storied career spanning more than six decades, Corman has produced upwards of 350 feature films, with a stunning track record of turning a profit on all but a handful of his wildly inventive features.

Corman’s filmography boasts one inspired title after another, including The Beast with a Million Eyes, Slumber Party Massacre, The Devil on Horseback, Swamp Women, Caged Heat, Night Call Nurses, Frankenstein Unbound, and cult classics Death Race 2000 and The Little Shop of Horrors. His diverse slate also includes House of Usher, part of his critically acclaimed adaptations of Edgar Allan Poe stories starring Vincent Price; The Intruder, a serious look at racial integration featuring William Shatner in his film debut; The Wild Angels, which kicked off the ‘biker movie’ cycle; and The Trip, which began the psychedelic film wave of the late 1960s.

In 1964, Corman became the youngest filmmaker to have a retrospective at the Cinémathèque Française. He has also been honored with retrospectives at the British Film Institute and the Museum of Modern Art. For his influence on modern American cinema and his ability to nurture aspiring filmmakers, he received an Honorary Academy Award in 2009.

ORIGIN STORY

TOP: Corman directing Peter Fonda and Nancy Sinatra in The Wild Angels. BOTTOM: Corman on The Wild Angels set with Peter Fonda.

58 • VFXVOICE.COM SUMMER 2019

Corman was born in 1926 in Detroit, Michigan. As a young boy, he and his friends were regulars at the local theaters on Saturday afternoons, primed to catch double features. The reissue of Frankenstein and an English science fiction film, Things to Come, were among the first films to spark his imagination. Corman initially planned to follow in his father’s footsteps and pursue civil engineering, but while studying at Stanford University his interests shifted.

“I was writing for the Stanford Daily and found out that the film critics got free passes to all the theaters in Palo Alto, and one was graduating. So I wrote a few reviews and was taken on as a critic. Films had been just entertainment, but now I began to analyze them. I was more interested in this, but I was graduating and earned my engineering degree. I was the failure of my Stanford class. After three days at U.S. Electrical Motors, I quit and got the worst job among my peers – as a messenger at 20th Century Fox, delivering the mail for $32.50 a week. But it was pure passion.

“At Stanford,” Corman continues, “I enlisted in the Navy College Training Program. When I left Fox, I still had time left on the G.I. Bill, which covered education costs for service veterans. I spent a term at Oxford University before coming back home with a drive to become a screenwriter and producer. I landed a job as an assistant to literary agent Dick Hyland and started writing under a pseudonym. When I sold a script with someone else’s name on it and it came back around, Dick laughed and said, ‘As long as we get the 10% commission, it’s okay!’”

In 1953, Corman sold his first script, The House in the Sea, which was filmed and released as Highway Dragnet. “From that sale and other outreach, I raised a grand total of $12,000 and used it to make a low-budget film that we shot in six days, called It Stalked the Floor, which was changed to Monster from the Ocean Floor. It did well, and then I produced The Fast and the Furious – a valuable title that served me well in later deals!”

He went on to broker a multi-picture deal with American Releasing Corp., later renamed American International Pictures (AIP). With Corman – who had no formal training – as its lead filmmaker, AIP became one of the most successful independent studios in cinema history. Corman first took to the director’s chair with Five Guns West and, over the next 15 years, directed 53 films. He quickly earned a reputation for churning out low-budget films on a lightning-fast turnaround.

Upon leaving AIP, he decided to focus on production and distribution through his own company, New World Pictures. “I had always admired the great auteurs and had a new model for the profitable regional distribution of art cinema, in addition to my films. I reached out to Ingmar Bergman to take Cries and Whispers under my cost-and-profit-sharing approach, and that started the flow of foreign films.” New World became the American distributor of the films of Bergman, Akira Kurosawa, Federico Fellini, François Truffaut and others. In a 10-year period, New World Pictures won more Academy Awards for Best Foreign Film than all other studios combined.

Corman sold New World Pictures in the 1980s, but continued his work through various companies over the years – Concorde Pictures, New Horizons, Millennium Pictures and New Concorde.

TOP LEFT: Corman with protégé Ron Howard. TOP RIGHT: Jonathan Demme directs Corman’s cameo in The Silence of the Lambs. BOTTOM: Corman directing John Hurt in Frankenstein Unbound.

STANDOUT CINEMA

Corman describes what led to The Little Shop of Horrors. “I had just made A Bucket of Blood (1959), combining horror with some humor, and it did very well. I had a chance to use an adjacent set about to be torn down, so I went full steam in making a comedy with a little bit of horror thrown in – that was The Little Shop of Horrors. We used a lot of the same cast and shot it in two days on a shoestring budget, finding that sweet spot between horror and humor.”

Corman shared that The Little Shop of Horrors (1960) was almost a joke. “I had made a horror film where I carefully planned a sequence to make the audience scream. The key to that type of shooting isn’t the moment of screaming – it’s the building up and the breaking of that tension. It worked perfectly, but after they screamed there was laughter. I wondered, what did I do wrong? I understood it was appreciative laughter and started developing a theory on the relationship between comedy and horror.”

Corman cites Battle Beyond the Stars (1980) as his first attempt at making a bigger-budget sci-fi film with extensive special effects. “When Star Wars came out, I had great admiration for the film. But I thought, wow, we are in trouble, because this is what we had been doing, only bigger and better. The only way to compete was to raise my budgets and run my own studio. So I pre-sold some producing rights to Warner Bros., put up the other half of the money to buy a site [in Venice, California] and converted it into a sound stage. The Lumber Yard was born.”

Death Race 2000 (1975) remains one of Corman’s favorite films, based on a short story about a futuristic race in which the drivers don’t only drive to win, but to knock other cars off the road. “I started thinking about violence and gladiator games and the bloodthirsty role of the audience. My idea was for drivers to get points for knocking off other cars and for how many pedestrians they could kill. It was a huge success and was voted somewhere as the greatest B picture of all time. To this day, when I hear ‘$20 for the little old lady in the crosswalk,’ I still chuckle.”

SUMMER 2019 VFXVOICE.COM • 59


LEGEND

“From that sale [of his first script in 1953] and other outreach, I raised a grand total of $12,000 and used it to make a low-budget film that we shot in six days, called It Stalked the Floor, which was changed to Monster from the Ocean Floor. It did well, and then I produced The Fast and the Furious – a valuable title that served me well in later deals!” —Roger Corman

TOP: Corman directing John Hurt in Frankenstein Unbound. BOTTOM: Corman with Bridget Fonda and John Hurt on the set of Frankenstein Unbound.

THE SCHOOL OF CORMAN

Legions of filmmakers and actors got their start with Corman, who has a remarkable penchant for spotting and cultivating talent. Among his protégés are Francis Ford Coppola, Ron Howard, James Cameron, Martin Scorsese, Jack Nicholson, Sandra Bullock, Robert De Niro, Peter Bogdanovich, Jonathan Demme, Dennis Hopper, Sylvester Stallone, William Shatner, John Landis, and even current VES Board of Directors Chair Mike Chambers. Many of them have paid their respects by giving him cameos in their films, including The Silence of the Lambs, The Godfather Part II, Apollo 13, The Manchurian Candidate and Philadelphia.

60 • VFXVOICE.COM SUMMER 2019

He speaks with a sense of parental pride about some of his alumni.

Francis Ford Coppola: “In the late 1950s, I saw some beautiful Russian science fiction films. I went to Moscow and bought the American distribution rights. But the films had a lot of anti-American propaganda in them. I called the UCLA Film School and asked for the most promising editor among their graduate students, and they sent over Francis Ford Coppola. So Francis’ first job was cutting the anti-American propaganda out of Russian science fiction films.

“I was shooting a Grand Prix Formula One picture in Europe called The Young Racers and Francis was the 2nd AD, and we had rebuilt a Volkswagen microbus into a traveling mini studio. Once the picture wrapped, since I had the nucleus of a crew and the costs were covered, I decided to make another film and gave Francis the opportunity to make his directorial debut with Dementia 13.”

Jack Nicholson: “When I started directing, my engineering background enabled me to learn the camera and editing fairly quickly, but I didn’t know enough about acting. So I enrolled in a method acting class. Jack was in it and was just 19 years old, but clearly the best actor in the class, so I gave him his first role in The Cry Baby Killer. And he was on his way.”

James Cameron: “He started with me as a model maker for Battle Beyond the Stars, and is the only person I ever gave a raise to and promoted in the middle of a picture. To show his ability on an outer space picture, he had designed the spaceship, but there was an issue with one of the walls of the craft. The next day, Jim went to a store, got some cartons, glued them, sprayed them with aluminum paint and added some dials, but was concerned that we went over budget to solve the problem – by spending $12. It was that talent – solving a problem for $12 – that ultimately made Titanic look like a film with double the production budget.”

TOP: Corman at work at New Horizon Pictures. BOTTOM: Corman with Vincent Price, star of The House of Usher.

ON TECHNOLOGY AND IMAGINATION

“Visual effects were a part of our early films, but we just built the creatures and put people in the suits. We moved into other effects using optical printing and matte shots, and we used process shots – when filming someone driving a car, you would shoot the moving background and then put the car in front of the process screen.

“I remember a reissue of a Hitchcock film that used process shooting, and the audience laughed at it. No one laughed when the original came out, but today those shots look so crude. The work being done today digitally is the best that’s ever been done. But 50 years from now, people might laugh at what we consider state of the art. Now that we can create anything you can imagine with CG and technology, I think sometimes the special effects are emphasized over the story. It should still be about effects serving the narrative.

“And I think some elements in horror films should be left to the imagination. Like the outrageous films with people cutting off their limbs. To me, that’s a crude way to get a shock. You should set shots up, but leave it to the audience’s imagination, because everyone perceives things differently. In that way, audience members are left to create the horror that fits their psyche. Five hundred audience members, that’s 500 different interpretations – a creative storyteller’s playground.”

SUMMER 2019 VFXVOICE.COM • 61


LEGEND

“Now that we can create anything you can imagine with CG and technology, I think sometimes the special effects are emphasized over the story. It should still be about effects serving the narrative.” —Roger Corman

LAST WORDS

“I’m still making one or two films per year, and my last one was made with the Chinese equivalent of Netflix. I welcome technical advancements and new venues for more films to be made and seen.

“What continues to drive me is story. It’s the originality of the idea, knowing that you can never be 100% original. I think it was Picasso who said, ‘If you can add one small thing to art, that is a full career.’ I’m aware of what has come before, but I try to find something new to bring to every project. I’m still dazzled by the opportunity to bring those stories to life.”

TOP LEFT: Director Corman setting up a shot. TOP RIGHT: Corman directing Boris Karloff in The Raven. MIDDLE: Corman presents the Outstanding Visual Effects in a Photoreal Feature award at the 17th Annual VES Awards. BOTTOM: Corman with VES Board Chair Mike Chambers, one of his early protégés, at the 17th Annual VES Awards.

62 • VFXVOICE.COM SUMMER 2019



TV

CROSSING THE SPRAWLING DC TELEVISION UNIVERSE By TREVOR HOGG

Images courtesy of Warner Bros. Entertainment and Encore VFX

64 • VFXVOICE.COM SUMMER 2019

As producer, director and writer, Greg Berlanti has become synonymous with the DC Television Universe by creating Arrow, The Flash, Legends of Tomorrow, Supergirl, Black Lightning, Titans, Doom Patrol and Batwoman. Berlanti has been assisted by the expertise of Encore VFX Executive Creative Director and Visual Effects Supervisor Armen V. Kevorkian. “As far as my role goes, I’m heavily involved with Flash, Supergirl, Titans and Doom Patrol,” explains Kevorkian. “After the first year of personally working on Black Lightning, I handed it off to Kim Rasser, one of our other supervisors here, to maintain that look.” Significant contributions also come from Zoic Studios, with Co-Founder and Executive Creative Director Chris Jones overseeing Arrow and Visual Effects Supervisor Andrew Bardusk looking after Legends of Tomorrow.

The first DC TV show created by Berlanti remains a distinct entity. “Arrow has the vigilante undertone to it and always has strong story arcs over the course of the season,” notes Jones. “Rather than having superpowers, Oliver Queen/Green Arrow [Stephen Amell] is a traditional bad-ass with the ability to shoot people.” The key to the visual effects in Arrow is to always be grounded and organic. “You’ll have story arcs that require big visual effects and others that require them to be in a supporting role.” Stunts are an integral part of the storytelling. “Arrow has a crazy, skilled stunt crew and we’re always in awe of what they are able to pull off practically. There is a lot of digital-double work only because some stunts become extreme and are time-prohibitive to pull off.”

Unique digital assets needed to be created. “Supergirl has more creature work compared to any of our other shows,” notes Kevorkian. “We can’t reuse that asset unless that character crosses over.” Legends of Tomorrow is in a similar situation. “The most challenging creature that we’ve done so far has been the 200-foot-tall octopus in [the Season 4 episode] ‘Tagumo Attacks!!!,’” explains Bardusk. “In the middle of the episode, Tagumo gets hit by the shrink ray that makes him six feet tall. We needed our asset to be look-developed in two different scenarios for the subsurface and shader response. In addition to that, he has eight tentacles, which are difficult to deal with. Something I wasn’t expecting was how difficult it would be to get the suckers to behave correctly. If totally tethered to the tentacle they get stretchy and oblong, which isn’t what happens in nature. The suckers stay circular and protrude. That lone asset had curveball after curveball, but was a lot of fun to figure out.”

Two CG cities are worked on by Encore VFX during downtime to make them even better. “National City in Supergirl has a Los Angeles feel because our first year was in Los Angeles,” remarks Kevorkian. “Then for Flash, Central City is more like Vancouver.” Gotham will be reimagined for Batwoman. “We have the Chris Nolan, Tim Burton and Joel Schumacher versions. The TV series Gotham did it differently. We did a little bit of it in Titans. Now we’re in the process of rethinking Gotham on our end. You stay true to the comics as far as the Gothic design, but pepper it with more modern buildings as well.”

Star City in Arrow is approached differently by Zoic Studios. “Being the original series, Arrow has laid claim to Vancouver as Star City, so there is not a whole lot of digital enhancement necessary,” notes Jones. “A couple of seasons ago we collapsed a large portion of the city and added the Glades. We put in a lot of iconic buildings that are true to the series.”

“A lot of times, I’ll work with our artists on anything that involves R&D, based on what I feel something should look like,” explains Kevorkian. “But then I run it by our executive producers to see if this is going to be good enough, or if it’s what they were thinking. Our EPs are good about approving a first pass and chime in every now and then.” Sometimes the concept art stage is bypassed. “We have a talented modelling department, so I’ll have someone build a quick 3D sculpt of something that can be rendered at 360 degrees.”

For Arrow, R&D focuses on the tools utilized by Oliver Queen. “How do some of the arrows work, whether they’re grappling or bola arrows?” states Jones. “We run physics tests to make sure that the arrows can fly and still be mechanically functional. Then there are large effects that happen, like the explosions towards the end

OPPOSITE TOP LEFT: Armen Kevorkian, Executive Creative Director and Visual Effects Supervisor, Encore VFX, talks to Claudia Doumit on set while directing the Supergirl Season 2 episode “Ace Reporter.” (Photo courtesy of Diyah Pera and Warner Bros. Entertainment) OPPOSITE TOP RIGHT: China Anne McClain, who plays Jennifer Pierce in Black Lightning, has the ability to generate electric shocks and bursts from her hands. OPPOSITE BOTTOM: Cress Williams portrays Jefferson Pierce/Black Lightning, who has an electrical outburst in the Season 2 episode “Gift of the Magi.” TOP LEFT: Matt Bomer portrays military pilot Larry Trainor in the pilot episode of Doom Patrol. TOP RIGHT: The bluescreen is transformed into a B-52 aircraft, which is about to release Larry Trainor on his fateful test flight in the pilot episode of Doom Patrol. BOTTOM: Robotman is voiced by Brendan Fraser, while underneath the prosthetic makeup is Riley Shanahan, standing beside Diane Guerrero, who plays the schizophrenic superhero Crazy Jane in the pilot episode of Doom Patrol.

SUMMER 2019 VFXVOICE.COM • 65



TV

TOP LEFT: Melissa Benoist lies on set in a pristine Supergirl costume in the Season 4 episode “Man of Steel.” TOP RIGHT: The Supergirl costume is replaced by badly damaged armor and the face of Melissa Benoist is digitally augmented with Kryptonite dust travelling through the veins. BOTTOM LEFT: Anthony Konechny plays D.E.O. agent Raymond Jensen in the Supergirl Season 4 episode “Ahisma.” BOTTOM RIGHT: The alien parasite that inhabits the body of Raymond Jensen is digitally added.

of last season, or water dynamics that occur towards the beginning of this season. There is a lot of R&D in that pipeline to make sure those effects work correctly.”

“A lot of the team have been with me since Flash started five years ago,” states Kevorkian. “For the most part they get to work on all of the shows. We definitely have 2D and 3D supervisors who are specific to a show, but our compositors may be working on Flash this week and next week help on Doom Patrol or Supergirl. It’s spread out to the point where no one gets bored working on one show.” Zoic Studios, by contrast, has digital artists working exclusively on a single show. “The Legends and Arrow teams are separate, although the supervisors sit right next to each other,” reveals Jones. “There’s a lot of synergy between them, with Andrew Bardusk previously being the digital effects supervisor on Arrow.”

Every situation is handled depending on the needs of the show at that moment. Before Batwoman started shooting in February, Kevorkian was prepping “certain environments I knew we were going to have to do CG for. I knew I was going to need a digital double of [Ruby Rose, playing Batwoman], so we started building that. You work with the director and get some storyboards to figure out what you’ll need. There was a scene with a car that needed CG help for what the vehicle was required to do. This is where I said, ‘My car library has over 50 vehicles, including a Toyota Prius and Lincoln Town Car. If the practical car can be one of these, then the studio will not have to pay for an asset because we have something built already.’”

“We’re lucky because so much of this is captured practically on set,” notes Jones. “That’s one of the things that has always challenged us, which is to create organic cameras. You want to take your cameras, lay them out as if you were shooting the scene, and use the same lenses, camera speed and camera equipment that they have on set. What we want to try to avoid are cameras whipping around in space for no reason, because they automatically start to feel CG.”

Other elements contribute to the believability factor. “In [the Legends of Tomorrow episode] ‘The Virgin Gary,’ we have this demonic unicorn being sucked into the Hell vortex, courtesy of Constantine [Matt Ryan],” explains Bardusk. “We had to do a matte painting of all of the hoofmarks the unicorn was leaving, as well as dust being kicked up as it tries to claw its way towards Gary Green [Adam Tsekhman]. You spend as much time on all of these extra little details as on the asset itself.”

Stock footage is used as the basis for establishing shots featured in Legends of Tomorrow. “We’ll paint over the stock footage to make it period, cool and exciting,” states Bardusk. “Then we’ll project that onto 3D geometry so we can still give it an interesting camera move.” Greenscreens come up regularly for Legends of Tomorrow. “The art department will build the first story of a building, which will cover all of the actors, and the walk and talks,” notes Bardusk. “A lot of times we have to add a roof to a building, which involves roto. Greenscreen comes up for the time portals when they come in from one location to the other. They’re flexible about popping up a greenscreen on a C-stand on the fly and getting it lit when we need to.”

A major asset is having collaborative executive producers.

TOP: A dragon has an aerial encounter with Supergirl in the Season 4 episode “Call to Action.” BOTTOM: Supergirl meets an alien adversary, an entirely CG creation, in the Supergirl Season 4 episode “Suspicious Minds.”

“A lot of the team have been with me since Flash started five years ago. For the most part they get to work on all of the shows. We definitely have 2D and 3D supervisors who are specific to a show, but our compositors may be working on Flash this week and next week help on Doom Patrol or Supergirl. It’s spread out to the point where no one gets bored working on one show.” —Armen V. Kevorkian, Executive Creative Director and Visual Effects Supervisor, Encore VFX

66 • VFXVOICE.COM SUMMER 2019

SUMMER 2019 VFXVOICE.COM • 67



TV

TOP LEFT: Grant Gustin rushes off as The Flash in the Season 5 episode “Oh Come, All Ye Thankful.” TOP RIGHT: Bolts of electricity are augmented into the shot to emphasize the superhuman speed of The Flash. BOTTOM LEFT: A plate taken from The Flash Season 5 episode “The Icicle Cometh.” BOTTOM RIGHT: The metahuman ability to turn arms into knives is digitally integrated into the live-action performance.

“Before Titans started up in March, I met with the showrunner, Greg Walker. I asked him, ‘What are you thinking about for Season 2?’ He gave me broad strokes of what was coming. That gave me the ability early on to put my team on certain things.” There is also room for discussion. “I’m able to express my opinion and sometimes they’ll take it into consideration. You don’t have to win every single time. Everyone is trying to make the best show possible.”

The hardest aspect of crossover episodes is coordinating the schedules of the cast members. “There are certain looks that we set up, like the Reality Wave, which was more prominent in Supergirl,” explains Kevorkian. “We had to send the Reality Wave to the producers on the other shows to make sure that everybody was onboard. It adds another layer to the approval process, which takes more time.”

Stepping behind the camera is not uncommon for Kevorkian, who has served as both second unit director and director on The Flash and Supergirl. “Season 2 of Flash was the first time that I directed an episode, which happens to be a favorite of mine. It was a great script and had some cool visuals that you could show off. Then I started doing Supergirl and Flash regularly.”

The transition from comic book to television sometimes requires a significant adjustment. “We went back and forth on the tiger in Titans because in the comic books it’s really green,” reveals Kevorkian. “But when you do that in real life it doesn’t feel real. It took time to dial that in and have everybody agree.”

A DC Universe series about a group of superhero misfits is a career highlight. “Doom Patrol is probably the most fun that I’ve had on any show because it’s so out there, and some of the things that they write, you’re like, ‘I don’t think this has ever been created before.’ That makes it interesting for me.” Adds Kevorkian, “Between reading all of these [Doom Patrol] scripts, sometimes I’m like, ‘Where did I read that? Was that in Titans or Supergirl?’ We get scripts every week. It’s a mental challenge to keep up with it, but I’ll get bored if I don’t have that problem!”

TOP LEFT: The face of Rag Doll, portrayed by Troy James, gets digitally inserted into the plate shot taken for The Flash Season 5 episode “All Doll’d Up.” TOP RIGHT: Anna Diop portrays Koriand’r/Starfire in the pilot episode of Titans. BOTTOM: A lasso of energy was created by Encore VFX for the Season 1 finale of Titans.

“We’re lucky because so much of this was captured practically on set. That’s one of the things that has always challenged us, which is to create organic cameras. You want to take your cameras, lay them out as if you were shooting the scene, and use the same lenses, camera speed and camera equipment that they have on set. What we want to try to avoid are cameras whipping around in space for no reason, because they automatically start to feel CG.” —Chris Jones, Co-Founder/Executive Creative Director, Zoic Studios

68 • VFXVOICE.COM SUMMER 2019

SUMMER 2019 VFXVOICE.COM • 69



TV

EDGY, EERIE VFX REFLECTIONS IN BLACK MIRROR By TREVOR HOGG

Images courtesy of Netflix and Russell McLean.

70 • VFXVOICE.COM SUMMER 2019

There is never a dull moment working on the visual effects for the Primetime Emmy-lauded Netflix science fiction anthology series Black Mirror. The series examines the dark side of technology and human nature, whether that means unleashing a mechanical dog-like creature, exploring space with the USS Callister or producing the standalone, interactive psychological thriller Black Mirror: Bandersnatch.

“DNEG did an amazing job with the robot in ‘Metalhead,’ which is definitely one of the best visual effects that I’ve worked on,” states Black Mirror VFX Producer Russell McLean, who was part of the team that received a BAFTA Award for Best Special, Visual and Graphic Effects for the episode, which aired on December 29, 2017.

Producing Black Mirror: Bandersnatch was a unique and complicated task, as viewers are given the ability to make narrative choices. “When I first started with Black Mirror on ‘The Waldo Moment’ as the animation producer, the Waldo character had to be in real-time because we couldn’t afford to do all of those as screen composites. We also needed to have interaction between Waldo and the cast. The budgets were smaller, but the ambition was still there. Visual effects started as one of the last things to be discussed, but in Season 3 they became a lot more central to the storytelling.”

Black Mirror Executive Producers Charlie Brooker and Annabel Jones never let the visual effects overshadow the storytelling. “They’re always focused on the narrative and making sure that it is clear and working,” notes McLean. “Visual effects have become a big part of telling those stories.”

Everything needs to be grounded in reality, McLean explains. “The temptation when you’ve got a CG character is to cheat with the camera. David Slade, who directed ‘Metalhead,’ never wanted to have any shots that you couldn’t have if that was a real robot running in front of you. David shoots with a narrow depth of field as well, so we definitely wanted to make sure that was reflected in the focus pulls on the CG character. We were able to record the focus pulls, which allowed us to replicate them. [Capturing] that element of error where the focus is moving across the CG object helps it to feel natural.”

McLean says a common topic of conversation was the reliance on several visual effects vendors as opposed to a single one. “Each episode is trying to do something totally different, so you want to go where the people who have the right skills for that particular thing are. Also, there’s never enough time, because the episodes are getting written quite quickly, and before the shoot things scale up.”

Post-production time varies with the episodes. “They’re all getting aired at the same time, so the last episode to shoot is always the nightmare one, which was ‘Black Museum’ for Season 4,” remarks McLean. “We shot ‘USS Callister’ first in Season 4 because that had all of the space effects. ‘Metalhead’ was the fifth one to shoot, so that was quite trying, as we had two months from picture lock to finish the post.”

Visual effects shot counts range from 150 to 600 depending on the needs of the episode. “Some things you know that you’re going to have to do, like the killer bees in ‘Hated in the Nation,’ so we can get going with those designs and concepts straightaway. After Charlie had written the first draft of ‘Playtest,’ I was told it was going to require a couple of digital matte paintings and a CG normal-sized spider. Then Dan Trachtenberg came on board as the director, and that expanded into a human spider creature, glitching rooms, and the gopher sequence. We were conceptualizing as quickly as we could for the spider creature and trying to figure it out as we were going.

“I started on Bandersnatch at the beginning of January 2018 and we delivered in the middle of October 2018, which is 10 and a half months,” he says. “Bandersnatch has 250 visual effects shots that appear in 750 different places. Everything with Bandersnatch became an admin problem, making sure that we were only delivering final shots to Technicolor, because if you have to update a shot that’s in 20 different places it becomes a nightmare. We managed to turn over some sequences quite early, like the mirror sequence, which took about three months. Part of the advantage of having an edit that’s built up of segments is that we could lock segments as we went, instead of waiting for the whole thing to be locked.”

“The temptation when you’ve got a CG character is to cheat with the camera. David Slade, who directed ‘Metalhead,’ never wanted to have any shots that you couldn’t have if that was a real robot running in front of you. David shoots with a narrow depth of field as well, so we definitely wanted to make sure that was reflected in the focus pulls on the CG character. We were able to record the focus pulls, which allowed us to replicate them. [Capturing] that element of error where the focus is moving across the CG object helps it to feel natural.” —Russell McLean, VFX Producer

OPPOSITE TOP: Russell McLean, VFX Producer, Black Mirror and Black Mirror: Bandersnatch OPPOSITE BOTTOM: The Pax Demon in Bandersnatch was created practically with makeup design by Kirstin Chalmers and prosthetics created by Mark Coulier. TOP: The artificial substitute bees featured in the “Hated in the Nation” episode were conceptualized by Painting Practice, while the visual effect was produced by Jellyfish Pictures under the supervision of Dave Cook and Luke Dodd.

SUMMER 2019 VFXVOICE.COM • 71


TV

EDGY, EERIE VFX REFLECTIONS IN BLACK MIRROR By TREVOR HOGG

Images courtesy of Netflix and Russell McLean.

70 • VFXVOICE.COM SUMMER 2019

There is never a dull moment working on the visual effects for the Primetime Emmy Award-winning Netflix science fiction anthology series Black Mirror. The series examines the dark side of technology and human nature – whether through a mechanical dog-like creature, a voyage through space aboard the USS Callister or the standalone, interactive psychological thriller Black Mirror: Bandersnatch. “DNEG did an amazing job with the robot in ‘Metalhead,’ which is definitely one of the best visual effects that I’ve worked on,” states Black Mirror VFX Producer Russell McLean, who was part of the team that received a BAFTA Award for Best Special, Visual and Graphic Effects for the episode, which aired on December 29, 2017.

Producing Black Mirror: Bandersnatch was a unique and complicated task, as viewers are given the ability to make narrative choices. “When I first started with Black Mirror on ‘The Waldo Moment’ as the animation producer, the Waldo character had to be in real-time because we couldn’t afford to do all of those as screen composites. We also needed to have interaction between Waldo and the cast. The budgets were smaller, but the ambition was still there. Visual effects started as one of the last things to be discussed, but in Season 3 they became a lot more central to the storytelling.”

Black Mirror Executive Producers Charlie Brooker and Annabel Jones never let the visual effects overshadow the storytelling. “They’re always focused on the narrative and making sure that it is clear and working,” notes McLean. “Visual effects have become a big part of telling those stories.”

Everything needs to be grounded in reality, McLean explains. “The temptation when you’ve got a CG character is to cheat with the camera. David Slade, who directed ‘Metalhead,’ never wanted to have any shots that you couldn’t have if that was a real robot running in front of you. 
David shoots with a narrow depth of field as well, so we definitely wanted to make sure that was reflected in the focus pulls on the CG character. We were able to record the focus pulls, which allowed us to replicate them. [Capturing] that element of error where the focus is moving across the CG object helps it to feel natural.”

McLean says a common topic of conversation was the reliance on several visual effects vendors as opposed to a single one. “Each episode is trying to do something totally different, so you want to go where the people who have the right skills for that particular thing are. Also, there’s never enough time because the episodes are getting written quite quickly, and before the shoot things scale up.”

Post-production time varies with the episodes. “They’re all getting aired at the same time, so the last episode to shoot is always the nightmare one, which was ‘Black Museum’ for Season 4,” remarks McLean. “We shot ‘USS Callister’ first in Season 4 because that had all of the space effects. ‘Metalhead’ was the fifth one to shoot, so that was quite trying as we had two months from picture lock to finish the post.”

Visual effects shot counts range from 150 to 600 depending on the needs of the episode. “Some things you know that you’re going to have to do, like the killer bees in ‘Hated in the Nation,’ so we can get going with those designs and concepts straightaway. After Charlie had written the first draft of ‘Playtest,’ I was told it was going to require a couple of digital matte paintings and a normal-sized CG spider. Then Dan Trachtenberg came on board as the director, and that expanded into a human spider creature, glitching rooms, and the gopher sequence. We were conceptualizing as quickly as we could for the spider creature and trying to figure it out as we were going.

“I started on Bandersnatch at the beginning of January 2018 and we delivered in the middle of October 2018, which is 10 and a half months,” he says. “Bandersnatch has 250 visual effects shots that appear in 750 different places. Everything with Bandersnatch became an admin problem and making sure that we were only delivering final shots to Technicolor, because if you have to update a shot that’s in 20 different places it becomes a nightmare. We managed to turn over some sequences quite early, like the mirror sequence, which took about three months. Part of the advantage of having an edit that’s built up of segments is that we could lock segments as we went, instead of waiting for the whole thing to be locked.”
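Tracking 250 shots across 750 placements is, as McLean says, an admin problem. A minimal sketch of that bookkeeping might map each shot to every segment it appears in, so one shot update can be propagated everywhere and a segment can be locked as soon as all of its shots are final. This is an illustration only, not the production’s actual tooling; the shot and segment names are invented:

```python
# Illustrative bookkeeping: map each VFX shot to every edit segment it
# appears in, so updates propagate and segments can lock early.

from collections import defaultdict

class ShotTracker:
    def __init__(self):
        self.placements = defaultdict(set)  # shot_id -> {segment ids}
        self.final = set()                  # shots approved for delivery

    def place(self, shot_id, segment_id):
        self.placements[shot_id].add(segment_id)

    def mark_final(self, shot_id):
        self.final.add(shot_id)

    def segments_needing_update(self, shot_id):
        """Every segment that must be re-delivered if this shot changes."""
        return sorted(self.placements[shot_id])

    def deliverable_segments(self):
        """Segments whose shots are all final and can be locked early."""
        all_segments = set().union(*self.placements.values()) if self.placements else set()
        blocked = set()
        for shot, segs in self.placements.items():
            if shot not in self.final:
                blocked |= segs
        return sorted(all_segments - blocked)

tracker = ShotTracker()
tracker.place("mirror_010", "seg_mirror_a")
tracker.place("mirror_010", "seg_mirror_b")  # same shot reused in another branch
tracker.place("balcony_020", "seg_jump")
tracker.mark_final("mirror_010")

print(tracker.segments_needing_update("mirror_010"))  # ['seg_mirror_a', 'seg_mirror_b']
print(tracker.deliverable_segments())                 # ['seg_mirror_a', 'seg_mirror_b']
```

The point of the sketch is the one-to-many mapping: a shot that is “in 20 different places” has a single record, so fixing it never requires hunting the edit by hand.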

OPPOSITE TOP: Russell McLean, VFX Producer, Black Mirror and Black Mirror: Bandersnatch OPPOSITE BOTTOM: The Pax Demon in Bandersnatch was created practically with makeup design by Kirstin Chalmers and prosthetics created by Mark Coulier. TOP: The artificial substitute bees featured in the “Hated in the Nation” episode were conceptualized by Painting Practice, while the visual effect was produced by Jellyfish Pictures under the supervision of Dave Cook and Luke Dodd.




TOP LEFT: A look from concept through development of the “Metalhead” asset created by DNEG under the supervision of Michael Bell. TOP RIGHT: A shot breakdown of a scene in the “Metalhead” episode, which won a BAFTA Award for Best Special, Visual and Graphic Effects. BOTTOM LEFT: For the “Playtest” episode, the gopher was created by Glassworks, while SpiderPeters and the fight were produced by Framestore under the supervision of Justin Hutchinson-Chatburn. BOTTOM RIGHT: The Arachnajax creature that appears in the “USS Callister” episode was produced by Framestore under the supervision of Russell Dodgson.

Previs was created only for the mirror and jumping-off-the-balcony sequences. “The script was evolving, so there wasn’t time to do competitive bidding on shots or speak to different companies,” remarks McLean. “We did it all with Duncan Malcolm, who is a visual effects supervisor at Glassworks. I’ve worked with him for years, and he has done quite a lot of sequences for Black Mirror. I wanted just one person to deal with and trust to do a good job, and come to a good agreement on price. I also needed someone who would not freak out when I said, ‘Okay, that scene is gone. Now we’ve got this scene.’”

The mirror sequence came into the script quite late. “There were two mirrored sets, so David Slade had the freedom to put the camera anywhere. There are some shots where the back of my head is dressed in the same costume as Stefan Butler [Fionn Whitehead] while he’s taking the pills. For the crawl-through, two witness cameras were jam-synced to shoot the reflection pass. Then we scanned Stefan and projected the reflection passes onto the geometry.”

With the story set in 1984, the video game technology had to reflect that time period. “Almost all of the ZX Spectrum games were pre-made ahead of time and in-camera,” reveals McLean. “Clayton McDermott, who has worked on Black Mirror since

the beginning, did the graphics. He understands Charlie and pretty much at his first attempt was able to create the games of ‘Metalhead,’ ‘Bandersnatch’ and ‘Nosedive.’

“Based on Clayton’s design, we had someone else make the ‘Nosedive’ game that you can download if you get a certain sequence of things. At the end of the credits, Stefan is listening to a ZX Spectrum tape that you can record and play into an emulator. The ‘Nosedive’ game is actually quite good. After the post process, when I was speaking to the ZX Spectrum programmer about making the ‘Nosedive’ game, I asked him to make the ‘Metalhead’ and ‘Bandersnatch’ games as well. He gave ‘Metalhead’ a shot, but it’s hard to get all of that into 48K – 48K isn’t even a text email for us now.”

A demon known as the Pax makes an appearance in Bandersnatch. “There are no visual effects there,” states McLean. “Mark Coulier made the prosthetics for the Pax, and Kirstin Chalmers was the Makeup Designer. When you have someone in makeup for seven hours – for half an hour on set – it seems quite extravagant, but they’re important jump scares. I love the fight scene in the kitchen that takes place in the ‘Playtest’ episode, done by stunt coordinator Andy Bennett. When the stuntwoman smashed into the dresser with all of the crockery, we all sat behind the monitor and thought that she had really hurt herself. Then she jumps up and says, ‘How’s that?’ It looked so real and was terrifying to watch.”

Blood and gore are part of the visual language. “David Slade would say that you could never go too far with gore, as he’s always asking for more blood!” states McLean. “It’s a bit of a shame that the chopping up of the dad in the bath scene had to be shortened to make the rhythm of it work. There are no visual effects in that at all. The severed head is the actor in the dresser, so we had to paint out his shoulder. David is not keen on stuff like that as a prosthetic. He thinks you’ll never believe it.”

The year 1984 needed to be recreated in the production design. “There was a lot of bad taste in 1984,” observes McLean. “The suburban houses were horrible. The temptation is to go to the 1970s, which was a bit more attractive, and also Black Mirror has a design aesthetic that looks really good. We had to find the

TOP: Exteriors of the USS Callister spaceship, which were produced by Framestore under the supervision of Russell Dodgson. BOTTOM: Bandersnatch allows viewers to make narrative choices that not only impact story points, but also what television ads and songs appear later on.

“There were two mirrored sets, so David Slade had the freedom to put the camera anywhere. There are some shots where the back of my head is dressed in the same costume as Stefan Butler [Fionn Whitehead] while he’s taking the pills. For the crawl-through, two witness cameras were jammed-synced to shoot the reflection pass. Then we scanned Stefan and projected the reflection passes onto the geometry.” —Russell McLean, VFX Producer






“As a producer, it felt like I was halfway between a software development project and a filmmaking process.” —Russell McLean, VFX Producer

TOP: The house where Stefan Butler (Fionn Whitehead) lives in Bandersnatch was a combination of a practical location and studio set.

“I love the fight scene in the kitchen that takes place in the ‘Playtest’ episode, done by stunt coordinator Andy Bennett. When the stuntwoman smashed into the dresser with all of the crockery, we all sat behind the monitor and thought that she had really hurt herself. Then she jumps up and says, ‘How’s that?’ It looked so real and was terrifying to watch.” —Russell McLean, VFX Producer


right locations that worked for us. The shopping center and WHSmith [British retail locations] were quite easy because that stuff is there.” The home that Stefan lives in was a hybrid of location and set. “We looked at 50 houses until we found Stefan’s house. It needed to be a suburban house, have interesting features, be middle class, and not look too fancy.”

The only big CG environment for Bandersnatch was the ground beneath the tower in which Colin Ritman (Will Poulter) resides, which required a 2.5D digital matte painting. “We shot on the 24th floor, and there was scaffolding up to the 18th floor, so there were clean-up and CG elements for when Stefan and Colin jump from the balcony.”

Two and a half hours of original material is in the final edit; however, the duplication of segments resulted in the delivery of five and a quarter hours’ worth of material. “Our shooting script was 166 pages,” reveals McLean. “The scale of Bandersnatch involved a horrible number of spreadsheets! There were not two big visual effects sequences or one creature to do, but a lot of smaller and different visual effects. It was managing that across everything that was a tricky task.”

Netflix developed some new software for the project called Branch Manager, which allows for the writing of interactive stories. “We worked closely with the engineering team at Netflix to establish our parameters: how long we needed between choices so there is no buffering, and how the choice points and scene tracking were going to work. There were weekly calls. We would come up with something and ask, ‘Are you able to do this or that?’ They never said, ‘No.’ Initially, we needed a minute between choice points to allow for buffering, and that ended up being 30 seconds as they improved on their end. As a producer, it felt like I was halfway between a software development project and a filmmaking process.”

Early decisions by the viewer impact the narrative later on. 
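The buffering constraint McLean mentions – enough running time before each choice point for the player to preload both branches – can be sketched as a simple check over a segment graph. This is an illustration only; the segment names and durations below are invented, and Branch Manager itself is Netflix-internal tooling:

```python
# Flag segments that end in a viewer choice but run shorter than the
# buffer window. All names and durations are invented for illustration.

MIN_GAP_SECONDS = 30  # the spacing eventually achieved, per McLean

segments = {
    # name: (duration in seconds, segments offered next)
    "breakfast":   (45, ["frosties", "sugar_puffs"]),
    "frosties":    (40, ["bus"]),
    "sugar_puffs": (40, ["bus"]),
    "bus":         (25, ["tape_a", "tape_b"]),  # too short before a choice
    "tape_a":      (60, []),
    "tape_b":      (60, []),
}

def short_choice_gaps(segments, min_gap):
    """Segments presenting a choice without enough time to buffer ahead."""
    return [name for name, (duration, options) in segments.items()
            if len(options) > 1 and duration < min_gap]

print(short_choice_gaps(segments, MIN_GAP_SECONDS))  # ['bus']
```

A check like this would flag `bus` for re-cutting, since a viewer reaching its choice point would risk a buffering stall.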
“The cereal choice in the beginning is mundane, but that’s the point of it,” states McLean. “Later on, that will affect what commercial will come up on the TV before he puts in the VHS. Whether people notice that, I don’t know. It was nice to have those kinds of things in there. The music choices affect what Stefan puts on in his bedroom later. Figuring out all of that stuff was interesting and good fun.” A standout moment is when the viewer can choose whether the young Stefan accompanies his mother on the train. “You get the choice to be with your mum, but you’re going to die if you do! It’s very Black Mirror. It’s quite a warm moment.” McLean acknowledges that Bandersnatch has been the biggest challenge associated with producing Black Mirror. “It’s going to make doing a linear project feel easy!”
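The delayed-payoff mechanic McLean describes – a mundane early choice quietly deciding which assets appear in a later scene – comes down to storing choice state and consulting it at assembly time. A minimal sketch; the keys and asset filenames are invented for illustration, not Bandersnatch’s actual data:

```python
# Early viewer choices are recorded as state; later scenes look the
# state up to pick their variant. Keys and filenames are illustrative.

state = {}

def choose(key, value):
    state[key] = value

def tv_commercial():
    # The cereal picked at breakfast decides which advert airs later.
    return {"frosties": "ad_frosties.mov",
            "sugar_puffs": "ad_sugar_puffs.mov"}[state["cereal"]]

def bedroom_track():
    # The music chosen earlier decides the song in Stefan's bedroom.
    return {"thompson_twins": "track_thompson_twins.mp3",
            "now_2": "track_now_2.mp3"}[state["music"]]

choose("cereal", "sugar_puffs")
choose("music", "thompson_twins")
print(tv_commercial())  # ad_sugar_puffs.mov
print(bedroom_track())  # track_thompson_twins.mp3
```

Whether or not the viewer notices, as McLean wonders, the mechanism is cheap: each payoff scene is just a lookup keyed on a choice made minutes earlier.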



MUSIC VIDEO

For VFX artists, music videos can be a very different kind of project compared to film or television. The schedules are often tighter, the budgets lower and the crews smaller. But the amount of work involved in making VFX for music videos, also known as promos, is rarely lower. Indeed, many promos run like short films and often include a range of diverse, if not over-the-top, imagery. To get a sense of the state of play in this area of effects, VFX Voice tracked down the studios and individual artists behind some of the most captivating VFX imagery in promos in recent years.

FUN AND FANTASTICAL

EFFECTS ADD EXTRA BEATS TO MUSIC VIDEOS By IAN FAILES

TOP: A greenscreen element for a shot in DJ Snake’s “Magenta Riddim” promo. (Image courtesy of Tal Baltuch) BOTTOM: The final shot involved smoke simulations and compositing of the live-action shots. (Image courtesy of Tal Baltuch)


Music videos tend to showcase a confined story. In the case of DJ Snake’s “Magenta Riddim,” a firefighter visits India to demonstrate how to extinguish a small blaze. But things quickly escalate from this simple premise, and visual effects were used to tell a firefighting story that channels the typically over-the-top action scenes of Tollywood, the south Indian film industry.

Designer Tal Baltuch led the visual effects work for the music video, taking cues from directors Gal Muggia and Vania Heymann. “Vania and Gal wanted to apply the energy of Tollywood to this tale of heroic firemen,” says Baltuch. “It needed to be epically entertaining and obviously not bound to physics as we know it.” Each VFX shot in the promo would make use of smoke and fire added to practical elements filmed at Ramoji Film City in India. “There was a lot of fire, smoke and flying cars on the set, but not enough,” notes Baltuch. “Each of the firefighting shots ended up being a composite of the shot action and special effects, footage of volcano eruptions, fire, smoke, and some additional 3D and simulations.

“Ideally,” Baltuch adds, “we might have had months to work with a large group of 3D artists simulating each shot. But that was not the case. And I think both Vania, who directed and did the VFX with me, and myself prefer having more direct control over the result and favor using compositing techniques rather than 3D simulations when possible.”

The Maroon 5 music video for “Wait” features another seemingly innocuous story, this time about a breakup. That is, until strange events start to happen, courtesy of visual effects studio Timber. One character’s face disintegrates into paint, weird things take place underwater, missiles make a mark in a car junkyard and, finally, the lead singer, Adam Levine, gets unwrapped as endless strands of yarn. For the ‘face of paint’ shots, Timber worked with production designer John Hammer to build a mask out of paint that could be manipulated. 
“We shot extra passes of the actress Alexandra Daddario with it on her face with Velcro straps,” explains Timber Creative Director Jonah Hill. “We also got performance takes with her without the paint mask. In digital composite we carefully combined them so it looked like one pass.” Underwater scenes were handled by filming swimmers in a water tank, and then duplicating them, with Levine shot separately in a pool. Meanwhile, practical pyro explosions drove how the junkyard shots came together, with this sequence also showcasing

a unique transition from underwater, inside the car, and out to the yard. Finally, the yarn shots made use of procedural simulations that unraveled Levine and the house around him. Says Hill: “We used Houdini to do the effects, but actual simulations were secondary, really only used for the wiggling yarn pieces that stretched across the room. This was a challenging sequence, but we’re happy with the decisions we made.”

“All the Stars,” from Kendrick Lamar and SZA, takes the fun and fantastical even further, mixing in starfields, fields of waving hands, panthers and more for an especially lavish production. BUF was brought on to produce the visual effects. “We received a very detailed moodboard from the director Dave Meyers,” says BUF Executive Producer Loris Paillier, “and we had some calls to make sure we understood what he had in mind.”

Shots of a boat sailing through a myriad of waving hands relied on greenscreen photography of Lamar and motion-control plates, with only a small group of people standing in for the crowds. “We put the images of the arms on simple 3D models and we put them on a 3D ocean to create the animation,” detail BUF CG artist Marion Eloy and VFX Supervisor Dominique Vidal. “Then we worked a lot on the sky part with different pieces of clouds.” The panther scenes also relied on motion control, with a real panther shot multiple times in separate passes (Lamar was also filmed separately), then combined. Equally majestic were scenes of SZA among a starfield. She was filmed greenscreen, with the

TOP LEFT: A mix of practical effects on location served as the base of many shots for “Magenta Riddim.” (Image courtesy of Tal Baltuch) TOP RIGHT: The final composite. (Image courtesy of Tal Baltuch) BOTTOM LEFT: Maroon 5’s “Wait” breakup storyline includes a scene where one character’s face is made up of paint. (Image courtesy of Timber) BOTTOM MIDDLE: Adam Levine is unwound as a huge length of yarn, simulated by Ingenuity Studios. (Image courtesy of Timber) BOTTOM RIGHT: Practical pyro effects were augmented with CG missiles and composited explosions. (Image courtesy of Timber)






starfield particles inhabiting the space around the singer. “The design and distribution of all the stars – in fact, they are made of little bulb lights – was specific to each shot in order to avoid them being hit by SZA during the dance,” says Vidal. “The other tricky point was to replace the ground with a reflective surface, so we needed to rebuild SZA’s reflection, which was missing in the original plate. At the end, meticulous color grading was done during the compositing stage to simulate consistent lighting on SZA in this colorful world.”

TRIPPING OUT

TOP LEFT: Greenscreen boat plate. (Image courtesy of BUF) TOP RIGHT: A final BUF shot from “All the Stars” by Kendrick Lamar and SZA features Lamar in a boat amongst a sea of hands. (Image courtesy of BUF) BOTTOM LEFT: Greenscreen crowd plate. (Image courtesy of BUF) BOTTOM MIDDLE: Greenscreen hands plate. (Image courtesy of BUF) BOTTOM RIGHT: Simplified 3D hand models onto which real hands are projected. (Image courtesy of BUF)
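Layering greenscreen elements over plates, as in the shots described above, rests on the standard alpha “over” operation. A minimal sketch with invented pixel values (real comps involve many more passes, plus grading and spill suppression):

```python
def over(fg, fg_alpha, bg):
    # Premultiplied-alpha "over": result = fg + (1 - alpha) * bg, per channel.
    return tuple(f + (1.0 - fg_alpha) * b for f, b in zip(fg, bg))

plate = (0.2, 0.2, 0.2)         # background plate (RGB, 0-1 range)
smoke = (0.30, 0.30, 0.30)      # premultiplied smoke element, alpha 0.6
comp = over(smoke, 0.6, plate)  # smoke over plate
glow = (0.27, 0.20, 0.05)       # premultiplied light element, alpha 0.3
comp = over(glow, 0.3, comp)    # light layer over the running comp

print(tuple(round(c, 3) for c in comp))  # (0.536, 0.466, 0.316)
```

Because “over” is applied back to front, any number of elements – hands, stars, reflections – can be stacked on a plate one layer at a time.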


Music can, of course, have its own hallucinogenic effects, and there are plenty of music videos that provide stunning imagery to match. Take, for example, Aphex Twin and the “T69 Collapse” single. The musician has a long-running collaboration on intense visuals with artist Nicky Smith, also known as Weirdcore. For the “T69 Collapse” music video, the viewer races through obscure cityscapes and other terrains. Smith, who describes his visually impactful work as evoking a “punk attitude,” generated the imagery by acquiring geometry using photogrammetry with Agisoft’s PhotoScan. Unique flickering and oscillating looks in the video were achieved using After Effects, Cinema 4D and Max/MSP. The music was synchronized to the visuals using a combination of AniMidi for Cinema 4D, Trapcode Sound Keys and Greyscalegorilla’s Signal. Smith says those visuals came to him from “listening to the track

or the artist’s tracks on a loop. I also checked my many art books while listening to the music to trigger ideas.” Perhaps just as trippy is Die Antwoord’s “Alien,” a promo that sees band member Yolandi perform as an alien-esque insect. VFX studio JAMM realized a number of CG creations in the video, including a flying centipede and a cocoon that houses a human girl. “We’re used to doing complex creature work, but the cocoon was a new type of challenge,” says CG Artist Zak DiMaria. “It was important to [band member and director] Ninja that the cocoon looked very real, but also unlike anything people had seen before. We based the material of the cocoon on different molds and fungi to ground it in reality. Then we simulated it and allowed it to evolve

“[Directors] Vania [Heymann] and Gal [Muggia] wanted to apply the energy of Tollywood to this tale of heroic firemen. It needed to be epically entertaining and obviously not bound to physics as we know it.” —Tal Baltuch, Designer, Visual Effects, DJ Snake’s “Magenta Riddim”

TOP LEFT: A panther scene made use of multiple real greenscreen panther plates composited together. (Image courtesy of BUF) TOP RIGHT: SZA appears in a starfield orchestrated by BUF. (Image courtesy of BUF) BOTTOM LEFT: Weirdcore’s visuals for the Aphex Twin single “T69 Collapse” were made up of photogrammetry elements stitched together in unusual ways. (Image courtesy of Weirdcore) BOTTOM RIGHT: A cocoon forms in Die Antwoord’s “Alien.” JAMM completed the VFX for the promo. (Image courtesy of JAMM)






Motion of the Music

Among the major adopters of visual effects in music videos is the Chemical Brothers, whose recent release “Free Yourself” required an army of CG robots with mannequin-like faces to form a rebellion and dance to freedom. The Mill came on board to generate the robots, utilizing motion capture and keyframe animation to do so. “We knew right from the start that we would end up filling rooms with robots, which meant that we would need to have multiple motion-capture shoots in order to get all of the data we needed,” says The Mill Visual Effects Supervisor Sid Harrington-Odedra. “Due to budgetary constraints, going down the optical motion-capture route wasn’t really viable, so we spent some time looking into other options. We’d heard that since doing our last project for the Chemical Brothers, Xsens had released a more technologically advanced motion-capture suit, which relies solely on gyroscopic sensors that are fitted into a skin-tight suit that the performer wears. The gyros feed back sensor data on the orientation of each body part into proprietary software from Xsens, which can then reconstruct a pose in real time, so we were able to watch performances on set, as they happened.”

Along with the capture, The Mill also keyframed final dance moves, and then worked on ways to differentiate the three types of robots that populate the promo: a humanoid male, a humanoid female and a slightly less advanced faceless robot (referred to as ‘Bob’). “To try and keep things efficient,” outlines Harrington-Odedra, “we wanted to try and do as much of the variation with textures and visibility switching, so geometry-wise we kept it to just these three models, but then built in the ability to add and remove faces and the protective parts around the limbs.

“However, most of the variation came with the texturing and shading. 
The lead lighter, Clement Granjon, set up a shader in Houdini that would apply different levels of dirt, dust, scratches, decals, color schemes and even stickers to the robots. The chosen scheme for the robots would be laid out at the Maya end and then the attributes would be automatically picked up when rendering with HtoA.”
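The real-time pose reconstruction Harrington-Odedra describes boils down to forward kinematics: each bone's end point is its parent's position plus the bone vector rotated by the sensed orientation. A deliberately simplified 2D sketch of that idea — the skeleton, names and data format here are invented for illustration, not Xsens's actual solver:

```python
import math

# Hypothetical two-bone chain: (bone, parent, rest length in meters).
# A real inertial suit solves a full 3D skeleton; this 2D toy just shows
# how per-segment orientations become joint positions.
SKELETON = [("upper_arm", None, 0.30), ("forearm", "upper_arm", 0.25)]

def solve_pose(orientations, root=(0.0, 0.0)):
    """Walk the hierarchy, placing each bone's end point from its
    parent's position plus the bone rotated by the sensed angle."""
    joints = {None: root}
    for name, parent, length in SKELETON:
        px, py = joints[parent]
        theta = orientations[name]           # world-space angle in radians
        joints[name] = (px + length * math.cos(theta),
                        py + length * math.sin(theta))
    return joints

# Both sensors report 0 rad: the arm points straight along +x.
pose = solve_pose({"upper_arm": 0.0, "forearm": 0.0})
```

Because each new sensor sample yields a pose with one cheap pass like this, the performance can be visualized live on set, as the quote describes.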

TOP: An Xsens MVN suit was used to acquire motion capture for the music video. (Image courtesy of The Mill) MIDDLE: Playblasts of the robot dance animation. (Image courtesy of The Mill) BOTTOM: A final shot from “Free Yourself” featuring visual effects by The Mill. (Image courtesy of The Mill)


and grow on its own into a very detailed and beautiful structure.” Then there’s Maroon 5’s promo for their single “Three Little Birds.” It begins with the band members in a seemingly traditional setting with their instruments, until each is transformed into varying – and increasingly trippy – forms. Ingenuity Studios crafted these effects by combining the band’s greenscreen performance with in-house motion capture of five separate dancers. “For each of the band member’s effects,” outlines Ingenuity Visual Effects Supervisor and Creative Director Grant Miller, “we first needed to track the camera, then rotomate a character, complete with instruments, over the top of their performance. Then, in addition to the dancer mocap, we also captured a variety of handheld camera moves which were used throughout the video.” A number of simulation effects were used during the promo, with Miller noting that the water dancer was one of the toughest. “With fluid simulations like that we’re always up against time. It was one of the first looks that we locked down, but one of the last to be completed, as every aspect is just painfully slow. The body made of cars was another challenging one. Getting the cars to pass smoothly over the body’s surface and appear to be in continuous motion required a lot of visual trickery, scaling cars away and playing things to the camera heavily.” Ingenuity Managing Director Mike Lebensfeld says “Three Little Birds,” along with Taylor Swift’s “Blank Space” – which itself has more than 2 billion views on YouTube – stand out as some of the studio’s most memorable videos in recent times, with hugely visually interesting and creative VFX work required. “On the surface, ‘Blank Space’ doesn’t look like a VFX-heavy video,” notes Lebensfeld. “However, it included a fully CG Shelby Cobra that we had to dent, tons of fog and atmosphere, and a few other surprises we swore never to reveal. 
It also represented the return of big-budget music video-making that ended up winning an Emmy award for its VR component. On the other hand, ‘Three Little Birds’ was a total greenscreen video with a ton of ‘in-your-face’ VFX that aren’t exactly invisible and represented this return to ‘big-budget Hollywood-level filmmaking’ in music videos.”

“We’re used to doing complex creature work, but the cocoon was a new type of challenge. It was important to [band member and director] Ninja that the cocoon looked very real, but also unlike anything people had seen before. We based the material of the cocoon on different molds and fungi to ground it in reality. Then we simulated it and allowed it to evolve and grow on its own into a very detailed and beautiful structure.” —Zak DiMaria, CG Artist, Die Antwoord’s “Alien”

TOP LEFT: Maroon 5’s promo for “Three Little Birds” saw them perform against greenscreen. (Image courtesy of Ingenuity Studios) TOP RIGHT: Ingenuity Studios turned the band members into different forms, often with simulated FX. (Image courtesy of Ingenuity Studios)





THEME PARKS

STAR WARS: GALAXY’S EDGE – DEEP-DIVE RIDES, EPIC EFFECTS By CHRIS McGOWAN

TOP LEFT: Concept art from Millennium Falcon: Smugglers Run, pulled from the real-time engine imagery. (Image © Disney Enterprises, Inc./Lucasfilm Ltd.) TOP RIGHT: Star Wars: Rise of the Resistance has animatronic stormtroopers and giant screens with views of space. (Image © Disney Enterprises, Inc./Lucasfilm Ltd.) BOTTOM: Giant AT-AT walkers in the Rise of the Resistance ride. (Image © Disney Enterprises, Inc./Lucasfilm Ltd.)


In recent years, theme parks have incorporated VFX into rides in bigger and bolder ways, including the VES Award-winning King Kong 360 3-D and Avatar: Flight of Passage. In the latest example, the Walt Disney Company is debuting two attractions in its new Star Wars: Galaxy’s Edge lands that have been years in the making, deploy visual effects on an epic scale and explore uncharted immersive territory. Millennium Falcon: Smugglers Run is an interactive space-flight simulation that will feature real-time rendering, while Star Wars: Rise of the Resistance is a lengthy immersive experience that puts theme park-goers inside a Star Wars story. Both were created by Walt Disney Imagineering (WDI), Lucasfilm and Industrial Light & Magic (ILM).

Star Wars: Galaxy’s Edge is scheduled to open May 31 in Disneyland and August 29 in Disney’s Hollywood Studios near Orlando. The Florida version will connect to a Star Wars-themed hotel. Each land covers 14 acres and has the spires and structures of a forgotten space outpost, as well as the two rides and themed shops and restaurants. “From the moment you cross into Galaxy’s Edge, we challenge every one of your senses to transport you to a planet named Batuu on the edge of the Star Wars galaxy,” comments Jason Bayever, Manager, Visual Effects Design, WDI.

On Smugglers Run, guests have the opportunity to fly the Millennium Falcon – the fantasy of millions of Star Wars fans. You enter the spacecraft, pause to interact with items such as the famed holographic chess table, and then climb into the cockpit to pilot the ship on a secret mission, evade Imperial TIE fighters and help space pirate Hondo Ohnaka bring back a pile of loot. “On Smugglers Run, you can control where the ship goes and affect the action,” says Bei Yang, Technology Studio Executive for WDI. “Each guest will have their own controls. If the pilot wants to go left, you go left. 
If the gunners shoot at the TIE fighters in front of you, you’re going to see them explode. And if the engineers are doing their job – you’ll make it back in one piece. So while you’ll always make it to your destination, guests control how they get there and how many credits they bring home,” says Jacqueline King, Producer for Millennium Falcon: Smugglers Run, WDI. Once it was decided that Smugglers Run would be a fully guest-controlled experience, the WDI technical and creative teams paired up to develop the technology needed to deliver the real-time media for the attraction. WDI was in charge of the “creative intent” for the overall attraction and worked closely with ILM on creating

the digital imagery and VFX. “ILM has been a partner in this development since late 2015, working on early mock-ups of the media and eventually going into full production,” comments King.

“The digital imagery for Millennium Falcon: Smugglers Run is all real-time generated,” says Yang. “The workflow to generate such media used a process more similar to video game development.” He notes, “Balancing visual fidelity and performance is always the primary challenge in any real-time rendered project and Smugglers Run was no different. Additionally, we had to render a large field of view on a multi-projector system, which introduced additional complications requiring us to combine state-of-the-art flight-simulation technology and high-end gaming technology.” Yang explains, “Working with the full support of Epic Games, we were able to create an eight-GPU [Graphics Processing Unit] implementation of the Unreal Engine [Epic’s game engine]. This gave us the headroom needed to render at the resolution and fidelity we needed while keeping the frame rate up.”

The attraction’s digital imagery “approximates to about 8k of resolution, but that would be thinking about it as a rectangular screen, which is inaccurate,” says Yang. “We are, however, drawing about 4.5 times the number of pixels of what a typical 4k (2160p) display has, and we’re doing this at a constant 50 times per second.” “Most workflows are optimized for a rectangular screen,” he adds. “We had to employ other methods, like VR simulations of the theme park attraction, to allow us to understand the content during its creation,” notes Yang. “There are a number of things that all have to not only be synchronized, but look and feel believable. The flight dynamics itself, synchronizing the motion base, and

“From the moment you cross into Galaxy’s Edge, we challenge every one of your senses to transport you to a planet named Batuu on the edge of the Star Wars galaxy.” —Jason Bayever, Manager, Visual Effects Design, WDI
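Yang's rendering figures are easy to sanity-check: 4.5 times the pixel count of a 2160p display, refreshed 50 times a second, lands around two billion pixels per second, which helps explain the eight-GPU Unreal build. The back-of-envelope arithmetic:

```python
# Pixel throughput from the figures Yang quotes:
# 4.5x the pixels of a typical 4k (2160p) display, at a constant 50 Hz,
# spread over the eight-GPU Unreal Engine implementation.
UHD_PIXELS = 3840 * 2160                  # 8,294,400 pixels
frame_pixels = 4.5 * UHD_PIXELS           # pixels drawn per frame
per_second = frame_pixels * 50            # pixels drawn per second
per_gpu = per_second / 8                  # share per GPU

print(f"{frame_pixels / 1e6:.1f} Mpx/frame, {per_second / 1e9:.2f} Gpx/s, "
      f"{per_gpu / 1e6:.0f} Mpx/s per GPU")
# -> 37.3 Mpx/frame, 1.87 Gpx/s, 233 Mpx/s per GPU
```

For comparison, a single 4K display at 60 Hz is roughly 0.5 Gpx/s, so the attraction is pushing close to four times that, continuously.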

TOP LEFT: In the cockpit of Millennium Falcon: Smugglers Run. (Image © Disney Enterprises, Inc./Lucasfilm Ltd.) TOP RIGHT: Guests control the cockpit in Smugglers Run. (Image © Disney Enterprises, Inc./Lucasfilm Ltd.) BOTTOM LEFT: Stormtroopers approach in Rise of the Resistance. (Image © Disney Enterprises, Inc./Lucasfilm Ltd.) BOTTOM RIGHT: Star Wars: Galaxy’s Edge concept art. (Image © Disney Enterprises, Inc./Lucasfilm Ltd.)






TOP LEFT: Star Wars: Galaxy’s Edge concept art. (Image © Disney Enterprises, Inc./Lucasfilm Ltd.) TOP RIGHT: Star Wars: Galaxy’s Edge concept art. (Image © Disney Enterprises, Inc./Lucasfilm Ltd.) BOTTOM: While guests wait in line outside the Millennium Falcon: Smugglers Run ride prior to their flight, they get a chance to examine the famous holotable in the Falcon’s main hold. (Photo: Joshua Sudock. Copyright © 2019 Disney Enterprises, Inc./Lucasfilm Ltd.)


tuning the controls were all challenges that had to be overcome.”

Rise of the Resistance, on the other hand, is not interactive, but is highly immersive and full of digital imagery seen in various formats, including holograms. The ride is an extremely complex theme park attraction, incorporating multiple ride systems, VFX pipelines and media delivery systems. In Rise of the Resistance, guests wind through various themed interiors, including those of an Intersystem Transport Ship (I-TS) and a Star Destroyer. They view animatronic stormtroopers, giant AT-AT walkers, displays of space and much else, before boarding escape pods.

“Lucasfilm and ILM partnered with Walt Disney Imagineering on Rise of the Resistance to push every system, idea and experiment to the level a Star Wars audience expects,” says Patrick Kearney, Executive Media Producer, WDI. “The audience is traveling through a fully realized Star Wars environment that will hopefully leave them overwhelmed by the level of detail.”

ILM started work on the media for Rise of the Resistance in November 2017. A large amount of previsualization had already been completed by that point by the design team at WDI. “Since production started while the building was under construction, our schedule required creative buy-off even before we actually had a ride building. The VR version of the ride system helped us anticipate challenges and move the creative process forward in a timely fashion. Once we had all of our scenes blocked out in the VR ride, we were able to bring our ILM partners literally onboard to view the experience at the beginning of their effort,” says Kearney. “This was an incredibly valuable tool for us not only to understand the attraction, but as a way to review our early VFX work,” recalls Bill George, VFX Supervisor, ILM. The overall VFX shot count for Rise of the Resistance came out to 175 shots, according to Kearney. 
“ILM focused on the large-format space sequences, another vendor developed the motion-graphic packages you see throughout the attraction, and our internal team provided the previsualization, technical layouts, digital mapping and illusion effects, and produced a handful of our own VFX shots and the final DI assembly of the entire attraction.” A wide variety of imagery required extensive testing. Bayever observes, “How successful a hologram or live projection illusion

is to the audience is directly determined by the limitations of the delivery system. Every illusion will have its technical limitations due to the projectors, mirrors, etc., and we had to dive into those effects to determine the best way to produce the media. Before we ever started planning for a live-action shoot, we built one-to-one mock-ups to test costumes, scale, audience point of view and lighting to narrow down our challenges and solutions. We worked with ILM to understand and develop a creative workflow to make sure we stayed within the established look of the films and then applied that to what we had learned. Once we walked onto the set with our team, we were able to allow our director and the actors to focus on the performance rather than the technical parameters.”

Bayever notes that one of the biggest challenges in the ride was making the imagery look real and not cinematic. “We want to make you think that you are actually looking out a window into space, so you can’t use filmic effects that happen because you are using a camera with a lens.” Lens flares and depth of field are two examples. “While these effects contribute to beautiful imagery, it is not what your eye sees. Getting things to look real without these effects, while still honoring the Star Wars cinematic look established in the films, was a creative challenge.”

“On a film, you are working with a static rectangle of an image that people will be viewing in a dark theater,” says George. “The media we created for Rise of the Resistance comes in a variety of shapes – egg shaped, a thin slice, wide rectangles and figures on black. Also, our viewers are zipping by the screens in a moving vehicle. With these variables, all of our training and experience composing shots go out the window. It’s a very different style of storytelling that has a lot of creative challenges. On average, a shot we would create for a film is five seconds long. 
Some shots for Rise of the Resistance are over two minutes of continuous and connected action. One of our shots is the equivalent of a full sequence on a feature.” George continues, “Some of the shots are of such high resolution that a single camera cannot capture it all. We rendered the images out on multiple cameras and stitched those images together. The massive size and high frame rate of our shots pushed the limits of the ILM processor farm. “For Star Wars fans, this attraction is going to be like walking right into the movies,” George adds. “Actually, it will be better than the movies in that the environments will surround you completely on all sides and come with sound effects and music built-in! Our hope is that people won’t perceive the media on its own, but feel that it’s just part of this amazing world they have entered. I’ve never been on a project where all of the varied disciplines have been so in sync, and I think that will shine through when you see the attraction for yourself. On the ride, you will experience a wide variety of scenes that use the full expanse of practical and visual wonders available today.”

TOP LEFT: Bei Yang, Technology Studio Executive, WDI TOP RIGHT: Jason Bayever, Manager, Visual Effects Design, WDI MIDDLE LEFT: Jacqueline King, Producer, WDI MIDDLE RIGHT: Patrick Kearney, Executive Media Producer, WDI BOTTOM: Bill George, VFX Supervisor, ILM

SUMMER 2019 VFXVOICE.COM • 85


THEME PARKS

TOP LEFT: Star Wars: Galaxy’s Edge concept art. (Image © Disney Enterprises, Inc./Lucasfilm Ltd.) TOP RIGHT: Star Wars: Galaxy’s Edge concept art. (Image © Disney Enterprises, Inc./Lucasfilm Ltd.) BOTTOM: Star Wars: Galaxy’s Edge concept art. While guests wait in line outside the Millennium Falcon: Smugglers Run ride prior to their flight, they get a chance to examine the famous holotable in the Falcon’s main hold. (Photo: Joshua Sudock. Copyright © 2019 Disney Enterprises, Inc./Lucasfilm Ltd.)

84 • VFXVOICE.COM SUMMER 2019

tuning the controls were all challenges that had to be overcome.” Rise of the Resistance, on the other hand, is not interactive, but is highly immersive and full of digital imagery seen in various formats, including holograms. The ride is an extremely complex theme park attraction because of its incorporation of multiple ride systems and multiple VFX pipelines and media delivery systems. In Rise of the Resistance, guests wind through various themed interiors, including those of an Intersystem Transport Ship (I-TS) and a Star Destroyer. They view animatronic stormtroopers, giant AT-AT walkers, displays of space and much else, before boarding escape pods. “Lucasfilm and ILM partnered with Walt Disney Imagineering on Rise of the Resistance to push every system, idea and experiment to the level a Star Wars audience expects,” says Patrick Kearney, Executive Media Producer, WDI. “The audience is traveling through a fully realized Star Wars environment that will hopefully leave them overwhelmed by the level of detail.” ILM started work on the media for Rise of the Resistance in November 2017. A large amount of previsualization had already been completed by that point by the design team at WDI. “Since production started while the building was under construction, our schedule required creative buy-off even before we actually had a ride building. The VR version of the ride system helped us anticipate challenges and move the creative process forward in a timely fashion. Once we had all of our scenes blocked out in the VR ride, we were able to bring our ILM partners literally onboard to view the experience at the beginning of their effort,” says Kearney. “This was an incredibly valuable tool for us not only to understand the attraction, but as a way to review our early VFX work,” recalls Bill George, VFX Supervisor, ILM. The overall VFX shot count for Rise of the Resistance came out to be 175 shots, according to Kearney. 
“ILM focused on the large-format space sequences, another vendor developed the motion-graphic packages you see throughout the attraction, and our internal team provided the previsualization, technical layouts, digital mapping and illusion effects, and produced a handful of our own VFX shots and the final DI assembly of the entire attraction.” A wide variety of imagery required extensive testing. Bayever observes, “How successful a hologram or live projection illusion

is to the audience is directly determined by the limitations of the delivery system. Every illusion will have its technical limitations due to the projectors, mirrors, etc., and we had to dive into those effects to determine the best way to produce the media. Before we ever started planning for a live-action shoot, we built one-to-one mock-ups to test costumes, scale, audience point of view and lighting to narrow down our challenges and solutions. We worked with ILM to understand and develop a creative workflow to make sure we stayed within the established look of the films and then applied that to what we had learned. Once we walked onto the set with our team, we were able to allow our director and the actors to focus on the performance rather than the technical parameters.” Bayever notes that one of the biggest challenges in the ride was making the imagery look real and not cinematic. “We want to make you think that you are actually looking out a window into space, so you can’t use filmic effects that happen because you are using a camera with a lens.” Lens flares and depth of field are two examples. “While these effects contribute to beautiful imagery, it is not what your eye sees. Getting things to look real without these effects, while still honoring the Star Wars cinematic look established in the films, was a creative challenge.” “On a film, you are working with a static rectangle of an image that people will be viewing in a dark theater,” says George. “The media we created for Rise of the Resistance comes in a variety of shapes – egg shaped, a thin slice, wide rectangles and figures on black. Also, our viewers are zipping by the screens in a moving vehicle. With these variables, all of our training and experience composing shots go out the window. It’s a very different style of storytelling that has a lot of creative challenges. On average, a shot we would create for a film is five seconds long. 
Some shots for Rise of the Resistance are over two minutes of continuous and connected action. One of our shots is the equivalent of a full sequence on a feature.” George continues, “Some of the shots are of such high resolution that a single camera cannot capture it all. We rendered the images out on multiple cameras and stitched those images together. The massive size and high frame rate of our shots pushed the limits of the ILM processor farm.” “For Star Wars fans, this attraction is going to be like walking right into the movies,” George adds. “Actually, it will be better than the movies in that the environments will surround you completely on all sides and come with sound effects and music built-in! Our hope is that people won’t perceive the media on its own, but feel that it’s just part of this amazing world they have entered. I’ve never been on a project where all of the varied disciplines have been so in sync, and I think that will shine through when you see the attraction for yourself. On the ride, you will experience a wide variety of scenes that use the full expanse of practical and visual wonders available today.”
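George’s multi-camera approach – rendering an oversized image as tiles from several cameras and stitching them back together – can be sketched in a few lines. This is a toy illustration with made-up tile sizes, not ILM’s actual pipeline:

```python
def stitch_tiles(tiles, rows, cols):
    """Reassemble a row-major list of equally sized tiles (2D grids of
    pixels), each rendered by its own camera, into one large frame."""
    tile_h, tile_w = len(tiles[0]), len(tiles[0][0])
    frame = [[None] * (cols * tile_w) for _ in range(rows * tile_h)]
    for i, tile in enumerate(tiles):
        r, c = divmod(i, cols)  # which grid cell this camera covered
        for y in range(tile_h):
            for x in range(tile_w):
                frame[r * tile_h + y][c * tile_w + x] = tile[y][x]
    return frame

# Four 2x3 tiles from a 2x2 camera grid become one 4x6 frame.
tiles = [[[v] * 3 for _ in range(2)] for v in (1, 2, 3, 4)]
frame = stitch_tiles(tiles, rows=2, cols=2)
```

A production version would typically render the tiles with overlapping borders and blend the seams; the bookkeeping above is the core idea.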

TOP LEFT: Bei Yang, Technology Studio Executive, WDI TOP RIGHT: Jason Bayever, Manager, Visual Effects Design, WDI MIDDLE LEFT: Jacqueline King, Producer, WDI MIDDLE RIGHT: Patrick Kearney, Executive Media Producer, WDI BOTTOM: Bill George, VFX Supervisor, ILM



TV

VFX BATTLE STATIONS! SPACE WARS IN THE ORVILLE By IAN FAILES

Images copyright © 2019 Fox

An eight-minute space battle is not something you typically get to see in episodic television. But Season 2 viewers of Fox’s The Orville were able to witness such a feat as the titular ship becomes part of a dynamic outer space contest between opposing forces in episode 9. Realizing the visual effects for the “Identity, Part II” battle were Pixomondo and FuseFX, which worked under co-Visual Effects Supervisors Luke McDonald and Brandon Fayette, and Visual Effects Producer Brooke Noska. VFX Voice asked key members of the team how the action was brought to the screen.

PREPARING AN EPIC ACTION SCENE

“The main focus was to try to tell a space battle in a way that the audience could follow the story,” explains Fayette. “In that regard, we worked on staging the props and assets into story beats with tethering shots to carry the viewer between one moment to the next.” Fayette notes that the script was distilled to written beats where only hard dialogue was cut back to in the edit. “The next step was to cut the entire battle into a long visual CG previs film,” he says. “This was shown to [show creator and star] Seth MacFarlane, who fell in love with all of the action and decided to use the entire piece instead of isolated shots.” With so many shots to deliver, communication between vendors had to flow seamlessly. “Having the ability to hold an open channel of communication and volley back and forth design and look updates to ensure that nothing bumped between shots and their vendors was an amazing feat,” acknowledges Noska. “Countless phone calls, meetings and updates ensured an on-schedule delivery, but the commitment and passion the teams had to really achieve a top-notch episode is what gives it that extra-special touch.”

JUMPING INTO THE BATTLE

Pixomondo, who was a major vendor on the overall series in addition to their portion of the battle work, also handled previs for the battle sequence itself, while Fayette worked on additional previs scenes. Pixomondo then took on the introductory shots of the battle where a Kaylon armada arrives at Earth and faces off against the Union fleet. With animation largely dictated by the animatics, the studio focused on the overall look of the ships in space. “We wanted, of course, for it to be realistic looking,” observes Pixomondo Visual Effects Supervisor Nhat Phong Tran. “But we also wanted it to retain some of the look of the model shoots like they did on the original Star Trek. So for the lighting, instead of just using lights that are the default CG lights, Brandon actually suggested that we should use lights that had been photographed from the actual Season 1 model shoot.” “He took HDR photos of all the different lights – bounce lights, reflectors and so on,” adds Tran. “And he gave us a light reel to use as reference. We used that to actually light our scenes. Once you compare the difference between using those lights and not using those lights, you see that there’s an organic component added to the lighting, which makes the image much more pleasant.”
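The light-reel idea – lighting CG scenes from HDR photographs of the Season 1 model shoot’s practical lights rather than from default CG lights – can be reduced to a toy sketch. This is illustrative only; a real pipeline would map each HDR photograph onto an area light or environment map rather than averaging it down:

```python
def light_from_hdr(samples):
    """Derive a CG light's colour and intensity from HDR radiance samples
    photographed off a practical light (a bounce card, a reflector, etc.).
    `samples` is a list of (r, g, b) floating-point radiance values."""
    n = len(samples)
    avg = [sum(s[i] for s in samples) / n for i in range(3)]  # mean radiance
    intensity = max(avg)                                      # light strength
    colour = [c / intensity for c in avg] if intensity > 0 else [0.0] * 3
    return colour, intensity

# A warm, tungsten-like bounce light: red-heavy radiance samples.
colour, intensity = light_from_hdr([(4.0, 2.0, 1.0), (6.0, 3.0, 1.0)])
```

Driving lights from photographed hardware this way is what supplies the “organic component” Tran describes – the colour and balance come from real fixtures rather than idealized CG defaults.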

OPPOSITE TOP: One of Pixomondo’s shots from the massive space battle sequence in Season 2 of The Orville. OPPOSITE BOTTOM LEFT: The U.S.S. Orville among a group of Kaylon craft in this Pixomondo shot. OPPOSITE BOTTOM RIGHT: In addition to generating a portion of the final battle sequence, as shown in this frame, Pixomondo also delivered previs for the scenes. TOP: A Kaylon sphere on the attack. BOTTOM: FuseFX worked on the majority of the battle sequence, taking in assets from Pixomondo and bringing previs’d scenes into its pipeline.

DEEP INTO BATTLE

FuseFX took over the battle once the Krill arrived, ultimately delivering 117 shots and more than seven minutes of screen time in just eight weeks. FuseFX took assets that had been built at Pixomondo and imported them into their 3ds Max pipeline, remodeling where necessary to allow for further detail in the destruction and damage as the battle continued. One thing the studio tried to ensure was that, despite the chaotic scenes, the audience could follow the beats. “I attribute that to Seth MacFarlane because Seth really drove the storyboarding,” says FuseFX Visual Effects Supervisor Tommy Tran. “And Brandon Fayette designed that chaos so well, too.” “Every shot tied together through continuity and story,” adds






FuseFX CG Supervisor Matt Von Brock. “There was always something that connected one shot to the shot after it, and it always made it have a good sense of continuity to go with the sequence.” A further feature of the battle, which also helped with integration, was having what Tran describes as ‘off-camera lighting.’ “We’d have these off-camera explosions where you never saw the explosion, but you saw the volume of light basically light up a ship – to backlight it or front light it. It really separated it from the background. And when we got to that phase, once we had all the shots filled, we called it our sweetened phase where we were asking, how do we make this ship pop off the screen?”

LASER CONVERSATIONS

FuseFX also sought to integrate the action by having each kind of ship employ a unique laser or weapon effect. “The Krill had green lasers, the Orville and the Union fighters had blue, and the Kaylon, which was the main enemy – the spherical ships – had red,” explains FuseFX Additional Visual Effects Supervisor Kevin Lingenfelser. Indeed, the final look, color and feel of lasers were the result of endless conversations at FuseFX. This was particularly the case for the Kaylon sphere ship’s laser ‘tri-beam.’ Says Tran: “There was a discussion over whether it was a laser blast or a laser beam that goes off into infinity. After a lot of development on how it interacts with ships, we came to the conclusion that the laser beam was not a good idea for how fast things move. So we ended up with a laser blast, and then once we figured that out we let our effects guys loose on its ferocity in Houdini. It needed to feel like an arc welder that would burn your corneas if you ever really looked into it.” The smaller lasers were done with a particle system that was developed at FuseFX in NUKE, and had to match the actions of hundreds of ships firing at once. “We had a couple of our lead artists come up with a very cool gizmo that, in a procedural way, could populate four different colors of lasers firing from ships that are just flying in 3D space,” describes Tran.

TOP: Destruction in the sequence was generally crafted in Houdini. BOTTOM: Debris was somewhat ‘art-directed’ by FuseFX artists.

COORDINATING CHAOS


Those lasers, and a number of subsequent ship collisions, cause some hefty explosions. Depending on the type of ship, FuseFX came up with specific ways to handle explosions and debris. “For the Kaylon spheres, for instance,” says Von Brock, “they’re so large that when they blew up we made it like a chain reaction – it took multiple blasts and multiple explosions to tear them up. So these tended to disintegrate over time, while a smaller ship involved just small ‘pop’ explosions. We also had a library of explosions and blasts for background action.” One particularly elaborate explosion sees a Krill ship explode into green flames. “It was getting pelted by the interceptors and eventually blows up,” says Von Brock. “If that had been a normal explosion it would have had orange fire and debris. But their color scheme is green, so we thought, let’s say they have a green, reactive nuclear core, so when that thing blows it must be green energy. We threw it across to Seth as a work in progress, and he loved it!” FuseFX was aided in generating these complex scenes by several custom tools that were written to proceduralize the battle shots and enable them to be done within eight weeks. These included a tool to help ingest previs into FuseFX’s scenes, as Tran explains. “One of the challenges we had when they delivered the previs was sometimes they had 60-plus ships in there, and they were just proxy models that the client sent over. We had tools that would bring all the setups in, but originally we would have to identify the position and location of all the assets and replace them with our ships. So we had a tool written for the artists to build a scene from the library with all those explosions, hits and things like that. 
We had a library of debris so artists could almost paint debris into their scenes.” The result was a compelling sequence that made the most of FuseFX’s toolkit, and helped put on the screen an ambitious battle that had been envisaged by the show’s production team. And despite such a fast turnaround, FuseFX says the process was a memorable one. “I have to say that working with this client has been very easy and wonderful for me because previous experiences haven’t gone as smoothly as this one has, so that also made our work on the show that much easier,” says Lingenfelser. “We were working with a very well-tuned machine and people who respected the choices that we were making and allowed us to make the sequence that much better for it.”
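The previs-ingest tool Tran describes – finding the client’s proxy ships in a delivered scene and swapping them for hero library assets while keeping their placement – might look, in spirit, like this. All names and data structures here are hypothetical:

```python
# Hypothetical library mapping previs proxy names to hero ship assets.
ASSET_LIBRARY = {
    "proxy_kaylon": "kaylon_sphere_hero",
    "proxy_orville": "uss_orville_hero",
}

def ingest_previs(scene_nodes, library):
    """Swap proxy models in an ingested previs scene for hero assets,
    preserving each node's position so the original blocking survives."""
    out = []
    for node in scene_nodes:
        hero = library.get(node["model"], node["model"])  # unknowns pass through
        out.append({"model": hero, "position": node["position"]})
    return out

previs = [
    {"model": "proxy_kaylon", "position": (10.0, 0.0, -50.0)},
    {"model": "debris_card", "position": (0.0, 2.0, -8.0)},
]
scene = ingest_previs(previs, ASSET_LIBRARY)
```

The payoff of automating this step is exactly what the article describes: with 60-plus ships per previs scene, artists build shots from the library instead of hand-replacing every proxy.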

TOP: Many elements made up the final shots, from the starfields to Earth, the ships, lasers and destruction. BOTTOM: A cockpit view of the battle sequence. FuseFX composited its spaceship scenes into the live-action plate.




VR/AR/MR TRENDS

CUSTOM-BUILT AR HEADSETS, SPHERES, AND THE UNREAL GARDEN By CHRIS McGOWAN


“With AR we are no longer constrained to any surface or physical limitation of a space. We can move around the content, engage with it from all angles, and get our own personal experiences at our own pace,” says Jason Ambler, Executive Producer/Director of Production at Falcon’s Digital Media, which is launching Falcon’s Vision, a wireless AR headset that is specially designed for location-based experiences in theme parks, museums, zoos, aquariums, art galleries and other special venues. The firm sees augmented reality as having significant potential for such attractions. Falcon’s Digital Media is a division of Orlando, Florida-based Falcon’s Creative Group, a leading designer of immersive environments, dynamic media content and theme park rides. Falcon’s has worked on such attractions/locations as Busch Gardens Williamsburg’s Battle for Eire, National Geographic Encounter: Ocean Odyssey in New York City, Motiongate Dubai, IMG Worlds of Adventure (Dubai), Kennedy Space Center Visitor Complex, SeaWorld, Chimelong Ocean Kingdom (Zhuhai, China), and Knight Valley at OCT Eco Park (Shenzhen, China). Falcon’s has recently linked with PBS for a number of branded ‘edutainment’ outlets that will be PBS’s first foray into the realm of location-based entertainment. After producing several hologram experiences for major parks and attractions, Falcon’s started looking at individualized AR experiences. “We found that the AR headsets that were commercially available in the marketplace would not be a viable solution for entertainment venues,” says Ambler. “Falcon’s Vision was created out of a necessity for us to be able to deliver these types of immersive AR experiences to our clients.” The firm’s experience with the Battle for Eire, a multi-sensory VR motion simulator ride that uses headsets and opened in early 2018, proved invaluable. 
“With Battle for Eire, we recognized from the onset that existing VR head-mounted displays would need to be adapted to meet the demands of a high-capacity theme park attraction,” Ambler comments. “Our primary solution was to separate the head-mount from the display, which would then allow guests time for fitment and [give] operations the ability to hygienically wash the components that



VR/AR/MR TRENDS

CUSTOM-BUILT AR HEADSETS, SPHERES, AND THE UNREAL GARDEN By CHRIS McGOWAN

90 • VFXVOICE.COM SUMMER 2019

“With AR we are no longer constrained to any surface or physical limitation of a space. We can move around the content, engage with it from all angles, and get our own personal experiences at our own pace,” says Jason Ambler, Executive Producer/Director of Production at Falcon’s Digital Media, which is launching Falcon’s Vision, a wireless AR headset that is specially designed for location-based experiences in theme parks, museums, zoos, aquariums, art galleries and other special venues. The firm sees augmented reality as having significant potential for such attractions. Falcon’s Digital Media is a division of Orlando, Florida-based Falcon’s Creative Group, a leading designer of immersive environments, dynamic media content and theme park rides. Falcon’s has worked on such attractions/locations as Busch Gardens Williamsburg’s Battle for Eire, National Geographic Encounter: Ocean Odyssey in New York City, Motiongate Dubai, IMG Worlds of Adventure (Dubai), Kennedy Space Center Visitor Complex, SeaWorld, Chimelong Ocean Kingdom (Zhuhai, China), and Knight Valley at OCT Eco Park (Shenzhen, China). Falcon’s has recently linked with PBS for a number of branded ‘edutainment’ outlets that will be PBS’s first foray into the realm of location-based entertainment. After producing several hologram experiences for major parks and attractions, Falcon’s started looking at individualized AR experiences. “We found that the AR headsets that were commercially available in the marketplace would not be a viable solution for entertainment venues,” says Ambler. “Falcon’s Vision was created out of a necessity for us to be able to deliver these types of immersive AR experiences to our clients.” The firm’s experience with the Battle for Eire, a multi-sensory VR motion simulator ride that uses headsets and opened in early 2018, proved invaluable. 
“With Battle for Eire, we recognized from the outset that existing VR head-mounted displays would need to be adapted to meet the demands of a high-capacity theme park attraction,” Ambler comments. “Our primary solution was to separate the head-mount from the display, which would then allow guests time for fitment and [give] operations the ability to hygienically wash the components that

directly interface with guests in the same way they would wash 3D glasses.” The AR headset has a 55-degree field of view and a one-size-fits-all design, and is ruggedly built, tamper-proof, water-resistant and shock-absorbing. It is powered by a high-density battery with various options for wireless charging stations. The Falcon’s Vision headset can be equipped with 3D polarized lenses. “We created Falcon’s Vision not only to enhance physical experiences, but also to enhance media-based experiences as well,” Ambler expands. “We see the headset bringing another layer of storytelling and interactivity to a stereoscopic 3D film experience on a ride or in a theater environment.” The headset can also “gamify” attractions. He notes, “Whether you are competing against yourself, your friends and family, or people you’ve never met, we believe gamification can be a great motivator and a powerful way to connect with audiences. A healthy competition can help keep people active and focused on an objective for much longer than a traditional passive experience. We also believe this increases the level of ‘repeatability’ in an experience.” Ambler says, “What excites us most about AR is that it can be more of a shared experience, where there is less of a barrier between you and your friends or family members. We really see the benefits to both VR and AR. It really depends on the type of experience and story you are looking to tell.” Some observers feel that theme parks are interesting places to inspire innovation in AR. Ambler comments, “In the right hands, it is certainly possible to create more bespoke and well-integrated AR content experiences in a controlled environment. For this reason, these spaces may be the best place to showcase the capabilities of the technology.” The visuals in the AR headset will be created in-house by Falcon’s Digital Media. 
Ambler says, “Falcon’s Vision is designed to be a turn-key solution for venue operators including full installation and all content exclusively provided by Falcon’s Digital Media.” Ambler concludes, “Guests’ expectations have been evolving largely due to the way they can access and engage with various forms of entertainment on a daily basis. As experience designers, our job is to bring them something that they can’t get anywhere else.”

OPPOSITE TOP LEFT: Explore the solar system with the Falcon’s Vision AR headset. (Image © 2018 Falcon’s Digital Media) OPPOSITE TOP RIGHT: Get up close and personal with a sea lion in AR with the Falcon’s Vision AR headset. (Image © 2018 Falcon’s Digital Media) OPPOSITE BOTTOM: Various colors are available for the Falcon’s Vision headset. (Image © 2018 Falcon’s Digital Media) TOP: Jason Ambler, Executive Producer and Director of Production, Falcon’s Digital Media. (Image © 2018 Falcon’s Digital Media) BOTTOM: Wireless charging stations are available for the Falcon’s Vision AR headset. (Image © 2018 Falcon’s Digital Media)

SPHERES: SONGS OF SPACE-TIME

Carl Sagan would have been pleased. The three-chapter, 43-minute VR experience Spheres takes viewers on a 360-degree tour of the cosmos, explores gravitational waves and the Big Bang, uncovers the hidden songs of space, and dives into a black hole. It was the first VR title to debut at the Telluride Film Festival and the first to be sold to a distributor for more than a million dollars (VR financing and distribution firm CityLights acquired it for seven figures at the Sundance Film Festival). Presented by Protozoa Pictures and Oculus Studios, Spheres won the Best VR Award (Immersive Story) at the Venice Film Festival last September. It has its share of another type of “star power”

SUMMER 2019 VFXVOICE.COM • 91


VR/AR/MR TRENDS

“The programming [for The Unreal Garden] will change every six to 12 months. Enklu, our development partner, has built a storytelling platform for immersive experience that allows the creating of content and adding to a live experience in AR. It’s really fluid and fast and cost-efficient. Because of that, it allows us to make updates more regularly than you would think – so the content isn’t fixed and can adapt to whatever environment we find.” —Leila Amirsadeghi, CMO, Onedome

– Millie Bobby Brown (Stranger Things), Jessica Chastain and rock icon Patti Smith narrate Spheres, which was written and directed by Eliza McNitt. Darren Aronofsky (Pi, Black Swan, Noah) was a co-executive producer, along with Ari Handel, Jessica Chastain and Portland-based Kaleidoscope VR. Kyle Dixon and Michael Stein (Stranger Things) provided the soundtrack. Baptiste Poligné and Dimitri Sourza were VFX artists who worked on the project, along with VR Technical Director Clément Chériot and VR production studios Atlas V (based in Paris), Novelab (Paris) and Crimes of Curiosity (Los Angeles). Spheres is now available on the Oculus Rift VR headset (www.oculus.com). 

THE UNREAL GARDEN

Just as AR can “gamify” environments and bring users together in a collective gaming experience, it can enhance the social

92 • VFXVOICE.COM SUMMER 2019

potential of an art gallery or other shared setting. The Unreal Garden mixes a physical area with spatial sound, 2D/3D projection mapping, and augmented reality (provided by a Microsoft HoloLens headset). Guests – up to 25 at a time – enter a colorful space with winding paths, radiant wildflowers, a stream, a waterfall and boulders. In the setting, AR and projection technology bring to life digital art installations from ten artists. Visitors can move around the artworks (often connected to nature) and interact with key components, teasing out hidden meanings and unexpected changes. Participating artists include Android Jones, John Park, Jasmine Pradissitto, Andy Thomas, Shuster + Moseley, Scott Musgrove, Werc, Ray Kallmeyer and Vladislav Solovjov. Onedome created the project both to be an artist’s platform and to use creativity as a way to inspire connection and community. Leila Amirsadeghi, CMO of Onedome, observes, “As you bring more people into it, you realize the experience can be even

better in collaboration with someone.” Yet, Amirsadeghi is conscious that group interactivity with AR is something new for most people and can be a little daunting. The Unreal Garden will be “introducing a whole new technology and language to audiences. AR is not something that’s commonly experienced. The HoloLens is not something that many would have access to right now.” And few have had the experience of joining with others via AR to unlock special visual effects in an artwork. According to Amirsadeghi, as people get used to the technology, the content will evolve. “The programming will change every six to 12 months. Enklu, our development partner, has built a storytelling platform for immersive experience that allows the creating of content and adding to a live experience in AR. It’s really fluid and fast and cost-efficient. Because of that, it allows us to make updates more regularly than you would think – so the content isn’t fixed and can adapt to whatever environment we find.” Amirsadeghi believes that audiences are ready for an experience like The Unreal Garden. “I think you’ve seen a behavioral shift over the last three years,” says Amirsadeghi about the popularity of location-based entertainment using VR and AR, whether pop-up or permanent venues. “People really want more out of their entertainment, want to be part of it, want to engage and interact with others, want to be social and share it socially. It’s just the next layer of it, the next evolution, in my humble opinion.” She continues, “We’re just at the beginning. Every day is a surprise. We’re learning. This is brand-new tech that has never been used in this manner before. We’re doing stuff that’s not been done, so there’s no playbook. We learn something every day from customers.” Onedome, which presented The Unreal Garden at its San Francisco location from last October through April, will take it on tour later this year. 
Another immersive attraction – LMNL, composed of 14 interactive, interconnected rooms and installations – was also recently offered by Onedome at its Market Street location in San Francisco.

OPPOSITE TOP LEFT: Jasmine Pradissitto’s interactive artwork for The Unreal Garden. (Image © 2018 Onedome) OPPOSITE TOP MIDDLE: The setting is psychedelic beneath the layers added by AR in The Unreal Garden. (Image © 2018 Onedome) OPPOSITE TOP RIGHT: Ray Kallmeyer’s interactive artwork in The Unreal Garden. (Image © 2018 Onedome) OPPOSITE BOTTOM LEFT: Onedome CMO Leila Amirsadeghi. (Image © 2018 Onedome) OPPOSITE BOTTOM MIDDLE: Andy Thomas’s interactive artwork in The Unreal Garden. (Image © 2018 Onedome) OPPOSITE BOTTOM RIGHT: Exploring augmented reality with the HoloLens in The Unreal Garden. (Image © 2018 Onedome) TOP: Spheres. (Image © Protozoa Pictures and CityLights) BOTTOM: Spheres. (Image © Protozoa Pictures and CityLights)

SUMMER 2019 VFXVOICE.COM • 93




[ VES NEWS ]

New York VES Awards Celebration Salutes VFX Excellence By NAOMI GOLDMAN

On March 1, the VES New York Section hosted its 5th Annual Awards Celebration. Held at The Green Building in Brooklyn, the VES Awards “after-after party” celebrated the finest achievements in visual effects artistry – from New York’s VES Award winners and nominees to local VFX heroes. The Section established its Awards Celebration in 2015 to recognize local VFX accomplishments and bring together the vibrant New York community. Live music, an atmospheric venue, generous sponsors and local honorees contributed to a high-class evening of networking and celebration. The event was an instant success and has since become a must-attend annual affair. The VES New York Empire Award was established in 2015 to recognize a New York-based visual effects professional who has made enormous contributions to the field of visual arts and filmed entertainment, and whose work embodies the spirit of New York. The Empire Innovator Award was created in 2019 to broaden the recognition to innovators who have advanced the field through cutting-edge technology. Digital pioneer Bob Greenberg, Founder and Executive Chairman of R/GA, received the 2019 VES New York Empire Award for his pioneering work in visual effects and his trailblazing role in the advertising and communications industry. Under his leadership, R/GA has been at the forefront of VFX since the 1970s, evolving from a world-class movie title shop, to a digital studio, to a major digital advertising agency and service innovator. R/GA has focused on live-action film and video production and the development of leading-edge motion graphics. The company has created groundbreaking effects for movies such as Alien, Predator and Se7en, with a body of work spanning 400 feature films and 4,000 television commercials. Greenberg has received an Academy Award, Clio and Cannes Lions honors, and induction into the Advertising Hall of Fame. 
Upon receiving the award, Greenberg remarked, “It’s an interesting award for me because production and special effects have been such a big part of my career.” As Greenberg shared that the focus of his work is now on AR, VR, AI and robotics, he stated, “We’ll never stop innovating. We’ll never stop being a creatively-driven organization, exploring themes and emphasizing entrepreneurship.” In closing, he urged all the attendees to “keep

94 • VFXVOICE.COM SUMMER 2019

suspending disbelief!” The inaugural Empire Innovator Award was presented to Jonah Friedman and Andy Jones. The duo was honored for their technological innovations that have dramatically influenced and elevated the visual effects industry in New York City, including their participation in the development of the open-source tool Cryptomatte, which has quickly become an industry standard since its introduction in 2015. Friedman is the Bifröst Product Owner at Autodesk; he has been a software engineer, photographer and 3D artist in a wide range of sub-disciplines. Jones is CTO at Psyop, following his tenure as Head of VFX. He has worked in VFX and animation in a variety of disciplines, including 3D, compositing, VFX supervision, software and systems engineering. Friedman gave kudos to his VFX peers in accepting the award with Jones: “Visual effects artists are the most innovative and knowledge-hungry people I have ever met, bar none. We are trying to recreate the world ... literally everything under the sun. The visual effects industry is where art, science and technology all come together in this pressure cooker, and the results are really just magic.” “Bob, Jonah and Andy are innovators who have elevated and advanced the visual effects industry through their inventive work,” said Sean Curran, New York VES Awards Celebration Chair. “Through their commitment to the art and science of VFX, they exemplify what is achievable through creative vision and technical expertise and have had an impact on visual effects here in New York and globally.” “It was truly a memorable evening,” said Dan Schrecker, New York Section co-Treasurer and VES Board member. “Each year we feel as though the Awards Celebration gets better and better, and we look forward to many more in the years to come.”

TOP LEFT: VES Executive Director Eric Roth celebrates with Innovator Award honorees Andy Jones and Jonah Friedman, and Empire Award honoree Bob Greenberg. TOP RIGHT: VES members and guests enjoy the 5th Annual New York VES Awards Celebration.



[ FINAL FRAME ]

Dragons Then and Now

Long before there was a Westeros with Daenerys Targaryen and her fiery charges, there was visual effects legend Ray Harryhausen, widely considered the godfather of screen monsters, dragons included. This shot shows Harryhausen working with a model dragon for the 1958 film The 7th Voyage of Sinbad. The visual effects technique was model stop-motion animation, dubbed ‘Dynamation’ to differentiate it from cartoon animation, and this was the first feature film whose stop-motion animation effects were shot entirely in color. Other films in the Harryhausen cinematic pantheon include 20 Million Miles to Earth, Jason and the Argonauts, One Million Years B.C., The Golden Voyage of Sinbad and Clash of the Titans, among

many others. Harryhausen died in 2013, but his legacy lives on through the Ray and Diana Harryhausen Foundation, a charitable trust based in Scotland. The Foundation is dedicated to preserving his legacy and collection, which holds an estimated 50,000 items, including original models, miniatures, stills, negatives, hard-rubber stand-in models, molds, original artwork, original equipment and screenplays. June 29, 2020 marks Harryhausen’s centenary, and the Foundation is planning a series of events and initiatives for his 100th birthday under the banner #Harryhausen100. Harryhausen was inducted into the VES Hall of Fame in 2018. More information can be found at www.rayharryhausen.com. Photo generously provided by the Ray and Diana Harryhausen Foundation.

96 • VFXVOICE.COM SUMMER 2019



