VFX Voice - Summer 2017 Issue


VFXVOICE.COM

SUMMER 2017

GAME OF THRONES SUMMER VFX

FILMS • TV • GAMES
STEPHANE CERETTI • JOHN FRAZIER • ROB LEGATO




[ EXECUTIVE NOTE ]

Welcome to the Summer Issue of VFX Voice! Thank you for your support and enthusiasm in making the launch of our signature publication a success – one that has surpassed our hopes and expectations. In creating VFX Voice, we envisioned it as a source of information and inspiration to tether our growing community even closer together. On behalf of the Society, we are gratified to hear your positive feedback from around the globe and will use it to keep delivering insightful stories that advance our field. This proud endeavor has only just begun, and we’re excited to bring you stories that continue to elevate the art of visual effects – and salute the talented artists who bring us truly remarkable imagery to help tell the most extraordinary stories.

Our Summer issue features a special focus on television VFX and a compelling roundup of highly anticipated summer movies with major VFX components. A Golden Age of TV VFX captures the dynamics of working on shows with standard-setting VFX and shorter timeframes. Summer VFX Blockbusters surveys Summer 2017’s avalanche of big-budget VFX films driving this season’s box office. VR Trends takes a 360-degree look at virtual filmmaking, examines how the movement is affecting the visual effects industry worldwide, and then spotlights what’s happening in the ad industry in light of the VR and VFX evolution. We take a look at the hottest VFX-infused video games and then head out to VFX theme parks to check out how they are changing the face of special-venue entertainment. You’ll find compelling profiles of industry leaders and pioneers who share what drives them and how they got where they are – including prolific supervisors Rob Legato, ASC, John Frazier and Stephane Ceretti – and a craft roundtable with industry luminaries from across the VFX world. Add to that historical reviews in VFX Vault, V-Art, the latest VES News and a profile on the VES Bay Area section – our oldest regional group.

As we reach the midpoint of our milestone 20th Anniversary year, we appreciate you as VES members, readers, advertisers and valuable contributors to our mutual success. Please share the magazine with your colleagues in the industry at large and help us grow the community. Visit www.vfxvoice.com for exclusive Web features and news between issues. And let us know your thoughts – we want to hear your ideas. Thank you again for joining us on this journey.

Mike Chambers, Chair, VES Board of Directors

Eric Roth, VES Executive Director

2 • VFXVOICE.COM SUMMER 2017




[ CONTENTS ]

FEATURES

8 THEME PARKS: DARK RIDES – Movie experiences draw crowds around the world.

VFXVOICE.COM

DEPARTMENTS

2 EXECUTIVE NOTE
100 VES SECTION SPOTLIGHT

16 COMMERCIALS: CREATIVE SURGE – Ad agencies clamor for latest in photorealism.
22 PROFILE: ROB LEGATO, ASC – Three-time Oscar® winner is changing moviemaking.
30 FILM: SUMMER BOX OFFICE – Analysts value summer sequels in the billions.
32 FILM: SUMMER VFX BLOCKBUSTERS – Highlights from the major effects-driven films.
36 INDUSTRY ROUNDTABLE – Well-known effects supervisors discuss key issues.

106 THE VES HANDBOOK
108 V-ART: TyRUBEN ELLINGSON
110 VES NEWS
112 FINAL FRAME: ANDREW WHITEHURST

ON THE COVER: The Night King observes the battle at Hardhome in Game of Thrones, Season 6. (Photo copyright © 2016 HBO Inc. All Rights Reserved.)

46 COVER: GAME OF THRONES – The biggest VFX moments in the show’s history.
52 TV: A VFX GOLDEN AGE – New episodic series place VFX front and center.
56 PREVIS: GAME OF THRONES – The Third Floor’s artists plot behind the scenes.
58 SFX: JOHN FRAZIER – Car crashes and explosions are his calling card.
64 VFX TRENDS: VR FILMMAKING – More VFX companies are expanding into VR.
70 FILM: CREATURES RISING – Putting a new twist on historical CG monsters.
76 PROFILE: STEPHANE CERETTI – A brilliant blend of science, art and technology.
82 GAMES: FEEL THE HEAT – Summer’s usually slow for hot games; not this year.
87 ANIMATION: THE EMOJI MOVIE – Internet icons are brought to life from scratch.
88 VFX VAULT: ROBOCOP AT 30 – ED-209 continues to grow in stop-motion stature.
92 COMPANY PROFILE: ATOMIC FICTION – Scaling up while maintaining its boutique feel.
98 FILM: FRAMESTORE’S CURTAIN-RAISER – Guardians Vol. 2’s opening sequence made to stun.
104 PREVIS: A VISIONEER’S GUIDE – Previs art and actual scenes from Guardians Vol. 2.

4 • VFXVOICE.COM SUMMER 2017




SUMMER 2017 • VOL. 1, NO. 2

VFXVOICE

Visit us online at vfxvoice.com

PUBLISHER
Jim McCullaugh
publisher@vfxvoice.com

VISUAL EFFECTS SOCIETY
Eric Roth, Executive Director

EDITOR
Ed Ochs
editor@vfxvoice.com

CREATIVE
Alpanian Design Group
alan@alpanian.com

ADVERTISING
advertising@vfxvoice.com

MEDIA
media@vfxvoice.com

CIRCULATION
circulation@vfxvoice.com

CONTRIBUTING WRITERS
Willie Clark, Andy Eddy, Ian Failes, Michael Goldman, Naomi Goldman, Debra Kaufman, Chris McGowan, Ed Ochs, Paula Parisi, Barbara Robertson, Helene Siefer

PUBLICATION ADVISORY COMMITTEE
Rob Bredow, Mike Chambers, Neil Corbould, Debbie Denise, Paul Franklin, David Johnson, Jim Morris, VES, Dennis Muren, ASC, VES, Sam Nicholson, ASC, Eric Roth

VES BOARD OF DIRECTORS

OFFICERS
Mike Chambers, Chair
Jeffrey A. Okun, VES, 1st Vice Chair
Kim Lavery, VES, 2nd Vice Chair
Rita Cahill, Secretary
Dennis Hoffman, Treasurer

DIRECTORS
Jeff Barnes, Brooke Breton, Kathryn Brillhart, Emma Clifton Perry, Bob Coleman, Dayne Cowan, Kim Davidson, Debbie Denise, Richard Edlund, VES, Pam Hogarth, Joel Hynek, Jeff Kleiser, Tim Landry, Neil Lim-Sang, Brooke Lyndon-Stanford, Tim McGovern, Kevin Rafferty, Scott Ross, Barry Sandrew, Tim Sassoon, Dan Schrecker, David Tanaka, Bill Taylor, VES, Richard Winn Taylor II, VES, Susan Zwerman

ALTERNATES
Andrew Bly, Fon Davis, Charlie Iturriaga, Christian Kubsch, Andres Martinez, Daniel Rosen, Katie Stetson, Bill Villarreal

Tom Atkin, Founder
Allen Battino, VES Logo Design

Visual Effects Society
5805 Sepulveda Blvd., Suite 620
Sherman Oaks, CA 91411
Phone: (818) 981-7861
visualeffectssociety.com

VES STAFF
Brent Armstrong, Director of Operations
Nancy Ward, Program & Development Dir.
Jeff Casper, Manager of Media & Graphics
Ben Schneider, Membership Coordinator
Colleen Kelly, Office Manager
Jennifer Cabrera, Administrative Assistant
Vicki Sheveck, Global Administrative Coordinator
P.J. Schumacher, Controller
Naomi Goldman, Public Relations

Follow us on social media

VFX Voice is published quarterly by the Visual Effects Society. Subscriptions: U.S. $50; Canada/Mexico $60; all other foreign countries $70 a year. See vfxvoice.com

Advertising: Rate card upon request from publisher@vfxvoice.com or advertising@vfxvoice.com

Comments: Write us at comments@vfxvoice.com

Postmaster: Send address change to VES, 5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411.

Copyright © 2017 The Visual Effects Society. Printed in the U.S.A.

6 • VFXVOICE.COM SUMMER 2017




THEME PARKS

3D MOVIE EXPERIENCES RULE THE WORLD

By HELENE SEIFER

TOP: DISH, the Digital Immersive Showroom from Walt Disney Imagineering, is a 360-degree “VR cave” for previsualization of Disney properties. Viewers are stationary and wear 3D glasses. Moving images are projected on all four walls in the 20-foot by 20-foot rounded-corner room.

8 • VFXVOICE.COM SUMMER 2017

Anyone remember when the best ride at the amusement park was the roller coaster with the steepest drop? Times have changed. Although coasters with multiple loops do thrill, there’s nothing like the ever-evolving, tech-heavy dark rides: enclosed experiences that immerse the rider in 3D, 4D or even 5D, all while telling a story and manipulating emotions far beyond the stomach-flopping scare of a sudden turn in the track.

Even traditional parks have ventured into the realm previously ruled by film and entertainment company theme parks. Take Six Flags, known until recently for such exhilarations as “Batman: The Ride” – a 50-mph inverted coaster. In the past few years the chain has added a Justice League dark ride to many of its parks, and this spring it took on the big guys by installing the “Justice League: Battle for Metropolis” 3D ride at Six Flags Magic Mountain, just a nanosecond from Southern California juggernauts Universal and Disneyland.

Designed and built by Sally Corporation, a dark-ride and animatronic company, the Justice League ride zooms guests past interactive animatronic characters in six-person motion-platform vehicles, and features real-time gaming. Riders wear 3D goggles and blast nefarious characters with laser guns to free the good guys, defeat Lex Luthor and the Joker, and save the universe. To create an immersive experience, a 360-degree screen is fashioned by offsetting two 280-degree toroidal screens. Rich Hill, Sally Corp.’s Creative Director and lead designer on the ride, is excited by the ride options this degree of immersion allows. “It’s almost a virtual reality experience. We can break the plane of the screen and even do virtual loops.”

It’s de rigueur for dark rides to tell a story, usually based on existing characters with a built-in fan base, and a successful ride needs to take the disparate elements of story and effects and make magic. Previsualization allows creators and clients alike to preview the ride in 3D and adjust the timing, visuals and script before the attraction is built. It can focus on the screen content alone, the arc of the entire ride, or even preview a whole park.

At its most basic, a previsualization company such as The Third Floor will “take what the creatives envision and put it in a place that they can look at it,” explains Brian Pace, The Third Floor’s Head of the Virtual Art Department. “The purpose of these visualizations is that we can illustrate and refine the story that the riders will experience, from the point of view of that rider, while also taking technical parameters, such as speed of the ride on the track, dimensions of the screen and stereoscopic rendering, into account. Often this is the first time clients are able to see their creations in context and from there they can focus on refining their creation.”
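To make that rider-POV idea concrete: one common way to stage such reviews in Maya – the package Pace’s team works in – is to drop a review camera at each seat position and aim them all at the ride screen, so content can be checked from every rider’s perspective. The sketch below is purely hypothetical; the seat coordinates, names and screen position are invented for illustration and are not The Third Floor’s actual setup.

import maya.cmds as cmds

# Hypothetical seat positions in vehicle space (meters); a real setup
# would take these from the ride-vehicle and track data.
SEAT_POSITIONS = [(-1.5, 1.2, 0.0), (-0.5, 1.2, 0.0),
                  (0.5, 1.2, 0.0), (1.5, 1.2, 0.0)]

def build_seat_cameras(screen_center=(0.0, 3.0, 12.0)):
    """Create one review camera per seat, all aimed at the screen center."""
    target = cmds.spaceLocator(name="screenCenter")[0]
    cmds.xform(target, translation=screen_center)
    cameras = []
    for i, pos in enumerate(SEAT_POSITIONS):
        cam = cmds.camera(name="seatCam_%d" % i)[0]
        cmds.xform(cam, translation=pos)
        # An aim constraint keeps each camera looking at the screen even
        # if the vehicle or the screen is animated later.
        cmds.aimConstraint(target, cam)
        cameras.append(cam)
    return cameras

build_seat_cameras()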

TOP LEFT and RIGHT: Sally Corp.’s new Magic Mountain dark ride, “Justice League: Battle for Metropolis.” In “Street Battle,” The Joker and his henchmen run wild in the streets of Metropolis. In “S.T.A.R. Labs Battle,” The Joker blasts riders with “laughing gas.” (Photo credit: Copyright © 2012 Kevin Brown)

BOTTOM: Walt Disney Imagineering, with James Cameron and Lightstorm Entertainment, is readying Walt Disney World’s new attraction, “Pandora – the World of Avatar,” opening this year in Orlando. Concept art shows riders navigating the interactive bioluminescent rainforest through which the Na’vi River Journey travels. (Photo copyright © 2016 The Walt Disney Company. All Rights Reserved.)

SUMMER 2017

VFXVOICE.COM • 9




THEME PARKS

When Universal Creative approached The Third Floor to previs the ride video for “Despicable Me: Minion Mayhem 3D,” the previs team used concept art from the film’s producer, Illumination, considered the ride’s multiple seating locations, and conformed their work to Universal’s custom screen in order to show what the ride experience would be like. Pace works with the 3D program Maya to visualize how screen content reads from different perspectives – how well the illusion of being inside a new world is maintained. “With a 3D feature film, you can cut around a lot, such as when we worked on the movie Avatar. With a ride, what you’re seeing in a scene, you’re supposed to be a part of it. Shots are longer, more continuous than for feature films.”

If a client has a similar existing ride, The Third Floor will modify their previs to play back in that environment to see if the execution works or needs adjustment. For overseas clients, they’ll often deliver the previs for viewing on VR headsets abroad.

Bei Yang, WDI Technology Studio Executive for Walt Disney Imagineering, is one of the architects of DISH, the Digital Immersive Showroom, an innovative 360-degree “VR cave” for previsualization of Disney properties.

10 • VFXVOICE.COM SUMMER 2017

TOP LEFT and RIGHT: In development for ITEC Entertainment are the High Tech 3D Theater Presentation, “High Speed Indoor Coaster Dark Ride” and “Dragon Laser Effects Dark Ride.”

BOTTOM LEFT and RIGHT: Live action and concept art from the “Pirates of the Caribbean: Battle for the Sunken Treasure” 3D ride at the Shanghai Disney Resort in Shanghai, China. (Photo copyright © 2016 The Walt Disney Company. All Rights Reserved.)

“The ‘Justice League: Battle for Metropolis 3D’ ride at Six Flags Magic Mountain in Southern California is almost a virtual reality experience. We can break the plane of the screen and even do virtual loops.” —Rich Hill, Creative Director, Sally Corporation




THEME PARKS

“VR is coming on very heavily, but hasn’t yet been perfected. It was created as a personal experience, not for moving 2,000 people an hour. We still need to solve those operational problems. How do we implement VR and make it viable and sustainable?” —Bill Coan, President, ITEC Entertainment

Bei describes the 20-foot by 20-foot rounded-corner room as having “projectors and fancy tracking. Designers can inhabit their designs. You wear 3D stereo glasses, like those at a 3D movie. You can still see your colleagues. Images are projected on all four walls for a 360-degree experience.” Sitting on a normal office chair and wearing a bowler “tracking” hat, you can experience everything. “You don’t move,” Bei continues, “but we give you the experience of moving through the ride.”

Some things still need mock-ups, though. The complex environment of Walt Disney World’s new attraction, “Pandora – the World of Avatar,” needed mock-ups in addition to DISH media, including the fantastical interactive bioluminescent forest through which the Na’vi River Journey travels. There, handmade plants and floating jellyfish mingle with Floridian flora and fauna and surround the Shaman of Songs, a convincingly articulated animatronic songstress.
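Head-tracked rooms like DISH work on the same principle as the academic “CAVE” displays that preceded them: each wall is rendered with an off-axis perspective projection computed from the viewer’s tracked head position, so the imagery stays geometrically correct as the head moves. The sketch below shows the standard generalized-perspective-projection math for one wall (after Robert Kooima’s well-known formulation); the room dimensions and eye position are placeholders, and none of this describes Disney’s actual implementation.

import numpy as np

def off_axis_frustum(pa, pb, pc, pe, near):
    """Near-plane frustum bounds (l, r, b, t) for a wall with corners
    pa (lower-left), pb (lower-right), pc (upper-left), viewed from eye pe."""
    vr = (pb - pa) / np.linalg.norm(pb - pa)   # wall's right axis
    vu = (pc - pa) / np.linalg.norm(pc - pa)   # wall's up axis
    vn = np.cross(vr, vu)
    vn /= np.linalg.norm(vn)                   # wall normal, toward the eye
    va, vb, vc = pa - pe, pb - pe, pc - pe     # eye-to-corner vectors
    d = -np.dot(va, vn)                        # eye-to-wall distance
    return (np.dot(vr, va) * near / d,         # left
            np.dot(vr, vb) * near / d,         # right
            np.dot(vu, va) * near / d,         # bottom
            np.dot(vu, vc) * near / d)         # top

# One wall of a hypothetical 20-foot room, with the tracked head slightly
# off center; this is recomputed every frame as the head moves.
pa = np.array([-10.0, 0.0, -10.0])   # lower-left corner (feet)
pb = np.array([ 10.0, 0.0, -10.0])   # lower-right corner
pc = np.array([-10.0, 10.0, -10.0])  # upper-left corner
eye = np.array([1.0, 5.5, 0.0])      # tracked head position
print(off_axis_frustum(pa, pb, pc, eye, near=0.1))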

12 • VFXVOICE.COM SUMMER 2017

TOP LEFT and BOTTOM LEFT: “Ultimate Energy” is a 5D theater that integrates 3D projection, live actors, props, stunts and special effects into one ride. Riders are transported into a future in which people are at war with machines. Guests visit the Mech Warriors and War Museum, which commemorates the history of a fictional global conflict. Both experiences are part of Wanda Movie Park, an indoor movie theme park in Wuhan, China, offering a variety of film entertainment such as 4D and 5D cinema, a flight theater, immersive theater, interactive theater and space theater.

TOP RIGHT and BOTTOM RIGHT: Two ITEC-designed attractions opened in Wuhan, China, in late 2014 – “Ultimate Energy” and “Power of Nature.” “Power of Nature” is a virtual simulation ride where guests become storm chasers and are transported into powerful natural disasters. The experience comes to life in the Extreme Weather Institute, using the largest 3D screen in the world. Pictured is the lobby area of “Power of Nature.” The vehicle image is concept art.




THEME PARKS

“With a 3D feature film, you can cut around a lot, such as when we worked on the movie Avatar. With a ride, what you’re seeing in a scene, you’re supposed to be a part of it. Shots are longer, more continuous than for feature films.” —Brian Pace, Head of the Virtual Art Department, The Third Floor

Whatever the story, previs, planning and the right technology are key. At the Harry Potter attractions at Universal Studios Hollywood and Florida, robotic arms hold the vehicle’s seats, enabling the motion to imitate and interact with the visuals as riders visit the sites from the beloved books and movies. One interesting detail can be found in Florida’s Hogwarts™ Express train connecting Hogsmeade™ Station in Universal’s Islands of Adventure with King’s Cross Station in Universal Studios Florida: The train extends the Harry Potter experience, but no technology existed to believably convey movement of Harry Potter elements past the windows as the train rolled from one park to the next.

“Clients come to us with what we consider science projects,” explains Bill Coan, ITEC Entertainment President. “We work with them to create a tech solution that doesn’t exist.” For the Harry Potter train windows, for example, a bicycle rider outside should be seen from a different perspective from each window. “It needed to be believable, but the tech didn’t exist. It took two to three years to develop that, without pixelation. We worked with the Swiss train company that was providing the actual trains.” During the time ITEC worked on the solution, screen technology itself advanced and became high definition. “We engineered it and then had to re-engineer it. It was a moving target!”

Previewing the ride is one thing; having it run is another technological process. As ITEC’s Coan notes, “The Disneys and Universals don’t execute it by themselves. They create, then hire us to help. We’ve done some 60 attractions for Universal Studios – that’s disparate pieces and parts and systems. The sound, the lighting, the effects, all highly timed. For example, we had a very specific role on the ‘Skull Island: Reign of Kong’ ride. We made the automated technology system which controls and automates the brain and the safety systems.”

Even though each designer strives to make a ride experience that is loaded with bells and whistles and a heavy dose of wow, it’s equally important to recognize the target audience and the goal of the ride. At an average length of four minutes, deemed the perfect wait-time-to-thrill-time ratio, an attraction has limited time to accomplish its aims while also considering age limitations and how many people should move through the ride at a time.

14 • VFXVOICE.COM SUMMER 2017

TOP: “Despicable Me: Minion Mayhem 3D” is a simulator ride attraction at Universal Studios Florida, Universal Studios Hollywood and Universal Studios Japan. The Third Floor was tapped to work with Universal in visualizing content and interaction for this 3D ride.

As Coan points out, “VR is coming on very heavily, but hasn’t yet been perfected. It was created as a personal experience, not for moving 2,000 people an hour. We still need to solve those operational problems. How do we implement VR and make it viable and sustainable?” As for fast rides, Coan continues, “We know that 60-mph roller coasters eliminate 80% of people (from riding them).”

The Third Floor’s Pace adds, “Sometimes it depends on how old the audience is. We won’t put 3D glasses on kids, for example. As for ride intensity, we are occasionally tasked with having the ride lurch. The amount of the lurch is based on average height and tolerance, which is localized for the country. In China, we have to tone down the rocking because they have a lower tolerance for it.”

Asia is the fastest-growing market for theme parks, so understanding that population is critical. Sally Corp. has created animated musical shows in China, including animatronic penguins for the Penguin Hotel in Chengdu. They are acutely aware of the limitations in what they can sell there. “In Chinese mythology, they don’t really have ghosts and creepy spirits like we do,” explains Sally Corp.’s Hill. “So a standard haunted house with ghosts chasing us doesn’t work there. Their mythology is around history and battles – those are their scary things. We’ve developed a ride around AMC’s The Walking Dead, and typical to the story of the show we visit survival camps and in between there are battles. That would play well in China and we think will sell there.”

Disney recently opened its park in Shanghai, and the “Pirates of the Caribbean” ride is a blockbuster attraction. Bei Yang says, “The one in Anaheim has vignettes of stories, but in Shanghai, we built to the world of the Pirates movies. The overall scale is bigger, the boats are precisely timed to allow amazing special effects.

Gigantic multi-screens combine with actual sets. Perspective rendering can extend the scenes.” His concern with building in other countries is making certain it still feels like a Disney Park, but “because it’s regional, things in there have to align with cultural norms. In China, the family unit is slightly different, not like the nuclear family of North America. Usually grandparents come along. How do you make sure they have a fun ride, too? Or if they don’t go on the ride, how do they share in the experience?” One way is to include parts of the ride that are visible to grandparents waiting outside.

ITEC is designing and building six projects in Vietnam, including Kim Quy (translated as “Golden Turtle”) Amusement Park, designed for Sun Group, which will partially open in late 2018. Coan describes the 250-acre park as being “story driven. One part city walk, one part theme park. The core attraction is ‘The Matrix’. It’s super high-tech – cyber and digital, VR, AR, 5D, 6D theaters, low capacity, high end.” ITEC also has projects in China, including two for the Wanda Group: “The Power of Nature,” a virtual simulation ride where guests chase powerful natural disasters; and “Ultimate Energy,” a ride into a future when people are at war with machines.

Coan assesses the ride landscape in China. “Disney just arrived in Shanghai, so the bar has been raised. Universal is coming in 2019. We’re working with them there. China wants the same storytelling, the same thrills, but they don’t have the support. I have to arm wrestle with the clients because we need to create something they can sustain and maintain without sophisticated talent. We’re also localizing more. Generally, people in the East are smaller than in the West, but the West builds more of the equipment, so there’s not so much customization now.”

Another growing market is the Middle East, where gender roles come into play. “Sally Corp. did work for a company there and when they came to see us they requested that all the women leave the shop,” said Hill. “We refused. If you’re visiting us, we expect you to respect our rules. When we go there, we’ll respect theirs. Justice League rides would take convincing to build there because of Wonder Woman, Harley Quinn and Supergirl. DC Comics and Warner Bros. would have to push for it. We built some animatronics in Dubai and had to cross some of those gender issues.”

Wherever in the world rides are built, increasingly sophisticated screens and media, combined with animatronics and scenics, enhance the feeling of being part of a real place. Toroidal curved screens, which encompass our peripheral vision, continue to expand and allow massive amounts of digital data (The Third Floor provided 60-fps stereo-media content for a tram-based ride that was over 25,000 pixels wide by 2,160 tall). This is how magic is created.

Imagineering’s Yang sums it up. “Even though I know how it’s done, it blows my mind when the illusion absolutely works. The border between what is shot versus CG now, it’s the same pattern in rides. The technology is amazing. It’s always a joy to see, but it’s not technology for technology’s sake. All of this is aiming to create experiences. It’s a privilege to be in the field of special effects in the service of one of the most human things we can do: tell a story.”
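To put that tram-ride figure in perspective, a quick back-of-envelope calculation shows why such content qualifies as massive amounts of digital data. The bit depth and channel count below are assumptions for illustration, not The Third Floor’s actual delivery spec:

# Rough uncompressed data rate for the tram ride quoted above.
width, height = 25_000, 2_160   # pixels per eye (the article's figures)
eyes = 2                        # stereo
fps = 60
bytes_per_pixel = 3             # assuming 8-bit RGB, no compression

rate = width * height * eyes * fps * bytes_per_pixel
print("%.1f GB/s uncompressed" % (rate / 1e9))   # ~19.4 GB/s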

TOP: ITEC Entertainment President Bill Coan addresses dignitaries at the Kim Quy Amusement Park groundbreaking ceremony in Hanoi, Vietnam, in 2016.

MIDDLE: A high-concept design of Kim Quy Amusement Park in Vietnam. Designed by ITEC for Sun Group, the project, being built in Hanoi, partially opens in 2018. It will feature an open space for outdoor activities with cutting-edge technology, including a Sun Wheel and hot air balloons. Interactive attractions will include the latest immersive, virtual and augmented reality experiences.

BOTTOM: Entrance to the “Pirates of the Caribbean” ride at the Shanghai Disney Resort.

SUMMER 2017

VFXVOICE.COM • 15




COMMERCIALS

CREATURES, CARS AND PHOTOREALISM

By DEBRA KAUFMAN

16 • VFXVOICE.COM SUMMER 2017

Visual effects for commercials are still a healthy business, according to many visual effects facilities. Advertisers are tapping visual effects for the latest in photorealism and creatures to make their messages stand out, and a number of VFX houses that do commercials are identifying new trends and pushing the envelope to create the imagery.

At Blur Studios, “Titanfall 2: Become One” was a visually complex commercial that earned Blur a nomination for Outstanding Visual Effects in a Commercial at the 15th Annual Visual Effects Society Awards. Blur’s Dave Wilson directed it for EA/Respawn with agency Argonaut San Francisco. “It’s a very character-driven piece about a man and his ‘steed’, which in this case is an 18-foot robot,” explains Lead Compositor Nitant Karnik, whose team composited live action with the CG robots. “When our hero would be decompressing after a battle, the expression on his face was live action, but we would add flames or stress the background.” The background was a lush, forest-like atmosphere, also CG, adding to the mosaic of live action and CG.

Karnik reports he is seeing a “big push towards photorealism” in videogame commercials and promos, surpassing the level of photorealism in the video games themselves. “The games obviously do not look photoreal when you are playing them, but there are things we can do in comp to make them more believable,” he says. He notes that when people can’t tell whether something is real or computer-generated, they stop trying to distinguish. This trend of photorealism in marketing video games, says Karnik, is due to the nature of the content. “As a gamer, you envision yourself in the real world playing your character,” he says. “The game only takes you so far, and your imagination takes you the rest of the way. With a commercial, we have the liberty to push it to the realism you imagine in your head.”

What enables this advanced blend of elements is deep compositing, using a combination of Houdini and Nuke, which gives compositors more control with layers. “If you have smoke in the atmosphere, for example, in the past you would render it without the characters in the room, so when you put the smoke on top of the character, it looks like he is inside the smoke,” explains Karnik. “But if the animation changed, we’d have to re-render the smoke.” Now, he adds, with deep compositing, that smoke can be added “on the fly” and be retained even when the animation changes.
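Karnik’s smoke example captures the core idea of deep compositing: each pixel stores depth-sorted samples rather than one flattened color, so a re-animated element can be merged back in at the correct depth without re-rendering the volume. The toy Python sketch below illustrates only the principle; in production this happens inside Nuke and Houdini deep workflows, not hand-rolled code like this.

from dataclasses import dataclass

@dataclass
class DeepSample:
    depth: float   # distance from camera
    rgba: tuple    # premultiplied color plus alpha

def merge_deep(*pixels):
    """Interleave deep samples from separate renders by depth."""
    return sorted((s for p in pixels for s in p), key=lambda s: s.depth)

def flatten(samples):
    """Composite depth-sorted samples front to back with the 'over' operator."""
    r = g = b = a = 0.0
    for s in samples:
        sr, sg, sb, sa = s.rgba
        r += sr * (1.0 - a)
        g += sg * (1.0 - a)
        b += sb * (1.0 - a)
        a += sa * (1.0 - a)
    return (r, g, b, a)

# The smoke is rendered once; when the character's animation changes, only
# its samples are swapped out -- the merge and flatten are recomputed, but
# no smoke re-render is needed.
smoke = [DeepSample(2.0, (0.1, 0.1, 0.1, 0.2)),
         DeepSample(6.0, (0.2, 0.2, 0.2, 0.4))]
character = [DeepSample(4.0, (0.5, 0.3, 0.2, 1.0))]
print(flatten(merge_deep(smoke, character)))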

At Method Studios, Senior Creative Director Ben Walsh reports that his facility is seeing an uptick in creature work and photorealism. Method created hundreds of photoreal cephalopods for GE’s “Raining Octopuses,” and convincing urban destruction for a Halo 5: Guardians ad. He believes that advertisers and brands are trending towards putting more animals and creatures into commercials because technology has improved to the point that a VFX house can turn them around in record time.

That is in part because Method, like some other VFX companies, can tap into the resources and R&D of its feature film division. For example, Walsh’s team recently repurposed the musculature of a horse built for an episodic TV show as the basis of a full-CG moose for a Farmers Insurance commercial, in a speedy two months. For another spot, in which Method recently created a werewolf, Walsh explains that feature film assets are not simply copied from a movie to an ad. “You cannot just grab a werewolf from a film, because it has a distinct design,” he says. “But you can use the R&D and the lessons you learned on how to create it.”

Walsh points out that some trends are ephemeral, driven by pop culture. “They might see a cool effect in a film and want to mimic it,” he says. “It is a huge part of advertising.” He reports another trend: different clients and different projects than the usual television commercials. “We still do a lot of car work for TV, and the majority of our commercials are traditional,” he says. “But we now do a lot of secret projects in the area of technology and devices. In the last few years, VFX has gotten so good that now tech and information companies have faith in it.”

Commercials are themselves repurposed for different platforms, adds Walsh, who notes the impact of that on the VFX facility.

“As a gamer, you envision yourself in the real world playing your character. The game can only take you so far, and your imagination takes you the rest of the way. With a commercial, we have the liberty to push it to the realism you can imagine in your head.” —Nitant Karnik, Lead Compositor, Blur Studios

OPPOSITE and BOTTOM: Blur Studios’ “Titanfall 2: Become One” commercial, directed by Blur’s Dave Wilson for EA/Respawn and agency Argonaut.

SUMMER 2017

VFXVOICE.COM • 17


COMMERCIALS

CREATURES, CARS AND PHOTOREALISM By DEBRA KAUFMAN

16 • VFXVOICE.COM SUMMER 2017

Visual effects for commercials are still a healthy business, according to many visual effects facilities. Advertisers are tapping visual effects for the latest in photorealism and creatures to make their messages stand out. A number of VFX houses that do commercials are identifying new trends and pushing the envelope to create the imagery. At Blur Studios, “Titanfall 2: Become One” was a visually complex commercial that earned Blur a nomination for Outstanding Visual Effects in a Commercial at the 15th Annual Visual Effects Society Awards. Blur’s Dave Wilson directed it for EA/Respawn with agency Argonaut San Francisco. “It’s a very character-driven piece about a man and his ‘steed’, which in this case is an 18-foot robot,” explains Lead Compositor Nitant Karnik, whose team composited live action with the CG robots. “When our hero would be decompressing after a battle, the expression on his face was live action, but we would add flames or stress the background.” The background was a lush, forest-like atmosphere, also CG, adding to the mosaic of live action and CG. Karnik reports he is seeing a “big push towards photorealism” in videogame commercials and promos, surpassing the level of photorealism in the video games themselves. “The games obviously do not look photoreal when you are playing them, but there are things we can do in comp to make them more believable,” he says. He notes that when people can’t tell if it’s real or computer-generated, they forget to distinguish. This trend of photorealism within the context of marketing video games, says Karnik, is due to the nature of the content. “As a gamer, you envision yourself in the real world playing your character,” he says. “The game only takes you so far, and your imagination takes you the rest of the way. With a commercial, we have the liberty to push it to the realism you imagine in your head.” What enables this advanced blend of elements is deep compositing, using a combination of Houdini and Nuke, which gives compositors more control with layers. “If you have smoke in the

atmosphere, for example, in the past you would render it without the characters in the room, so when you put the smoke on top of the character, it looks like he is inside the smoke,” explains Karnik. “But if the animation changed, we’d have to re-render the smoke.” Now, he adds, with deep compositing, that smoke can be added “on the fly” and be retained even when the animation changes. At Method Studios, Senior Creative Director Ben Walsh reports that his facility is seeing an uptick in creature work and photorealism. Method created hundreds of photoreal cephalopods for GE’s “Raining Octopuses,” and convincing urban destruction for a Halo 5: Guardians ad. He believes that advertisers and brands are trending towards putting more animals and creatures into commercials because technology has improved to the point that a VFX house can turn them around in record time. That is in part because Method, like some other VFX companies, can tap into the resources and R&D from its feature film division. For example, Walsh’s team recently repurposed the musculature of a horse built for an episodic TV show and used it for the basis of a full CG moose, for a Farmer’s Insurance commercial, in a speedy two months. For another spot, in which Method recently created a werewolf, Walsh explains that feature film assets are not simply copied from a movie to an ad. “You cannot just grab a werewolf from a film, because it has a distinct design,” he says. “But you can use the R&D and the lessons you learned on how to create it.” Walsh points out that some trends are often ephemeral, driven by pop culture. “They might see a cool effect in a film and want to mimic it,” he says. “It is a huge part of advertising.” He reports another trend: different clients and different projects than the usual television commercials. “We still do a lot of car work for TV, and the majority of our commercials are traditional,” he says. “But we now do a lot of secret projects in the area of technology and devices. In the last few years, VFX has gotten so good that now tech and information companies have faith in it.” Commercials are themselves repurposed for different platforms, adds Walsh, who notes the impact of that on the VFX facility. “It makes it a bit more time-consuming for the Flame

“As a gamer, you envision yourself in the real world playing your character. The game can only take you so far, and your imagination takes you the rest of the way. With a commercial, we have the liberty to push it to the realism you can imagine in your head.” —Nitant Karnik, Lead Compositor, Blur Studios
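Karnik’s smoke example is easier to picture with a toy model of what deep compositing stores. The sketch below is a minimal, self-contained illustration of the idea – per-pixel samples carrying depth, merged and flattened front to back – and not Blur’s actual Houdini/Nuke setup; the sample data and function names are invented for the example.

    from typing import List, Tuple

    # One deep sample: (depth, r, g, b, alpha), with premultiplied color.
    Sample = Tuple[float, float, float, float, float]

    def merge_deep(*sample_lists: List[Sample]) -> List[Sample]:
        """Combine deep samples from separate renders, sorted near to far."""
        merged = [s for samples in sample_lists for s in samples]
        return sorted(merged, key=lambda s: s[0])

    def flatten(samples: List[Sample]) -> Tuple[float, float, float, float]:
        """Front-to-back 'over' compositing of depth-sorted samples."""
        r = g = b = a = 0.0
        for _, sr, sg, sb, sa in samples:
            r += sr * (1.0 - a)
            g += sg * (1.0 - a)
            b += sb * (1.0 - a)
            a += sa * (1.0 - a)
        return r, g, b, a

    # The smoke is rendered once as depth samples; the character can move
    # between those samples without a smoke re-render - only the cheap
    # merge-and-flatten step re-runs.
    smoke = [(2.0, 0.1, 0.1, 0.1, 0.2), (6.0, 0.2, 0.2, 0.2, 0.4)]
    character = [(4.0, 0.5, 0.3, 0.2, 1.0)]  # opaque, between the smoke layers
    print(flatten(merge_deep(smoke, character)))

Because each render keeps its samples at their true depths, repositioning the character only changes which smoke samples end up in front of it – the “on the fly” behavior Karnik describes.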

OPPOSITE and BOTTOM: Blur Studios’ “Titanfall 2: Become One” commercial, directed by Blur’s Dave Wilson for EA/Respawn and agency Argonaut.


ABOVE: The Mill’s “Human Race Club” Chevrolet commercial.


artist or whoever outputs the final job, knowing they will have to do a portrait phone version of a commercial,” he says. “At the end of the day, we prefer not to have to worry about the look or framing based on an iPhone. The user can rotate the iPhone and see it in proper aspect ratio.” Instagram is a different story. “The Instagram version has to be pan-and-scanned, cropped old school,” he says. At Luma Pictures, General Manager Jay Lichtman agrees with Walsh’s assessment. He reports that, “more than ever, clients and brands understand that the 60-second TV commercial is not the be-all and end-all.” He notes some of the challenges this shift raises. Advertisers are allocating money to many platforms, he says, which can “eat away at the marketing dollars.” The result is

creating far more savvy marketers, says Lichtman, and the tools have to be more varied. “About a decade ago, there was a lot of value put in the craft of content creation,” he says. “Now it is more the quantity of content creation, because of the ability of the brand to be present everywhere, and there’s a need to populate all those centers.” He believes that eventually there will be a unified platform, which will improve the situation and, he hopes, restore VFX to its rightful place as one of the top line items in a budget. At Method Studios, Walsh reports that the work his company does to create versions for other platforms does not impact how Method works. But he does not dismiss the importance of online viewing of commercials. “Once you do the work of getting a commercial out there, not everyone sees it on TV,” he says. “People can watch Super Bowl ads online before the game. Online views as opposed to TV are huge for us and everyone else.” Recently, Method’s New York office designed and built the official Association of Independent Commercial Producers Sponsor Reel Director’s Cut for its awards show. “We motion-captured dancing characters and applied all these different textures and put it to Major Lazer’s song ‘Light It Up,’” he says. When Major Lazer saw it, the group put it up as its music video on YouTube, which has so far gotten 131 million views. That led to a call from Facebook. “It just shows how things are changing through the social media world as opposed to traditional TV commercials,” says Walsh. At The Mill, Group CG Director Vince Baertsoen reports that his New York facility is doing more and more interesting commercial work every year, and not just for TV. “We used to do commercials primarily for broadcast, but now the media are expanding to all sorts of platforms and screens,” he says. “To make all that content exciting, you need to have visual effects.” Speedier tools and a more

TOP: The Mill’s “Human Race Club” Chevrolet commercial.

“It isn’t always easy to implement new technology to work faster and more efficiently. It just takes time, and you need new skill sets and developers to do this integration. VFX is about looking for solutions.” —Vince Baertsoen, Group CG Director, The Mill





TOP and LEFT: The VES Award-winning 2016 Christmas “trampoline” ad for John Lewis: Buster the Boxer, directed by Dougal Wilson for agency adam&eveDDB, with VFX by MPC. MIDDLE RIGHT: Method Studios created hundreds of photoreal cephalopods for GE’s “Raining Octopuses” commercial. Directed by Christopher Riggert for agency BBDO.

“Once you do the work of getting a commercial out there, not everyone sees it on TV. People can watch Super Bowl ads online before the game. Online views as opposed to TV are huge for us and everyone else.” —Ben Walsh, Senior Creative Director, Method Studios


efficient VFX pipeline have helped this kind of VFX work become a regular feature at The Mill, says Baertsoen, but he notes that it’s a learning curve. “It isn’t always easy to implement new technology to work faster and more efficiently,” he says. “It just takes time, and you need new skill sets and developers to do this integration. VFX is about looking for solutions.” The Mill has found some crucial solutions by creating its own tools, used for the first time on “The Human Race,” a Chevrolet ad that pits a human driving a 2017 Chevrolet Camaro ZL1 against the Chevrolet FNR autonomous concept car, powered by artificial intelligence. To accomplish the task, The Mill used its Blackbird, an adjustable rig that serves as a skeleton onto which a CG car is layered. Blackbird has won both an Innovation Lion at Cannes and an HPA (Hollywood Professional Association) Judges Award for Creativity and Innovation. The Mill also used its virtual production toolkit, Mill Cyclops, and Epic’s Unreal Engine for real-time rendering. Blackbird, says Baertsoen, took care of the “lack of car availability,” and Cyclops solved issues related to on-set visualization. “It enables directors and creatives to see their digital creations on set, lit,” he says. “It also has the potential to optimize the VFX pipeline in some ways.” The company has also created a huge number of creatures in the last year. A CG orangutan, created for energy company SSE, won multiple awards at the 14th Annual VES Awards. “Clients and agencies are inspired to open up the creative and try different things when the technology allows,” Baertsoen says.
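The Blackbird idea – shoot a tracked, adjustable proxy rig on location, then layer the CG car over it – comes down to replaying the rig’s recorded pose on the digital body every frame. Below is a hedged sketch of that one step, with invented pose data and names; it is not The Mill’s actual toolkit, which also captures wheels, suspension and full six-degree-of-freedom motion.

    import numpy as np

    def yaw_matrix(theta: float) -> np.ndarray:
        """Rotation about the vertical axis, theta in radians."""
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, 0.0, s],
                         [0.0, 1.0, 0.0],
                         [-s, 0.0, c]])

    def apply_pose(vertices: np.ndarray, position: np.ndarray, yaw: float) -> np.ndarray:
        """Rigidly move the CG car body so it follows the tracked rig."""
        return vertices @ yaw_matrix(yaw).T + position

    # One frame of hypothetical tracking data: rig 12m down the track,
    # heading rotated 30 degrees.
    car_body = np.array([[1.0, 0.5, 2.0], [-1.0, 0.5, 2.0], [0.0, 0.5, -2.0]])
    print(apply_pose(car_body, position=np.array([12.0, 0.0, 3.0]), yaw=np.radians(30.0)))

Rendering that repositioned body in a real-time engine such as Unreal is what lets a tool like Cyclops show directors the composed shot on set rather than months later.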





PROFILE

ROB LEGATO: INTUITIVE INNOVATOR By PAULA PARISI

BELOW: Legato, ASC, created the “pan-and-tile” system to template the rocket launch for Ron Howard’s Apollo 13. He captured a series of “generic” angles on Cape Canaveral that allowed for more options when he later constructed the shot using a combination of CG and live elements. (Photos copyright © 1995 Universal Pictures. All Rights Reserved.)


Rob Legato, ASC, didn’t set out to revolutionize an entire industry; he was just trying to make movies in a way that felt intuitive to him. For someone trained as a cinematographer, that meant the freedom to change shots based on what he saw through a camera viewfinder, rather than adhering rigidly to storyboards created by an artist far removed from the set. Legato’s career, during which he won three Academy Awards for Best Visual Effects – for James Cameron’s 1997 Titanic, 2011’s Hugo and 2016’s The Jungle Book – has been dedicated to dissolving the wall between production and post, creating a new filmmaking vocabulary around virtual cameras and virtual sets. His techniques are now being adopted by others as the live-action and digital worlds blend. The changes upend decades of established practice in which visual effects were rigidly dictated by storyboards and digital renderings created far from the set and costly to change.

For Legato, the tectonic shift began with the “pan-and-tile” system he created to template the rocket launch for Ron Howard’s Apollo 13 in 1995 (for which he received an Academy Award nomination). Having endured previous frustrations, where he was restricted in his VFX camera moves by the photography of the live-action plates, he shot a series of “generic” angles on Cape Canaveral that would allow for more options when he later constructed the shot using a combination of CG and live elements. He refined and expanded his techniques to the point where, 14 years later, he presented them to Cameron as a viable means of directing virtual actors in real time on the set of Avatar. (Although Legato is credited as “virtual production conceived by,” he was not part of the Avatar production team, choosing to continue his collaboration with Martin Scorsese instead.) Legato credits Cameron with funding him to develop a “large-scale version” of the simulcam system that made The Jungle Book possible. Further advances meant that when he joined director Jon Favreau in 2014 to start The Jungle Book, the director could live-direct one actor whose performance was being captured photographically alongside a cast of virtual actors on an entirely digital set. With The Lion King (slated for 2019 release), on which Legato is Visual Effects Supervisor, the techniques will leap forward once again.

“I was going for something more improvisational and lifelike, and not so studied,” recalls Legato, who had just returned from a field trip to Africa with Favreau for pre-production on The Lion King. “Visual effects sequences can have a stiffness about them, because they’re created in dark rooms, sometimes halfway around the world, by CG artists who may or may not have live-action reference footage.

“It’s like, had they known what the three shots before it would have felt like, with music, they might have made that fifth one different. But you can’t do that if you’re executing a storyboard and everything just gets cut together later by a third person, the editor.” The result, in Legato’s opinion, was effects that seemed “disembodied from the film.”

For The Jungle Book, the idea was to have an infinite jungle set in downtown Los Angeles, for purposes of creative control and so the filmmakers didn’t have to endure the hardships of a remote location. The need for lifelike talking animals was another

complicating factor. Legato, Visual Effects Supervisor and second-unit director on the film, was charged with creating a workflow that was essentially identical to shooting a live-action movie. He accomplished this in several steps, the first being to work with Favreau to template the shots, a process he calls “visualizing the movie,” as distinct from “pre-viz.” “Previsualization has a cartoon connotation, and is usually done by an outside house,” says Legato, who creates his own visualizations as part of his effects process. “The discipline and commitment to this system is what made The Jungle Book so special,” Legato says. “The whole movie was shot exactly like a live-action film.” For starters, Legato created the visualization not by mousing around a 3D computer environment populated by CG characters, but by connecting a virtual camera that he moved through physical space, approximating the controls of the Arri Alexa that would be used on set. This visualized world is very specific. “We know if the set is 18 feet out from the wall that there

TOP: Digital Titanic on a digital sea created by Digital Domain for the 1997 film. Legato earned an Academy Award for his work on Titanic. (Photo copyright © 1997 Twentieth Century Fox. All Rights Reserved.) ABOVE LEFT: Legato at work on The Jungle Book, for which he received an Academy Award in 2017. (Photo credit: Glen Wilson. Copyright © 2014 Disney Enterprises, Inc. All Rights Reserved.) ABOVE RIGHT: Candid moment on the set of Hugo with director Martin Scorsese, Director of Photography Robert Richardson, ASC, and Legato. (Photo credit: Jaap Buitendijk. Copyright © 2010 Paramount Pictures/GK Films. All Rights Reserved.)






TOP: Legato cites recreating Howard Hughes’s Beverly Hills plane crash for The Aviator as a pivotal stage in his evolution. As with the Apollo rocket launch, it became one of Legato’s signature second-unit action scenes. (Photo copyright © 2004 Warner Bros. Pictures. All Rights Reserved.) BOTTOM LEFT: Behind the scenes with a model of the Titanic. (Photo copyright © 1997 Twentieth Century Fox. All Rights Reserved.) BOTTOM RIGHT: Clock plate on the set of 2011’s Hugo. Legato won an Academy Award for Best Visual Effects for Hugo.


is going to be a crane positioned here and an extension arm there. Then we execute that vision.” The big advantage was that when Legato was working in that visualized world, he could experiment with different possibilities to figure out what worked best – something that becomes, if not impossible, certainly impractical when a computer artist is building scenes based on storyboards and changing a camera angle can cost tens of thousands of dollars. “Because I’m a second-unit director, I like to stage things for the camera, and what I stage for the camera is really for how it’s going to be cut together for a sequence. So my brain is always in that mode. I see the shot before me, and as I’m moving it’s telling me how to frame and how to do all of these things.” The upshot is that even though the final images are captured according to the visualized blueprint, they feel looser because the blueprint was mapped out in a more intuitive style. “So even if it’s a visual effect, it still has this off-handed life to it, kind of like when dialogue works really well it feels like the actors are saying it for the first time. It’s fresh, it’s alive, it’s natural. The same holds true for shooting.” For The Jungle Book, Legato helped Favreau capture “the inspiration of the moment, so it would feel like this scene is really beautiful because the sun is here, and the actor with the animals over there makes a great composition. It feels great for the movie, not for the effect.” Shooting took place on a giant bluescreen stage at Los Angeles Center Studios, where the parameters of the virtual sets were

digitally mapped onto the stage. Both Favreau’s cameras and the actors had motion sensors attached, so their movements could be conformed in the virtual environment and displayed on computers in real time. The animal performances were powered by actors digitally assigned a walk cycle, or through the manipulation of motion-sensitized objects. “If they take a step, the four-legged creature takes a step. If they walk fast, the creature walks fast. There was an analog person, capable of taking direction, standing in for the creature.” The same held true for the virtual camera. Legato and Director of Photography Bill Pope, ASC, could map moves by tracking sensor-equipped platforms – a crane, a dolly, a helicopter or a hand-held camera – capturing those moves in the virtual set. Since there are no live-action characters in The Lion King, there is talk of having the actors who power the animals wear VR headsets as they are performance-captured on the virtual set – effectively transporting them into the virtual world so they can react more immediately to each other and the environment. Legato declined to comment on specifics, saying only, “The system keeps improving no matter what the next film, which happens to be The Lion King.” Legato creates his visualizations using Autodesk’s

TOP: Train station greenscreen set for Hugo in 2010. BOTTOM: Legato with light meter on the set of The Aviator in 2003.






“Even if it’s a visual effect, it still has this off-handed life to it, kind of like when dialogue works really well it feels like the actors are saying it for the first time. It’s fresh, it’s alive, it’s natural. The same holds true for shooting.” —Rob Legato, ASC TOP: Shutter Island lighthouse. LOWER LEFT: Legato posing with model train on the set of Hugo. LOWER RIGHT: Model of the Titanic. (Photo copyright © 1997 Twentieth Century Fox. All Rights Reserved.)


MotionBuilder. While he foresees Autodesk’s Maya software and Pixar’s RenderMan continuing to be the standard-bearers for rendering, he thinks the hacked videogame engines that have powered these on-set virtual worlds are due for a major overhaul. “That’s not really what [the game engines] were made for,” he says, more elaborate visions dancing in his head. Like any artist, Legato excelled at using the tools at hand to advance his craft. He credits the directors he’s worked with for backing his play. “They like this system because it’s directable,” he says of the enthusiastic reception his techniques have received from Cameron, Howard, Scorsese and others. “In terms of the shot planning, the middleman is removed. You’re doing it virtually, it’s real-time. It’s getting more and more handsome as computers get faster and game engines get better.” He estimates Favreau stuck with the visualized camera moves for 75%-80% of the film. For the 20%-25% that Favreau did want to change, he let Legato go all-in to make it great.

“That’s where something like this really shines – you can focus the extra effort.” Legato is not a fan of dividing effects work up among a dozen houses. “For The Jungle Book, there was MPC and Weta. Weta did the ape scenes, about 400 shots. And MPC did the other 1,200. “I was delighted Disney let us do it that way,” he says, “because some studios insist on breaking the work apart. They think they can save money and retain more control that way, but what it does is create chaos that you’re trying to manage through triage.” Things have certainly come a long way since he got his start out of college as a commercial producer. “I didn’t have any special interest in effects,” he admits. “I was just interested in moviemaking, but the whole thing about movies is it’s all illusory – you shoot one angle, then you cut to another angle, and it looks like they’re in the same room, but they may not be. It could have been shot at a completely different time. I’ve always loved editing.” Legato would go on to membership in both the Local 600

“I didn’t have any special interest in effects. I was just interested in moviemaking, but the whole thing about movies is it’s all illusory – you shoot one angle, then you cut to another angle, and it looks like they’re in the same room, but they may not be. It could have been shot at a completely different time. I’ve always loved editing.” —Rob Legato, ASC TOP: Because the model plane they were crashing on The Aviator was expensive, Legato knew he had one chance to shoot it live. LOWER LEFT: Another plane model in The Aviator. LOWER RIGHT: Shooting the stairwell slide in Hugo.






International Cinematographers Guild and Local 700, the Motion Picture Editors Guild, but not before establishing himself as a VFX whiz at Robert Abel & Associates, the legendary commercial house that made its name pioneering electronic effects. Legato was hired for his live-action expertise. “I worked on a stage in the back, but I’d go over to where the visual effects guys were playing, and I’d make suggestions, to the point where they said, ‘You should do it.’ Eventually, I did.” From there he moved into episodic television, first on MGM’s Twilight Zone series, then at Paramount, where he not only helmed visual effects but got to direct the occasional episode of Star Trek: The Next Generation and Star Trek: Deep Space Nine. Television effects were just moving from optical to electronic methods. “You could only do a few optical effects for a weekly series, because the process was slow. But if you did them digitally – or at the time, the analog/digital way – you could do quite a bit more. That was just coming into vogue, and I was one of the few people who bridged the gap.”

In 1993, he was recruited by Digital Domain, a visionary new feature-film visual effects house launched by Cameron, makeup effects wizard Stan Winston and Scott Ross, former head of Industrial Light & Magic. It was there that the light bulb went off while working with Ron Howard on Apollo 13. In 1999, Legato left to work for Sony Pictures Imageworks, before going freelance to work with Martin Scorsese on the 2004 Howard Hughes biopic The Aviator. He cites Hughes’s plane crash in Beverly Hills as another pivotal stage in his evolution. “I had them create a pan-and-tilt wheel in the computer, so once we had an animation I could operate the camera. That allowed it to be ‘live,’ so I could react off what I was seeing, like a real cameraman would do on a stage.” As with the Apollo rocket launch, it became one of Legato’s signature second-unit action scenes. “Between the editing and the camerawork, it became a very fluid way of telling a story. A filmmaker shoots dailies, trying to get proper coverage to give you the different options, then you put it together editorially and it makes it an exciting sequence. Because this was an expensive model we were crashing, I had one chance to shoot it live.” Since “action scenes come alive in the editing,” Legato said he didn’t want to be caught “discovering” the best way to compile the material after it had been shot, without the means to go back and try different options. “I had to practice. I practiced on the computer, and I got a pretty decent sequence that felt like it really moved. Then when I went to photograph, it had a life to it that felt like it was traditionally photographed.”

The brave new world he’s helped usher in is opening up possibilities on many fronts. “We’re building tools to bridge the digital and live-action worlds. That’s ultimately what Jim [Cameron] responded to when I showed him the virtual way. Now he can direct the movie. He can hand-hold the camera, shoot the shot he wants to do, edit it himself, literally in the next minute – you shoot it and then you go cut it together. He’s in a computer directing a film, as opposed to directing a bunch of artists trying to emulate the spontaneity of real life. So things have really changed.”
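The “pan-and-tilt wheel in the computer” amounts to mapping physical wheel rotations onto a virtual camera’s orientation while the pre-built animation plays back. Here is a minimal sketch of that mapping – the sensitivities, names and input values are assumptions for illustration, not the setup Legato actually used.

    class VirtualCamera:
        """Toy virtual camera driven by pan/tilt wheels, like a geared head."""

        def __init__(self, pan_sensitivity: float = 0.25, tilt_sensitivity: float = 0.25):
            self.pan = 0.0   # degrees of yaw, left/right
            self.tilt = 0.0  # degrees of pitch, up/down
            self.pan_sensitivity = pan_sensitivity
            self.tilt_sensitivity = tilt_sensitivity

        def update(self, pan_wheel_delta: float, tilt_wheel_delta: float) -> None:
            """Accumulate wheel rotations into camera angles each frame."""
            self.pan += pan_wheel_delta * self.pan_sensitivity
            self.tilt = max(-90.0, min(90.0,
                            self.tilt + tilt_wheel_delta * self.tilt_sensitivity))

    # Replaying the same animation take after take, the operator reacts
    # "live" to what the shot is doing, as on a physical camera.
    camera = VirtualCamera()
    for frame, (pan_d, tilt_d) in enumerate([(4.0, 0.0), (6.0, -2.0), (2.0, -1.0)]):
        camera.update(pan_d, tilt_d)
        print(f"frame {frame}: pan={camera.pan:.1f} tilt={camera.tilt:.1f}")

Because the animation is deterministic, each practice pass refines the operator’s performance rather than the render – which is how Legato could rehearse the crash until the single live take felt traditionally photographed.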


THE JUNGLE BOOK JOURNAL

TOP: Legato at work on The Jungle Book. (Photo credit: Glen Wilson. Copyright © 2014 Disney Enterprises, Inc. All Rights Reserved.) MIDDLE: For The Jungle Book, the idea was to have an infinite jungle set in downtown Los Angeles, for purposes of creative control and so the filmmakers didn’t have to endure the hardships of a remote location. (Photo copyright © 2014 Disney Enterprises, Inc. All Rights Reserved.) BOTTOM: Shooting took place on a giant bluescreen stage at Los Angeles Center Studios, where the parameters of the virtual sets were digitally mapped onto the stage. (Photo copyright © 2014 Disney Enterprises, Inc. All Rights Reserved.)

TOP LEFT: The need for lifelike talking animals was another complicating factor. Legato was charged with creating a workflow essentially identical to shooting a live-action movie. He worked with director Jon Favreau to template the shots, a process he calls “visualizing the movie.” (Photo copyright © 2014 Disney Enterprises, Inc. All Rights Reserved.)

ABOVE: Both director Favreau’s cameras and the actors had motion sensors attached, so their movements could be conformed in the virtual environment and displayed on computers in real time. (Photo credit: Glen Wilson. Copyright © 2014 Disney Enterprises, Inc. All Rights Reserved.)

TOP RIGHT: The animal performances were powered by actors digitally assigned a walk cycle, or through the manipulation of motion-sensitized objects. Legato and Director of Photography Bill Pope, ASC, could map moves by tracking sensor-equipped platforms – a crane, a dolly, a helicopter or a hand-held camera – capturing those moves in the virtual set. (Photo copyright © 2014 Disney Enterprises, Inc. All Rights Reserved.)





FILM

THE SPECIAL EFFECTS SEQUEL MACHINE By PAULA PARISI

TOP: Pirates of the Caribbean: Dead Men Tell No Tales (Photo credit: Film Frame. Copyright © 2017 Disney Enterprises, Inc. All Rights Reserved.) BOTTOM: Wonder Woman (Photo credit: Clay Enos. Copyright © 2015 Warner Bros. Entertainment. All Rights Reserved.)


After a disappointing summer in 2016, attributed largely to “sequel fatigue,” the studios realize that a number after the title does not automatically mean success. This year’s storylines seem to hold more promise, while heightened achievements of spectacle, whether through verisimilitude of long-ago lands or outrageous fantasy, have pumped up expectations. This summer, seven of the 11 big visual effects releases are sequels (or, in the case of Universal’s June 9 The Mummy, a reboot). Last summer, of a whopping 14 sequels, only three out-earned their predecessors. “This year’s sequels look like a lot more thought and care went into them, and they’re going to be a lot better, including the level of visual effects. I don’t care if a movie has an 8 behind the title; if it’s good, people will see it,” says Comscore Senior Media Analyst Paul Dergarabedian. The Fate of the Furious is a good example. Released April 13, Universal’s eighth installment in the Fast and Furious franchise, with effects by Digital Domain and Double Negative, has already passed the $200 million domestic box-office mark this year. The fact that the film has made $950 million at the international box office (for a global cume of $1.16 billion) points up another trend that bodes well for the visual effects business: increased emphasis on foreign markets. “Special effects are an international language, and films loaded with effects do well all over the world,” Dergarabedian says, adding, “As long as there are films making more than $1 billion, that’s fine. The studios are going to keep doing more of that. It keeps the machine humming.” “War for the Planet of the Apes is like 90% visual effects or something insane. It’s practically an animated movie,” Exhibitor Relations box-office analyst Jeff Bock points out. And there are movies like Beauty and the Beast, which would traditionally have been considered animated but are now solidly in the visual effects realm. Digital Domain is arguably as responsible as director Bill Condon for actor Dan Stevens’ moving performance as the Beast. “We used a direct drive system, where Dan performed sitting in an apparatus that looked like he was from outer space. It registered upwards of 3,500 dots on his face,” the film’s producer, David Hoberman, explains. “What you see onscreen are his real eye movements, his real expressions. It was extraordinary, to have the line so completely blurred between what’s real and what’s fantasy,” says Hoberman, who calls the synthesized performance the film’s most critical element. “Unless we got the Beast right, we had no movie,” Hoberman says. The fact that these spectacles play so well on 3D IMAX screens, which sell at premium prices, also helps boost their box-office haul. To watch from within Beauty and the Beast’s “Be Our Guest” musical set piece, amid dancing napkins, whirling forks and food, and the enchanted moves of Lumiere the candelabra and Cogsworth the clock, “it’s an experience you don’t easily forget,” Hoberman says. He adds that at this level of visual effects, “these are complicated films that take a huge team effort over a long period of time, but are

worth it.” Movie-going audiences the world over seem to agree. With a dazzling array of eye candy in store for everyone from comic book fans (Spider-Man: Homecoming, Wonder Woman) to the classic drama crowd (King Arthur, Dunkirk), summer 2017 has the deck stacked in favor of solid box-office results. The summer season – 18 weeks that start the first Friday in May and end Labor Day weekend (which this year falls on the first weekend of September) – has always had a disproportionate impact on the annual box-office haul, accounting for about 40% of the 52-week total. “I call it the breadbasket of the box office,” analyst Dergarabedian says, noting that an overabundance of tentpole fare for the warmer months has pushed big releases earlier in the season. Disney was out of the gate with Beauty and the Beast on March 15 and Guardians of the Galaxy Vol. 2 on May 5. The films racked up a collective $672 million in domestic gross ($1.6 billion worldwide) before the kids were even out of school. Dergarabedian is confident that summer 2017 will sail past the $4.4 billion mark achieved at the North American box office last year, though he thinks it will come up short of 2013’s $4.75 billion record. Exhibitor Relations’ Bock agrees that outperforming last year is a possibility, but worries that the absence of an obvious blockbuster in August could be the Achilles heel. “Two more big films in August and it would be a slam dunk,” Bock states. As things stand, he is puzzled as to why STX’s Luc Besson-directed space epic Valerian and the City of a Thousand Planets is positioned against Christopher Nolan’s WWII drama Dunkirk the weekend of July 21. “Christopher Nolan has a strong box-office track record no matter what the genre,” Bock says of the director, famous for fantasy epics (including the Dark Knight trilogy and Interstellar).

TOP LEFT: Alien: Covenant (Photo credit: Mark Rogers. Copyright © 2017 20th Century Fox. All Rights Reserved.) FROM TOP RIGHT TO BOTTOM: Transformers: The Last Knight (Photo credit: Paramount Pictures/Bay Films. Copyright © 2017 Paramount Pictures. All Rights Reserved.) Guardians of the Galaxy, Vol. 2 (Photo credit: Film Frame. Copyright © 2017 Marvel Studios. All Rights Reserved.) Valerian and the City of a Thousand Planets (Photo copyright © 2016 EuropaCorp, Valerian SAS – TF1 Films Production. All Rights Reserved.) The Mummy (Photo copyright © 2016 Universal Pictures. All Rights Reserved.) Spider-Man: Homecoming (Photo copyright © 2017 Sony Pictures/CTMG Columbia Pictures. All Rights Reserved.)



FILM

THE SPECIAL EFFECTS SEQUEL MACHINE By PAULA PARISI

TOP: Pirates of the Caribbean: Dead Men Tell No Tales (Photo credit: Film Frame. Copyright © 2017 Disney Enterprises, Inc. All Rights Reserved.) BOTTOM: Wonder Woman (Photo credit: Clay Enos. Copyright © 2015 Warner Bros. Entertainment. All Rights Reserved.)

30 • VFXVOICE.COM SUMMER 2017

After a disappointing summer in 2016, attributed largely to “sequel fatigue,” the studios realize that a number after the title does not automatically mean success. This year’s storylines seem to hold more promise, while heightened achievements of spectacle – whether through the verisimilitude of long-ago lands or outrageous fantasy – have pumped expectations. This summer, seven of the 11 big visual effects releases are sequels (or, in the case of Universal’s June 9 The Mummy, a reboot). Last summer, of a whopping 14 sequels, only three out-earned their predecessors. “This year’s sequels look like a lot more thought and care went into them, and they’re going to be a lot better, including the level of visual effects. I don’t care if a movie has an 8 behind the title; if it’s good, people will see it,” says Comscore Senior Media Analyst Paul Dergarabedian.

The Fate of the Furious is a good example. Released April 13, Universal’s eighth installment in the Fast and Furious franchise, with effects by Digital Domain and Double Negative, has already passed the $200 million domestic box-office mark this year. The fact that the film has made $950 million at the international box office (for a global cume of $1.16 billion) points up another trend that bodes well for the visual effects business: increased emphasis on foreign markets. “Special effects are an international language and films loaded with effects do well all over the world,” Dergarabedian says, adding, “As long as there are films making more than $1 billion, that’s fine. The studios are going to keep doing more of that. It keeps the machine humming.”

“War for the Planet of the Apes is like 90% visual effects or something insane. It’s practically an animated movie,” Exhibitor Relations box-office analyst Jeff Bock points out. And there are movies like Beauty and the Beast, which would traditionally have been considered animated but are now solidly in the visual effects realm. Digital Domain is arguably as responsible as director Bill Condon for actor Dan Stevens’ moving performance as the Beast. “We used a direct drive system, where Dan performed sitting in an apparatus that looked like he was from outer space. It registered upwards of 3,500 dots on his face,” the film’s producer, David Hoberman, explains. “What you see onscreen are his real eye movements, his real expressions. It was extraordinary, to have the line so completely blurred between what’s real and what’s fantasy,” says Hoberman, who calls the synthesized performance the film’s most critical element. “Unless we got the Beast right, we had no movie,” Hoberman says.

The fact that these spectacles play so well on 3D IMAX screens, which sell at premium prices, also helps boost their box-office haul. To watch from within Beauty and the Beast’s “Be Our Guest” musical set piece – amid dancing napkins, whirling forks and food, and the enchanted moves of Lumiere the candelabra and Cogsworth the clock – “it’s an experience you don’t easily forget,” Hoberman says. He adds that at this level of visual effects, “these are complicated films that take a huge team effort over a long period of time, but are worth it.” Movie-going audiences the world over seem to agree.

With a dazzling array of eye candy in store for everyone from comic book fans (Spider-Man: Homecoming, Wonder Woman) to the classic drama crowd (King Arthur, Dunkirk), summer 2017 has the deck stacked in favor of solid box-office results. The summer season – 18 weeks that start the first Friday in May and end Labor Day weekend (which this year falls on the first weekend of September) – has always had a disproportionate impact on the annual box-office haul, accounting for about 40% of the 52-week total. “I call it the breadbasket of the box office,” analyst Dergarabedian says, noting that an over-abundance of tent pole fare for the warmer months has pushed big releases earlier in the season. Disney was out of the gate with Beauty and the Beast on March 17 and Guardians of the Galaxy Vol. 2 on May 5. The films racked up a collective $672 million in domestic gross ($1.6 billion worldwide) before the kids were even out of school.

Dergarabedian is confident that summer 2017 will sail past the $4.4 billion mark achieved at the North American box office last year, though he thinks it will come up short of 2013’s $4.75 billion record. Exhibitor Relations’ Bock agrees that outperforming last year is a possibility, but worries that the absence of an obvious blockbuster in August could be the Achilles heel. “Two more big films in August and it would be a slam dunk,” Bock states. As things stand, he is puzzled as to why STX’s Luc Besson-directed space epic Valerian and the City of a Thousand Planets is positioned against Christopher Nolan’s World War II drama Dunkirk the weekend of July 21. “Christopher Nolan has a strong box-office track record no matter what the genre,” Bock says of the director, famous for genre epics including the Dark Knight trilogy and Interstellar.
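Hoberman’s description of the “direct drive” rig gestures at a widely used general idea: dense tracked facial markers driving a digital face rig. Purely as a hedged illustration of that idea – this is not Digital Domain’s actual solver, and every array shape, name and number below is an invented assumption – a least-squares marker-to-blendshape solve might look like this in Python:

    # Hypothetical sketch: solve blendshape weights from tracked facial markers.
    # A generic least-squares formulation, NOT Digital Domain's "direct drive" system.
    import numpy as np

    def solve_blendshape_weights(neutral, shapes, markers):
        """neutral: (M, 3) marker rest positions; shapes: (K, M, 3) marker
        positions for each of K blendshapes; markers: (M, 3) one captured frame.
        Returns (K,) weights that best explain the captured marker cloud."""
        deltas = (shapes - neutral).reshape(len(shapes), -1).T   # (3M, K) basis
        target = (markers - neutral).ravel()                     # (3M,) observed offsets
        weights, *_ = np.linalg.lstsq(deltas, target, rcond=None)
        return np.clip(weights, 0.0, 1.0)                        # keep weights plausible

    # Toy usage with random stand-in data: 3,500 markers, 50 invented shapes.
    rng = np.random.default_rng(0)
    neutral = rng.normal(size=(3500, 3))
    shapes = neutral + 0.01 * rng.normal(size=(50, 3500, 3))
    frame = neutral + 0.01 * rng.normal(size=(3500, 3))
    print(solve_blendshape_weights(neutral, shapes, frame).shape)  # -> (50,)

Production solvers add temporal smoothing, constraints and artist-editable controls on top, but the core idea – each frame of marker data reduces to a small weight vector a face rig can consume – is what makes “his real eye movements, his real expressions” drivable on a digital Beast.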


FILM

SUMMER VFX BLOCKBUSTERS By IAN FAILES

The traditional summer blockbuster period of movie releases is well underway, with the likes of Guardians of the Galaxy, Vol. 2, Alien: Covenant and Pirates of the Caribbean: Dead Men Tell No Tales already demonstrating their wares – and a multitude of visual effects shots – to the cinema-going public. Here is a breakdown of the other major effects-driven films this summer, and what to look out for from the visual effects teams involved.

TOP: The Mummy. Universal kickstarts its Mummy franchise again, after a run of earlier films that had been at the forefront of digital characters combined with make-up and creature effects, sand and effects simulations, and crowd work. In this new film, starring Tom Cruise, the producers have also made heavy use of in-camera stunts and practical work, including filming in the ‘Vomit Comet’ for an aircraft crash sequence. The crash looks to be a signature visual effects shot, as do a bus sequence, plenty of world-ending, mummy-like magical destruction scenes, and some close encounters with the Mummy herself. Principal VFX houses are MPC, Double Negative and Industrial Light & Magic. Director: Alex Kurtzman. Visual Effects Supervisor: Erik Nash. (Photo copyright © 2016 Universal Pictures. All Rights Reserved.)

BOTTOM: Transformers: The Last Knight. Since the first Transformers film in 2007, Industrial Light & Magic has continued to innovate in bringing to life giant metallic robots on the screen, both in terms of animating, transforming and rendering them amid the frenetic, almost chaotic, style of Michael Bay’s franchise. The Last Knight goes into the past and present this time around, reformatting our memories of the role of Transformers in history. It also goes into space. There is more over-the-top action, of course, while the director also plays with the use of IMAX 3D cameras for stereo. Among other studios, Industrial Light & Magic, MPC, Atomic Fiction and Scanline VFX are making the images happen. Director: Michael Bay. Visual Effects Supervisor: Scott Farrar. (Photo credit: Paramount Pictures/Bay Films. Copyright © 2017 Paramount Pictures. All Rights Reserved.)

TOP: Cars 3. Certainly, Cars 3 is unlike the other, live-action films listed here, being a fully animated feature, but the technology behind Pixar’s latest movie in the Cars franchise has an important place among the other VFX releases. Cars 3 fully adopted the studio’s new physically-based, path-tracing rendering architecture, known as RIS, inside its industry-standard renderer, RenderMan. The result is more adrenaline-pumping – and realistic – action on screen. Director: Brian Fee. (Photo copyright © 2016 Disney/Pixar. All Rights Reserved.) BOTTOM: Wonder Woman. Amazon princess Diana Prince’s (Gal Gadot) chance meeting with an American military pilot, Steve Trevor (Chris Pine), during World War I ensures that visual effects are necessary to depict period locations (London and elsewhere). There are also effects relating to Wonder Woman’s fighting abilities, her bracelets, powerful shield and magical lasso. All this ties into Justice League, coming out later this year. Studios such as MPC and Double Negative are leading the charge. Director: Patty Jenkins. Visual Effects Supervisors: Bill Westenhofer and Frazer Churchill. (Photo credit: Clay Enos. Copyright © 2016 Warner Bros. Entertainment. All Rights Reserved.)
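As an aside for readers curious what a physically-based path tracer of the RIS variety boils down to, here is a deliberately tiny sketch of the core Monte Carlo loop. It describes nothing of RenderMan’s real implementation, and the scene/camera interface (trace, emitted, sample_bsdf, background, ray) is invented for the example:

    # Toy path-tracing radiance estimator. Illustrative only; not RenderMan/RIS.
    # `scene` and `camera` are hypothetical objects providing the methods used below.
    import random

    MAX_BOUNCES = 5

    def radiance(scene, ray, depth=0):
        hit = scene.trace(ray)              # nearest surface hit, or None
        if hit is None:
            return scene.background(ray)    # environment light
        result = hit.emitted()              # light emitted at the hit point
        if depth >= MAX_BOUNCES:
            return result
        survive = 0.9                       # Russian roulette keeps paths unbiased
        if random.random() > survive:
            return result
        # Importance-sample the BSDF: a new direction plus throughput = f*cos/pdf.
        next_ray, throughput = hit.sample_bsdf()
        return result + throughput * radiance(scene, next_ray, depth + 1) / survive

    def render_pixel(scene, camera, x, y, spp=64):
        # Average many stochastic path samples; noise falls off as 1/sqrt(spp).
        return sum(radiance(scene, camera.ray(x, y)) for _ in range(spp)) / spp

Production renderers layer light sampling, denoising and enormous engineering on top, but that recursion – trace, gather emission, bounce stochastically, average – is the physics-first approach several supervisors discuss in the roundtable later in this issue.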




TOP: Spider-Man: Homecoming. A younger Spider-Man (Tom Holland) with a more gadget-filled suit joins the Marvel Cinematic Universe and takes on a new bad guy in Vulture, played by Michael Keaton. Oh, and Iron Man makes an appearance, too. The visual effects accomplishments in the previous webbed-avenger films are well known, and this looks to up the ante. Huge New York set pieces, a dramatic airplane fight, a splitting ferry sequence and significant blending between an on-set and digital swinging Spidey suit are key attractions. Plenty of VFX studios get involved, including Industrial Light & Magic, Sony Pictures Imageworks, Digital Domain and Luma Pictures. Director: Jon Watts. Visual Effects Supervisor: Janek Sirrs. (Photo credit: Chuck Zlotnick. Copyright © 2016 Sony Pictures Entertainment/CTMG Inc. All Rights Reserved.)


BOTTOM: War for the Planet of the Apes. Over the past two Apes films, Weta Digital has managed to deliver stunningly real performances – with War looking even more brutal in nature. As in the previous outings, the filmmakers have been able to rely on the digital characters without any fear of going for extreme close-ups or worrying about fur interaction. The apes even appear on horseback. Credit must also go to the motion-capture acting carried out by Andy Serkis (Caesar) and his contemporaries, along with a crew that has enabled wireless capture in some incredibly challenging locations. The result is an environment that lets actors simply act, and a workflow that brings the apes to the screen as living, breathing and emoting characters. Director: Matt Reeves. Visual Effects Supervisors: Joe Letteri, Dan Lemmon. (Photo copyright © 2016 20th Century Fox. All Rights Reserved.)

TOP: Valerian and the City of a Thousand Planets. Director Luc Besson has said of his 1997 sci-fi classic The Fifth Element that he was frustrated with the visual effects challenges at the time (both miniatures and relatively early digital approaches were used). Twenty years later, he has been able to rely on performance capture, vast bluescreen sets, and photoreal digital environments and characters to make his latest space adventure Valerian possible. Plenty of practical creature and set work are in there, too. The film, based on the comic books by Pierre Christin and Jean-Claude Mézières, and one that Besson has reportedly wanted to make since he was 10 years old, has several visual effects vendors. The bulk of the work is being handled by Industrial Light & Magic, Weta Digital and Rodeo FX. Director: Luc Besson. Visual Effects Supervisor: Scott Stokdyk. (Photo copyright © 2016 EuropaCorp. Valerian SAS – TF1 Films Production. All Rights Reserved.)

TOP: The Dark Tower. Based on Stephen King’s epic fantasy series, The Dark Tower is set in an alternative universe where gunslinger Roland Deschain (Idris Elba) searches for a way to save his own world. Expect an array of unusual dystopian imagery, creature effects and portal transportations, thanks to several studios, including MPC. Director: Nikolaj Arcel. Visual Effects Supervisor: Nicolas Aithadi. (Photo credit: Ilze Kitshoff. Copyright © 2016 Sony Pictures Entertainment/CTMG, Inc. All Rights Reserved.) BOTTOM: Dunkirk. Much is often made of director Christopher Nolan’s desire to shoot everything ‘for real’, but it is probably more correct to credit Nolan with using all the tools available to him. If full-scale boats or aircraft work for a shot, he’ll use them; if miniatures are more appropriate, they will be adopted; and if computer-generated imagery gives the best results, then that’s what will appear on screen. All three techniques appear to be part of the mix for Dunkirk, the story of the evacuation of hundreds of thousands of Allied troops from northern France during World War II. Footage so far includes ships at sea, ships being destroyed and planes making bombing runs. On-set photographs even show cardboard cut-outs standing in for hundreds of soldiers on the beach. Double Negative is handling the visual effects, made all the more challenging by the use of IMAX 65mm and 65mm large-format film stock for maximum impact. Director: Christopher Nolan. Visual Effects Supervisor: Andrew Jackson. (Photo copyright © 2016 Warner Bros. Entertainment Inc., Village Roadshow (BVI) Limited and RatPac-Dune Entertainment LLC. All Rights Reserved.)





INDUSTRY ROUNDTABLE

UNENDING RACE AGAINST TIME

VFX SUPERVISORS AT THE INTERSECTION OF TECHNOLOGY AND CHARACTER By MICHAEL GOLDMAN

TOP: Valerian and the City of a Thousand Planets (Photo copyright © 2016 EuropaCorp, Valerian SAS – TF1 Films Production. All Rights Reserved.)


If there is one thing notable about the orgy of tent pole, visual effects-driven motion pictures battling for eyeballs during the summer of 2017, it’s the fact that a glut of “big movies” with huge, impressive digital shot counts over a summer release season is no longer a big deal – it is expected. From the photoreal animals who serve as lead characters in War for the Planet of the Apes, to superheroes and villains doing battle in Spider-Man: Homecoming and Wonder Woman, to the spacescapes and creatures of Valerian and the City of a Thousand Planets, to murderous aliens in Alien: Covenant, to another giant robot battle in Transformers: The Last Knight, to the return of an ancient evil, back from the dead in yet another incarnation, in The Mummy – and a whole lot more – audiences can count on groundbreaking, photoreal and highly entertaining work this summer.

The industry’s top visual effects supervisors, of course, are responsible for making sure that is the outcome, and for them the pressure never ceases – each job has to end with the same, or greater, success than the last. The responsibilities have never been more complex, the consequences never greater; but at the same time, the creative possibilities for what they are able to execute, if their collaboration with filmmakers and facilities goes right, have never been more tantalizing.

VFX Voice recently caught up with a group of well-known visual effects supervisors involved with some of the major films originally slated to come out in the summer of 2017, and asked them to discuss the industry issues that please and vex them, sometimes simultaneously, as illustrated by their current or recent projects. Each supervisor highlighted different trends, themes, issues or developments – both technical and human in nature – that they are passionate about. Following is a summary of their most compelling comments:

Jerome Chen, Visual Effects Supervisor, Jumanji: Welcome to the Jungle

Once I sign on to supervise a studio tent pole summer release, I am racing against time. It begins when I open an encrypted script sent by a studio and ends about 16 months later. There are many stages in this race, requiring resources akin to a high-stakes competitive racing event – an endeavor that needs great experience, instinct, money and luck. Determining whether I ‘win’ is subjective. It’s clear that the work needs to look as good as everything else coming out this summer, irrespective of how much time or money has been given to me.

The script analysis process requires multiple reads. My first pass will be as a viewer, without much consideration for the VFX work. This helps me understand the story and creates context for the work. My next reads will skim for VFX-centric sentences like “the helicopter races down the canyon” or “a herd of rhinos attacks.” Long huddles in the war room with my VFX producer are required to sketch the shape of our plan to accomplish the work. Parameters are identified. Some are fixed – namely the VFX budget, the length of our post period and our delivery date. There is definitely a (recent) trend – the numbers indicating budget and post length are going in the wrong direction lately: down. But the number of shots is about the same, and image quality is always expected to be summer-worthy. We might as well bring out a dartboard and crystal ball to determine the two most important variables for our budget: how many VFX shots there are, and what assets need to be built.

I have my version of the movie in my head after reading the script a half-dozen times, and I try to mind-meld with the director (Jake Kasdan) to make sure I have something close to what he and the studio envision. A short-term deadline looms, the first one in our race, and it is multi-faceted: the filmmakers and studio want to see and bless our plan. We also need to form alliances with crucial partners – the VFX facilities that will be awarded the work. This will spell success or failure in the execution of our plan. I need to get in the lineup before the bandwidth of the best facilities fills up with other tent pole projects. This movie can only afford two anchor facilities. Typically, I like three to support my show, but we end up picking great partners in Montreal and Melbourne.

The studio blesses our plan, and we finally go to Hawaii to commence principal photography. Then, toward the end of the shoot, I read in the trades that my movie is being moved from a summer release to Christmas. My first reaction is relief. My race is always against time, and now I have more time to get it right. But then I think about it. That’s a lot more time. Like maybe another 16 weeks. We had a great plan to pull off the original goal, and now they have the nerve to give us more time? More time means more notes and iterations. But it turns out they do not really want us around for all that time. They extend by maybe a couple more weeks. The movie will sit on a virtual shelf for a few months before its release. (Editor’s Note: Jumanji: Welcome to the Jungle was originally scheduled for a summer 2017 release before recently moving to a Christmas debut.)
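Chen’s keyword skim for lines like “the helicopter races down the canyon” is, in effect, a first-pass breakdown. As a toy illustration only – the cue list and the sample script below are invented, and no supervisor budgets by pattern-matching alone – such a scan could be mechanized like this:

    # Toy first-pass VFX breakdown: flag script lines that hint at effects work.
    # The keyword list and sample text are invented for illustration.
    import re

    VFX_CUES = ["helicopter", "explosion", "herd", "creature", "collapses",
                "crash", "transforms", "stampede", "swarm", "destroys"]
    PATTERN = re.compile("|".join(VFX_CUES), re.IGNORECASE)

    def breakdown(script_text):
        flagged = []
        for lineno, line in enumerate(script_text.splitlines(), start=1):
            hits = {h.lower() for h in PATTERN.findall(line)}
            if hits:
                flagged.append((lineno, sorted(hits), line.strip()))
        return flagged

    script = """EXT. CANYON - DAY
    The helicopter races down the canyon.
    A herd of rhinos attacks."""
    for lineno, cues, text in breakdown(script):
        print(f"line {lineno}: {cues} -> {text}")

The real work, of course, is what happens next: weighing each flagged beat by asset builds, simulation complexity and shot count – the “dartboard and crystal ball” variables Chen mentions.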

“There is definitely a [recent] trend – the numbers indicating budget and post length are going in the wrong direction lately: down. But numbers of shots are about the same, and image quality is always expected to be summer-worthy.” —Jerome Chen, Visual Effects Supervisor, Jumanji: Welcome to the Jungle

“It is also critical to understand that one of the biggest reasons the bar continues to rise each year is because of the artists and craftspeople actually doing the work. They are so passionate and dedicated, and keep learning new things, up-skilling, and pushing the envelope of what is considered state-of-the-art visual effects.” —Dan Lemmon, Visual Effects Supervisor, War for the Planet of the Apes

“The dance between creative demands that push technology forward and advancements in technology, which allow creative visions to grow, is what keeps the industry moving forward.” —Charley Henley, Visual Effects Supervisor, Alien: Covenant





“Now that we’ve become accustomed to seeing high-end VFX everywhere, I believe it makes just as much or even more sense to start spending time evaluating the quality of the artistry, and how well visual effects are used to tell a better story.” —Bill Westenhofer, Visual Effects Supervisor, Wonder Woman


“There have been a number of hardware and software improvements over the last 20 years in VFX, but what is even more phenomenal is the growth in the availability of amazing VFX talent that has evolved in that same time period. The accumulated human knowledge and skills of thousands of people are what have really led to the spread of the sophisticated VFX you see everywhere on (major projects) today.” —Scott Stokdyk, Visual Effects Supervisor, Valerian and the City of a Thousand Planets

ACCEPT THE SCIENCE, UNLEASH THE ARTIST

Charley Henley, Visual Effects Supervisor, Alien: Covenant

We’ve always had the VFX challenge of re-creating a realistic, photographic look. From how materials respond to light, to the dynamics of hair in water, to the movement of cloth, advancements in technology and software and the increase in computing power have played a large hand in keeping the quality of VFX work ahead of audience expectations. We used to spend a lot of man-hours and creative skill balancing the CG world with the real one. It was the job of the artist to fake the photographic look. But advancements in lighting, such as ray tracing becoming workable at film scale, and improvements in simulating the dynamic physics of the natural world, from water flow to cloth simulations, mean the science in the software can now do much of this work. This allows companies to be more efficient, and artists and supervisors to concentrate on the creative potential of working in a CG world.

For some of this, artists had to unlearn old techniques and allow the science in the software to take over. Creatively, for some, this can feel like a loss of control – some old skills aren’t needed, like an old manufacturing trade giving way to machine manufacturing. However, this progress allows filmmakers and VFX artists to push the boundaries of how we can imagine images, to further the story and expand the communication language of film. The dance between creative demands that push technology forward and advancements in technology, which allow creative visions to grow, is what keeps the industry moving forward.

For example, when we were working with the lighting and compositing teams at MPC for The Jungle Book [2016], we had the challenge of matching the CG environment to the look of the footage, as well as making CG animals feel photoreal. We decided to invest fully in Pixar RenderMan’s latest ray-tracing capabilities, which were very computationally heavy but gave us the best chance of recreating natural-looking light. So we started by designing the scenes and the lighting setups quite creatively, manipulating the high dynamic range imagery (HDRI) to get a result we thought aesthetically right for the film – but it never looked real. Therefore, we finally gave in to the science of the software. Our creatures and environments were look-developed under a set of natural-light HDRIs, ranging from overcast to sunny. By using these very clean setups for our scenes, we played to the renderer’s programming with a delicate hand, and the results were stunning. In compositing for that kind of work, the trick was to not overwork the renders. (A toy illustration of this kind of HDRI-driven, image-based lighting appears at the end of this section.)

Looking to the future of VFX, I would imagine the crossover from game and virtual reality technology to film will play a part. More scenes may become fully CG as the cost of building CG sets becomes less than that of practical counterparts in many cases. Virtual production using game-engine technology may become the norm and, eventually, final renders will become achievable through the game engine, allowing real-time shooting of CG scenes. This could eventually put VFX production closer in time scale to a practical shoot.
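As promised above, here is a minimal image-based-lighting sketch in Python/NumPy: a Monte Carlo estimate of diffuse irradiance for one surface normal under an equirectangular HDRI. It is a generic illustration of letting the light data do the work – not MPC’s pipeline – and the lat-long mapping convention and constants are assumptions:

    # Toy image-based lighting: estimate diffuse irradiance for a surface normal
    # from an equirectangular (lat-long) HDRI. Generic sketch, not a studio pipeline.
    import numpy as np

    def irradiance(hdri, normal, samples=4096, rng=None):
        """hdri: (H, W, 3) array of linear radiance; normal: (3,) unit vector.
        Returns the (3,) RGB diffuse irradiance arriving at the surface."""
        if rng is None:
            rng = np.random.default_rng(0)
        h, w, _ = hdri.shape
        d = rng.normal(size=(samples, 3))            # uniform directions on the sphere
        d /= np.linalg.norm(d, axis=1, keepdims=True)
        cos_t = np.clip(d @ normal, 0.0, None)       # back-facing directions contribute 0
        # Map each direction to lat-long pixel coordinates (one common convention).
        u = ((np.arctan2(d[:, 0], -d[:, 2]) / (2 * np.pi) + 0.5) * (w - 1)).astype(int)
        v = ((np.arccos(np.clip(d[:, 1], -1.0, 1.0)) / np.pi) * (h - 1)).astype(int)
        # Monte Carlo: E = (4*pi/N) * sum(L(d) * max(0, n.d)) for uniform sphere samples.
        return (4.0 * np.pi / samples) * (hdri[v, u] * cos_t[:, None]).sum(axis=0)

    sky = np.ones((256, 512, 3))                       # stand-in uniform "overcast" HDRI
    print(irradiance(sky, np.array([0.0, 1.0, 0.0])))  # ~[pi, pi, pi]

Swap in a different HDRI – overcast versus sunny – and the lighting answer changes with no artist intervention, which is the look-development discipline Henley describes.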

APE-BUILDING EVOLVES IN LESS THAN A DECADE

TOP: War for the Planet of the Apes (Photo copyright © 2016 20th Century Fox. All Rights Reserved.) BOTTOM: Alien: Covenant (Photo copyright © 2017 20th Century Fox. All Rights Reserved.)


Dan Lemmon, Visual Effects Supervisor, War for the Planet of the Apes

War for the Planet of the Apes represents a big step forward for Weta Digital, not just technologically, but also in terms of the maturation of our craft. On the technical side, it’s the first film in the Apes franchise to fully embrace our Manuka ray-tracing pipeline. It is hard to overstate the significance of switching from a spherical harmonics lighting pipeline to a full ray tracer, especially for furry characters. Our lighting artists previously had to spend their energy wrangling pre-cached passes and dialing bias values and other parameters that only roughly approximated the way light really moves through an environment. Now, they are able to spend that energy making creative decisions on a larger number of shots that already look much more realistic out of the gate. Our shading and texturing artists are able to focus on adding more realistic makeup and environmental effects – cuts, blood, mud, tears, snow, ice – that are modeled more closely on the real physics of those natural phenomena, rather than crude approximations that break down under different lighting situations. We completely rewrote our hair-shading model between The Jungle Book and War for the Planet of the Apes to more accurately model the cuticle/medulla interface, and the result is more natural, with accurate glints and break-up in backlit setups.

War for the Planet of the Apes is also the first film to use our new ecosystem modeling and simulation tool, called Totara. Totara creates forests and jungles as simulations in which the seeds of different species are scattered across terrain and then compete for resources as they grow to maturity, reproduce, and eventually die. The result is pretty stunning in its realism and variation, and also in the render-efficiency of the environments it creates. Totara heavily leverages our instancing scheme, but at the branch level rather than the whole plant, which means you can still get lots of plant-to-plant variation while leveraging the efficiency of instancing. (A toy sketch of this scatter-compete-grow cycle follows at the end of this section.)

It is also critical to understand that one of the biggest reasons the bar continues to rise each year is because of the artists and craftspeople actually doing the work. They are so passionate and dedicated, and keep learning new things, up-skilling, and pushing the envelope of what is considered state-of-the-art visual effects. I could point to any number of sub-disciplines within VFX, but just to pick one near to my heart, it’s amazing to take a step back and look at how far character facial performance has come since we started making (Apes) films in 2010. On our first movie, Rise of the Planet of the Apes, we were coming off Avatar, where we had done a whole lot of facial animation, and it was still a struggle to make (lead character) Caesar perform in a way that carried the same emotional intensity and subtlety that we saw in (actor) Andy Serkis’s performance. Seven years later, we’ve learned a ton, and with each new character we build, we have greater confidence about what is required to make the characters perform the way we need them to perform. All this loving detail pays off as rich, emotive characters that live and breathe on screen – characters that connect with the audience in a way that carries a film and allows us to tell stories that might not have been possible even 10 years ago.
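As flagged above, here is a deliberately tiny, hypothetical sketch of the scatter/compete/grow/reproduce/die loop Lemmon describes. Every species, constant and rule below is invented for illustration; the actual Weta Digital tool is a far more sophisticated simulation:

    # Toy ecosystem simulation in the spirit of seed-scatter/compete/grow/die.
    # All parameters and rules are invented; this is not Weta Digital's Totara.
    import random

    class Plant:
        def __init__(self, species, x, y):
            self.species, self.x, self.y, self.size = species, x, y, 0.1

    SPECIES = {"fern":  dict(growth=0.15, max_size=1.0, spread=2.0),
               "kauri": dict(growth=0.05, max_size=8.0, spread=6.0)}

    def step(plants, terrain=50.0, cap=400):
        survivors = []
        for p in plants:
            t = SPECIES[p.species]
            # Competition for light: a larger neighbour close by may shade p out.
            shaded = any(q is not p and q.size > p.size and
                         abs(q.x - p.x) + abs(q.y - p.y) < q.size for q in plants)
            if shaded and random.random() < 0.3:
                continue                              # shaded out: dies this step
            if p.size >= t["max_size"] and random.random() < 0.05:
                continue                              # dies of old age
            p.size = min(p.size + t["growth"], t["max_size"])
            survivors.append(p)
            # Mature plants scatter seed nearby (capped so the toy stays small).
            if p.size > 0.5 * t["max_size"] and len(plants) < cap and random.random() < 0.2:
                nx = min(max(p.x + random.uniform(-t["spread"], t["spread"]), 0.0), terrain)
                ny = min(max(p.y + random.uniform(-t["spread"], t["spread"]), 0.0), terrain)
                survivors.append(Plant(p.species, nx, ny))
        return survivors

    forest = [Plant(random.choice(list(SPECIES)), random.uniform(0, 50), random.uniform(0, 50))
              for _ in range(40)]
    for _ in range(60):
        forest = step(forest)
    print(len(forest), "plants after 60 growth cycles")

The point to notice is that nobody places individual plants: layout emerges from the competition rules, which – together with branch-level instancing at render time – is what lets the approach scale to film-sized forests.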

TOP: Valerian and the City of a Thousand Planets (Photo copyright © 2016 EuropaCorp, Valerian SAS – TF1 Films Production. All Rights Reserved.) BOTTOM: Themyscira, home of the Amazons, in Wonder Woman. (Photo copyright © 2016 Warner Bros. Entertainment. All Rights Reserved.)




VFX MATURE WHEN VFX PEOPLE MATURE

TOP: War for the Planet of the Apes (Photo copyright © 2016 20th Century Fox. All Rights Reserved.) BOTTOM: Alien: Covenant (Photo copyright © 2017 20th Century Fox. All Rights Reserved.)


Scott Stokdyk, Visual Effects Supervisor, Valerian and the City of a Thousand Planets

There have been a number of hardware and software improvements over the last 20 years in VFX, but what is even more phenomenal is the growth in the availability of amazing VFX talent over that same time period. The accumulated human knowledge and skills of thousands of people are what have really led to the spread of the sophisticated VFX you see everywhere on (major projects) today. As the effects industry transitioned in the 1990s from a primarily practical/optical industry into a largely digital one, a lot of people in VFX at the time were young and just starting their careers. By 2017, that talent has developed, matured and honed best practices. Many hard-learned lessons have led to a collective rise in the quality of VFX. We see less inventing from scratch now – not because it isn’t do-able when necessary, but because it isn’t as necessary anymore. Instead, we see more tweaking and reworking of existing methods. Indeed, it is exceptionally hard now to find something that does not have its roots in a previous VFX idea or method.

Almost 30 of the other digital artists who I worked with on [1997’s] The Fifth Element [directed by Luc Besson, who also helms Valerian] went on to work as visual effects supervisors in some capacity. Of course, that doesn’t take into account all the other VFX talent that makes the work happen besides the artists – producers, software engineers, and so on. This group spends thousands of hours metaphorically beating their heads against the wall until we all find a way to break through on different projects.

I think the result of this accumulated VFX intellectual capital is that contemporary VFX supervisors now spend less time hammering CG to look “real,” and focus more on higher-level issues like creative designs and concepts. It is not enough to deliver a believable emotional performance from a CG character anymore. Now, you have to do it in a fresh-looking, creative way. Maybe that involves combining interesting effects into character design, or injecting a different reference into the animation performance, or any of an endless list of possibilities. When VFX pioneers like John Dykstra combined computers and cameras to create motion-control systems, it took a higher-level view of the creative aspect of VFX work, and of what was needed to accomplish it. But it also required an amazing movie framework like Star Wars [1977] to really make a big impact. Today, there are a lot of clever VFX people from many backgrounds who bring their own skills to this mix. It is just a matter of being presented with the right challenges in the right movie to produce the next interesting VFX combination.

IT’S ABOUT CHARACTER DEVELOPMENT, NOT TECHNIQUES

Bill Westenhofer, Visual Effects Supervisor, Wonder Woman

Visual effects have indeed become ubiquitous in entertainment of all forms lately. Not only are there more summer blockbusters featuring huge numbers of effects shots, but now just about everything you see, including dramas and TV series, employs extensive VFX work. Perhaps, therefore, we’ve arrived at a threshold where the way we discuss visual effects needs to change. For years, techniques were so new and their use was novel enough that the spectacle alone could generate audience excitement. It was fitting to devote articles to the next ‘big thing’, detailing the latest advancement in a new process or technique. But now that we’ve become accustomed to seeing high-end VFX everywhere, I believe it makes just as much or even more sense to start spending time evaluating the quality of the artistry, and how well visual effects are used to tell a better story. To be sure, there will continue to be advancements and new milestones, but now that we’ve come so far and the tools and hardware are available to so many people, I’d love to see us talk about who made the coolest imagery, even if it relies on tried-and-true techniques.

I recently finished supervising the VFX work on Wonder Woman (directed by Patty Jenkins). The work includes much of what is expected in a super hero(ine) movie, including digital environments, digital doubles and destruction. The level of detail and fidelity of the digital doubles and face replacements created by Double Negative (DNeg) and Moving Picture Company (MPC) allowed us to do some really fun things action-wise, and even allowed Gal Gadot [as the title character] to be a part of otherwise dangerous shots through the use of digital doubles. This includes shots that wouldn’t have been possible a few years ago, but the use of digital doubles as a process in and of itself is not new.

One thing that sets Wonder Woman apart from other films in the genre is the compelling character drama between Diana (Gal Gadot) and Steve Trevor (Chris Pine). Patty Jenkins crafted a journey for them that stands on its own, and I’m proud to say our visual effects work plays a supporting role in creating the world in which that happens. Hopefully, if we’ve done our work well enough, we’ll have helped make Diana a badass Amazonian warrior, people will find Themyscira – home of the Amazons, where Diana comes from originally – a beautiful place, and the final battle will have served as a fitting action setting for the culmination of her story. But Patty strived for a balance where the visuals of the film never overshadowed character development, but rather complemented it. Of course, I could talk about how we specifically employed motion capture, how much of the set was digital at the end of the day, and perhaps surprise people regarding which shots actually feature face replacements, but I’d be just as satisfied if audiences simply get lost in the story and appreciate the work as part of the greater whole.

TOP: Valerian and the City of a Thousand Planets (Photo copyright © 2016 EuropaCorp, Valerian SAS – TF1 Films Production. All Rights Reserved.)




COVER

GAME OF THRONES: VFX ROYALTY
By IAN FAILES

The HBO television series Game of Thrones is nothing short of a visual effects phenomenon. Not only has the show dominated the landscape in recent VFX Emmy and VES Awards history, it has also changed the game in how visual effects are produced for TV. Each episode in the series’ six seasons so far – the seventh begins in July – is laden with effects challenges, from detailed city environments, to major battles and, of course, complex flying and fire-breathing dragons. The results are sequences and shots with a cinematic feel that might normally be the domain of blockbuster films. The show has proved dominant in the television visual effects awards space, winning five of the last Primetime Emmys for visual effects, the main episodic VES Award for the past six years, and an additional four VES Awards in different categories at this year’s ceremony alone. So, what is the secret behind Game of Thrones’ success? It is due primarily to the show’s visual effects team, Visual Effects Supervisor Joe Bauer and Visual Effects Producer Steve Kullback, who shepherd a global team of effects artists across many time zones. They give us a small sneak peek at what to expect from the upcoming season.

AN EVOLUTION OF EFFECTS

All photos courtesy of HBO

TOP: VFX Producer Steve Kullback: “The biggest challenge on the show is how many steps of production are going on at once, where we’re setting up for the battle that hasn’t been shot yet while we’re shooting another battle, while we’re starting post on the other third. It’s been quite a wild ride.”


Ever since producers David Benioff and D. B. Weiss first adapted the A Song of Ice and Fire novels by George R. R. Martin to create Game of Thrones in 2011 for HBO, visual effects have been, and continue to be, a crucial part of the storytelling process. Bauer, who came on in season three (2013), says the volume of VFX shots has tripled since his first season. Another evolution has been technological, notes Bauer. “Each season we bring in new technology to facilitate not just the cooler shots, but getting the shots done in the time frame. That makes shooting photographic elements even more important, as opposed to things that you would otherwise do in CG.”

An example of this change in the way the visual effects have been approached is dragon fire. It is an effect that would most commonly be achieved with digital simulations, since it makes for easier control and light interaction. But, says Bauer, “the time to get it dialed in properly exceeds the time we have,” so in more recent seasons the VFX team has been utilizing, as much as possible, real flamethrower elements shot practically. And even that technology has had to evolve to suit more action, the fast-paced nature of the production, and growing dragons, which are reportedly the size of 747 airplanes this time around. “In season five we shot the dragon fire on a motion-control Technocrane,” relates Bauer. “Then season six we were on a Titan Technocrane and, for season seven, because the dragons had doubled in size again, we enlisted Spidercam and convinced them to put our flamethrower on their rig. So we were flying the fire source all over the stage, and, in addition, flying the camera all over the stage, too.”

VFX Producer Steve Kullback, who has been with the show since season two, adds that the evolution in the approach to VFX has been integrated at the planning stage. “When I started, the producers had not really used concept artists before,” he says. “And they were a little hesitant at first because they were concerned that the vision could become someone else’s vision and less their own, until they gained the confidence that we were on the same page and working exclusively in the service of the story. And then we started working with previs, and they were a little hesitant about that until they came to understand that it could be a tool that worked in their favor and they got to see exactly what they were asking for.”

“Each season we bring in new technology to facilitate not just the cooler shots, but getting the shots done in the time frame. That makes shooting photographic elements even more important, as opposed to things that you would otherwise do in CG.”
—Joe Bauer, Visual Effects Supervisor

TOP: Over the series, several companies have delved into CG dragon work, including Rhythm & Hues, Pixomondo and BlueBolt. Says VFX Supervisor Joe Bauer on the upcoming season: “I don’t know if I’ll ever again get to work on a show that spends as much time just shooting visual effects as this series this year.”
BOTTOM: Scale tricks and intricate compositing have helped bring the giant Wun Wun to life in a number of episodes.




TOP LEFT: Drogon enters the arena to save Daenerys in “Dance of the Dragons.” TOP RIGHT: Each season, the dragons grow larger, often requiring completely new models to be built. BOTTOM LEFT: On set, the dragons are represented in a number of ways – from cardboard cut-outs on sticks, to green “stuffies,” or poles for eye lines. BOTTOM RIGHT: Tyrion Lannister (Peter Dinklage) sizes up one of the dragons.

“There were times during season seven when we had four and five units working simultaneously. We had two motion-control units, and two main units working, and it was not uncommon to have a splinter unit or two. Sometimes we would be in multiple countries at once!”
—Steve Kullback

BIRTH OF THE BATTLE OF THE BASTARDS

Perhaps the clearest example of the scale of visual effects that Game of Thrones has become known for appeared in the penultimate episode of season six, “Battle of the Bastards.” The pivotal sequence in that episode sees Jon Snow confronting Lord Ramsay Bolton’s army in a brutal fight, where practical photography, CG horses and crowds were brought together by visual effects studio Iloura. To make the Jon Snow battle a reality, Bauer and his team led the show’s producers through storyboards and then previs of the sequence. “Our method for doing that,” says Bauer, “was plugging in either LIDARs (light detection and ranging) or set builds that reflected what our real shooting situation was right at the very beginning.” “It’s funny,” adds Bauer. “Our editor has taken the cut sequence, the finished sequence from eight months later, and put it next to the very first previs, and it’s shocking how closely they follow.” The main unit filmed the battle over 19 days, with main players and extras filmed on private land in Northern Ireland. Iloura then took the reins on generating soldiers, riders on horseback, arrows, blood and gore – including for a monumental ‘oner’ shot which had Jon Snow wreaking havoc on his attackers. “Approaching something like ‘Battle of the Bastards’ was really all about defining what the pieces are and where the handoffs are between the departments,” comments Bauer. “In VFX, we could be placing tracking markers on the set or running a giant stuffie head on a pole up and down the field. And then later on we are quite substantially in it to finish the shots.”

BIG VFX FOR TV

The mammoth size of the production on Game of Thrones is well known, with several units shooting simultaneously at different locations. The visual effects production schedule is equally intense. “We have about seven or eight weeks of prep before we start shooting,” explains Kullback. “One of the things we have found as the seasons have progressed is that there has not been adequate time to get out in front of the sequences and prepare them for concepts and previs and design as much as we’d like to, and have the luxury of Joe being on set supervising it. So, progressively, Joe has spent more time focusing on the design and execution of those sequences as they’re coming about, and we have additional supervisors who are on set with us – people like Eric Carney from The Third Floor, and then additional VFX Supervisors Ted Rae and Stefen Fangmeier. “There were times during season seven,” adds Kullback, “when we had four and five units working simultaneously. We had two motion-control units, and two main units working, and it was not uncommon to have a splinter unit or two. Sometimes we would be in multiple countries at once!” Interestingly, while certainly produced as a television show, the series is consumed in many different ways – on 4K TVs, computer screens, tablets, smartphones and occasionally on cinema screens. “We treat it like a film on every level at every stage,” states Kullback. “It does play on the smaller screen until it doesn’t. We have had occasions almost every season where there is some kind of theatrical projection, and at first we were very concerned about that because we produce it using a 50-inch monitor in front of us. But I think it was in season four or five where they actually did an IMAX theatrical projection.” “It held up pretty well,” says Bauer, sounding relieved. From the sound of things, season seven’s visual effects should hold up in a big way, too.

BOTTOM: The Battle of Meereen saw fire, water and CG animation combine in full glory.




5 OF THE BIGGEST GAME OF THRONES VFX MOMENTS

1. TAKING THE FIGHT TO THE WIGHTS Hodor (Kristian Nairn) is attacked by the Wights.

2. HARDHOME HYSTERIA The Night King observes the battle at Hardhome. Prosthetic creature design by Barrie Gower. Prosthetics sculpted by Tristan Versluis. Makeup applied by Victoria Bancroft Perry, Emma Faulkes and Paula Eden. Visual effects by El Ranchito (environment) and Image Engine (Night King ice treatment).

3. DROGON TAKES CHARGE Drogon wreaks havoc in the stadium.


Epic storytelling and epic visual effects are now the trademark of Game of Thrones. Here’s a look back at some of the big VFX moments from recent seasons of the TV show, and the visual effects studios behind them.

1. TAKING THE FIGHT TO THE WIGHTS In the final episode of season four, “The Children,” Bran and his companions suddenly encounter a horde of skeletal Wights (reanimated corpses). They fight in hand-to-hand and sword-and-axe combat. This was orchestrated on set with carefully choreographed stunt performers wearing prosthetic make-up and green-screen-covered suits, the idea being that visual effects techniques would be used to remove portions of the performers and achieve the skeleton look. Scanline VFX handled these shots, beginning with cyberscans of the performers in their wight get-ups. The studio modeled CG replicas and roto-mated these with the live action to produce the bony and wasting-away limbs. A final touch involved digital snow and some ground and snow simulation for when the Wights first emerge from the ground.

2. HARDHOME HYSTERIA Battles seem to be a mainstay of Game of Thrones, and the one seen in season five’s “Hardhome” episode was one of the most elaborate. Here, a band of men from the Night’s Watch join the Wildlings as they are attacked by Wights and White Walkers. As is often the case on the show, plate photography of prosthetic performers was augmented with digital hybrid characters, CG environments and green-screen scale photography to enable the inclusion of the giant Wun Wun (a sketch of the arithmetic behind such scale photography follows this list). This work was completed by El Ranchito, taking on shots previs’d by The Third Floor, and then filmed on an Iceland beach. Meanwhile, Image Engine made slight alterations to photography of the Night King. He was portrayed by an actor in prosthetic make-up, and Image Engine turned parts of the King’s skin to ice.

3. DROGON TAKES CHARGE In “The Dance of Dragons,” also in season five, Rhythm & Hues had a hand in the dragon Drogon’s rescue of Queen Daenerys from the Meereen stadium. Using a dragon model previously made by Pixomondo, artists at Rhythm & Hues had the dragon swoop into the stadium and deliver waves of fiery attacks. Digital crowds and stadium extensions were also part of the mix. The fire breathing was also realized with the aid of a practical flamethrower rig fixed on a motion-controlled Technocrane filmed on location in Spain. The motion-control fire elements had the benefit, of course, of making the flames more realistic and linking them to the environment. The fire was also augmented by Rhythm & Hues to suit the motion required by an aggressive Drogon.

4. BATTLE OF WINTERFELL Jon Snow’s defeat of Lord Ramsay Bolton’s army at Winterfell, a key moment in season six’s “Battle of the Bastards” episode, is not without significant carnage. That came courtesy of the artists at Iloura, who created armies of 3,000 soldiers and, most memorably, close-up horse-and-rider collisions to demonstrate battle at its most visceral. An array of weapons, armor, flags and even body parts were also visual effects creations, as were added blood, mud, smoke, fire and mist. Perhaps the most stunning component of Iloura’s work was its digital horses, which involved a deep study of horse video reference – from activities such as steeplechases, jousting, racing and, unfortunately, accidents – to replicate their behaviors in the battle. A combination of motion-capture, key-frame and some crowd-simulation approaches to animation made the horses and their riders possible.

5. MAKING MEEREEN In the same “Battle of the Bastards” episode, Queen Daenerys’ dragons help take out a fleet of ships attacking the city of Meereen. This involved shared visual effects by Rodeo FX, which completed the complex city environments, while Rhythm & Hues delivered the dragons. The Third Floor also contributed previs to the Battle of Meereen, as it had also done for the Winterfell scenes. What made the city shots stand out, in particular, was a major effort on Rodeo’s part to build differentiated areas of Meereen – essentially neighborhoods with distinctive architecture, props and other elements. That made the city feel ‘lived in’ and ‘busy’ as it comes under siege.
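The article doesn’t spell out the numbers behind the green-screen scale photography used for Wun Wun, but the underlying arithmetic is standard and easy to sketch. Below is a minimal, hypothetical Python illustration – not production code from the show: to make a normal-sized performer read as a giant of a given scale factor, the camera is brought proportionally closer and lower, and the action is typically overcranked by roughly the square root of the scale factor so that motion carries giant-scale weight. The default values are invented examples.

import math

def scale_photography(scale, base_fps=24.0, eye_height_m=1.7, distance_m=6.0):
    # Back-of-envelope settings for shooting a performer who must
    # composite in as a giant 'scale' times normal size.
    return {
        # A nearer, lower camera mimics looking up at a giant.
        "camera_height_m": eye_height_m / scale,
        "camera_distance_m": distance_m / scale,
        # Classic scale-photography rule of thumb: overcrank by sqrt(scale)
        # so falls, dust and cloth read with giant-scale inertia.
        "shoot_fps": base_fps * math.sqrt(scale),
    }

# A performer doubling for a giant roughly twice human height:
print(scale_photography(scale=2.0))
# -> camera at half height and distance, shot at about 34 fps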

4. BATTLE OF WINTERFELL Jon Snow readies for battle against Ramsay Bolton’s army.

5. MAKING MEEREEN Daenerys rides above Meereen while it’s under attack.

All photos courtesy of HBO

Using a dragon model previously made by Pixomondo, artists at Rhythm & Hues had the dragon swoop into the stadium and deliver waves of fiery attacks. Digital crowds and stadium extensions were also part of the mix. The fire breathing was realized also with the aid of a practical flamethrower rig fixed on a motion-controlled Technocrane filmed on location in Spain. The motion-control fire elements had the benefit of making the flames more realistic and linking them to the environment.



TV

A GOLDEN AGE OF EFFECTS ON TV

Grodd the gorilla, a CG creation by Encore VFX for The Flash. The creature regularly battles Barry Allen (Grant Gustin). (Photo copyright © 2016 The CW Television Network/Warner Bros. Television. All Rights Reserved.)

Digital Domain’s work on Black Sails includes computer-generated boats and stormy seas, mixed with practical elements and some digital doubles. (Photo copyright © 2016 Starz Entertainment. All Rights Reserved.)

Dan Stevens as David and Rachel Keller as Syd in Legion. The show is connected to the Marvel X-Men series. (Photo copyright © 2016 20th Century Television/ Marvel Television. All Rights Reserved.)


Game of Thrones might be leading the pack in TV visual effects, but there are an increasing number of episodic series where VFX are also front and center. With several highly anticipated shows launching or returning this year, some current visual effects supervisors working in episodics reflect on how the industry has changed in effects for TV, how they’ve handled the high workload, and what some of their toughest shots have been.

SUPERHEROES LEADING THE CHARGE

Amidst the many fantasy (Game of Thrones, Black Sails, Vikings), sci-fi (Westworld, The Expanse, Doctor Who) and zombie (The Walking Dead) TV genres – all of which feature multitudes of visual effects work – one genre has capitalized on the use of VFX to help tell grander stories: superhero and comic titles. Consider, for example, the many CG character enhancements and powers present in Marvel’s Agents of S.H.I.E.L.D. and Legion, or the detailed super-humans appearing in the DC Comics TV series Supergirl and The Flash. And get ready for more TV superhero action with Marvel’s The Defenders, The Punisher and Inhumans, while DC has shows like Legends of Tomorrow and Krypton. Perhaps just like their movie counterparts, these superhero and comic book shows have been leaning more and more on VFX, delivering deeper environments and more complex characters than ever before. “For Grodd in The Flash,” outlines Encore Visual Effects Supervisor Armen Kevorkian, “marrying a CG gorilla with live-action plates definitely requires a substantial creative and technical effort. Fortunately, the actors we work with do a fantastic job performing against nothing, and that’s half the battle. We make sure the quality of our renders and models are on par to meld seamlessly with the live-action plates.”

TV TOOLS OF THE TRADE

Indeed, television shows may be limited in time and budget, but not in imagination, when it comes to visual effects. For example, for the final episode of the first season of FX’s Legion – a show that also leans heavily on practical effects – FuseFX was called upon to create several shots featuring a laser force field. “We needed to develop a progressively building laser force-field effect and have it develop over time within the edit,” says FuseFX Visual Effects Supervisor Michael Adkisson. “This was especially challenging due to the tight post schedule for the show. Using a universal template for the effect allowed us to create initial versions of each shot, review the shots in context of the edit, and then adjust the effect levels accordingly.” Similarly, Digital Domain had some challenging water-simulation shots for season three of Starz’s Black Sails. “From early on, I wanted to be able to art direct our water sims, something anyone working with simulations knows is neither particularly common nor easy,” notes Visual Effects Supervisor Aladino Debert. “In the end we were able to import basic sims into Maya from Houdini, tweak things within Maya, animate ships and cameras, and then export those settings back to Houdini.”

SPECIAL (EFFECTS) DELIVERY

The common theme here is that TV budgets and schedules tend to be restricted. Kevorkian says, “Sometimes you’re faced with schedules when episodes air 15 business days after the shoot wraps.” It’s a stunning achievement that the shots even get done. Most of the supervisors interviewed credited their artists and pipelines for being able to deliver shots on time. But they’re also conscious of the increased ‘savviness’ of producers, and viewers, in knowing what visual effects do and don’t work. Says Debert: “While 10 or 15 years ago you could get away with sub-par work under the excuse of ‘it’s a TV show’, that is, for the most part, not possible anymore. As an artist and consumer of such shows, I welcome that development, and by the same token, as a VFX supervisor I constantly try to figure out ways to improve the quality of the work without breaking the bank.” The supervisors also note that getting involved early, working on concepts and previs, and being part of the production have helped make complex TV VFX easier to navigate. Some supervisors, like The Flash’s Kevorkian, have even had the chance to direct episodes and therefore have a well-rounded visual effects perspective going in. “I am mindful, however,” he says, “to sort of take off my VFX hat and focus on telling a good story and letting the performance stand out. Action and VFX are great but those are secondary to good storytelling. In terms of advice for directing, I’d say it’s important to be passionate. It requires fitting a lot of pieces of a puzzle together and is more work than a lot of people realize.”
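Debert doesn’t detail the mechanics of that Houdini–Maya round trip, but pipelines like the one he describes are often built by serializing sim parameters and animation into a neutral format that both packages can read. The Python sketch below is a hypothetical illustration of that idea only – the file name and parameter names are invented, not Digital Domain’s actual pipeline.

import json

# Hypothetical ocean-sim settings that travel between Houdini and Maya.
settings = {
    "wave_height_m": 4.5,
    "wind_speed_mps": 18.0,
    "chop": 0.7,
    # Ship animation authored in Maya, read back by Houdini: [frame, x, y, z]
    "ship_animation": [
        [1001, 0.0, 0.0, 0.0],
        [1050, 12.0, -1.2, 3.5],
    ],
}

def export_settings(path, data):
    # Write settings where the other package's scene can pick them up.
    with open(path, "w") as f:
        json.dump(data, f, indent=2)

def import_settings(path):
    with open(path) as f:
        return json.load(f)

export_settings("blacksails_sim_v002.json", settings)
assert import_settings("blacksails_sim_v002.json") == settings

The point of such a round trip is artistic control: the heavy simulation stays in Houdini, while layout-friendly tweaks happen in Maya and flow back as data rather than baked geometry.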

American Gods features effects ranging from blood-filled field battles to hallucinatory God appearances, and visions of the afterlife. (Photo copyright © 2016 Starz Entertainment/Fremantle Media North America. All Rights Reserved.)

The tortured neighborhood kids are back for the second season of Stranger Things on Netflix. (Photo copyright © 2016 Netflix. All Rights Reserved.)

“For Grodd in The Flash, marrying a CG gorilla with live-action plates definitely requires a substantial creative and technical effort. Fortunately, the actors we work with do a fantastic job performing against nothing, and that’s half the battle. We make sure the quality of our renders and models are on par to meld seamlessly with the live-action plates.” —Armen Kevorkian, Visual Effects Supervisor, Encore




COMING TO THE SMALL SCREEN NEAR YOU

While things will heat up in visual effects in July for the new season of Game of Thrones, audiences have already been witness to the surreal, mythological and often brutal visual effects in Starz’s American Gods. Then there is season two of Netflix’s Stranger Things coming at Halloween. The first season dazzled audiences with nightmarish creatures and paranormal happenings using mostly subtle VFX work. A collection of startling effects – practical and digital – should also permeate season seven of American Horror Story soon.

And while it might be delayed, that doesn’t make CBS’s Star Trek: Discovery any less anticipated. The franchise has a rich history on television – a place where it pioneered the use of miniatures and model photography, and then digital effects, to explore the final frontiers of space. Of course, there is a mountain of work in just about every TV show of the invisible effects variety. With that in mind, and with other ambitious and creative shots and sequences being designed around the clock, it feels like there is no sign of visual effects in television slowing down anytime soon.

ABOVE and OPPOSITE PAGE: Breakdowns of Iloura’s visual effects work for the “Battle of the Bastards” episode.



PREVIS

BEHIND THE THRONE WITH THE THIRD FLOOR
Compiled by ED OCHS

The Third Floor provided visualization and technical planning/virtual production for Game of Thrones Seasons 4, 5 and 6, and the company is at it again for the upcoming Season 7. In 2016, The Third Floor won its third Creative Arts Emmy® Award for its work as part of the Game of Thrones team on Season 6: Episode 9, “Battle of the Bastards.” Previs Supervisor Michelle Blok, Visual Effects Plate Supervisor Eric Carney and their Third Floor visioneers work directly with the production in Ireland. Here, Michelle Blok takes VFX Voice readers on an exclusive tour through previs’d scenes from the dynamic visualization process for Season 6 of Game of Thrones.


ARCTIC WASTELAND VISION:

“The Arctic Wasteland Vision sequence was shot at Magheramorne quarry using the same rocks that had been redressed from another vision shoot. The sequence involved Bran walking through a crowd of thousands of Wights as he approaches the mounted White Walkers. The biggest task for the creative and technical visualization artists on The Third Floor’s team was helping production figure out the shooting methodology. Some shots required multiple tiling passes to create the thousands of characters. Multiple passes were also needed of the White Walkers on their horses against green screens, ensuring there was a full unobscured pass for each rider so VFX could add the decomposing effects to each rider.”
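The tiling-pass bookkeeping Blok describes – repeating a finite group of extras across multiple plates until a shot reads as thousands – comes down to simple arithmetic. The following Python sketch is a hypothetical planning helper, not The Third Floor’s actual tooling; the waste allowance is an invented example.

import math

def plan_tiling_passes(crowd_target, extras_available, waste=0.1):
    # How many plate passes are needed to tile a small group of extras
    # into a crowd of crowd_target, padding for seams and overlaps.
    effective_per_pass = extras_available * (1.0 - waste)
    return math.ceil(crowd_target / effective_per_pass)

# e.g. a few thousand Wights from 120 costumed extras, assuming 10%
# of each plate is lost where the tiles overlap:
print(plan_tiling_passes(crowd_target=3000, extras_available=120))  # 28 passes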

BRAN’S VISION:

“This Season 6 sequence of Bran’s Vision was shot in a small Northern Ireland quarry. Working from a set designed by the art department, our previs team at The Third Floor visualized the layout of stones to figure out how the scene should be arranged on location, where a partially built tree and a number of large standing rocks needed to be placed in a way to not only achieve the shots, but also to sit within the safe minimum working distance from the quarry walls. As CG rocks would be used to expand the number of practical rocks on set, we blocked out the sequence using a lidar scan of the location, analyzing all the rock placements visible in each shot and creating a plan for their placement on location. We then optimized each camera angle to shoot as much coverage as possible on the built set, with the VFX team filling in the remaining rock spiral with CG elements. To aid in planning the shoot, we provided Director of Photography Anett Haellmigk with a camera layout diagram, marking the positions of all the cameras required as well as indications of all real and CG stones.”

SLAVER’S BAY/MEEREEN ATTACK:

“The Slaver’s Bay Attack in Season 6: Episode 9 involved all three dragons, the fictional city of Meereen and locations from Spain to Belfast. Our artists at The Third Floor modeled the environment in previs, ensuring the geographic locations could be tied together with the CG world and that the animated CG dragons would be able to swoop and land, interacting in real-life locations. Shots of Daenerys riding Drogon were done in a separate greenscreen shoot. With our head of virtual production, Casey Schatz, we were able to drive a motion base and a live-action motion-control camera crane, programming moves for both so the actress would fit seamlessly into the final shots with realistic, dynamic action. The team also programmed a motion-control crane to do fire elements for the dragons burning the ships.”
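Blok doesn’t describe how those synchronized moves were authored, but driving a motion base and a camera crane from the same previs animation generally means resampling the authored curves into per-frame targets each rig controller can step through. The Python sketch below is a hypothetical illustration; the keyframe values and function names are invented.

def sample_move(keyframes, fps=24.0):
    # Linearly resample sparse previs keyframes – (frame, x, y, z) tuples –
    # into one timestamped target per frame for a rig controller.
    samples = []
    for (f0, *p0), (f1, *p1) in zip(keyframes, keyframes[1:]):
        for f in range(int(f0), int(f1)):
            t = (f - f0) / (f1 - f0)
            pos = [a + (b - a) * t for a, b in zip(p0, p1)]
            samples.append((f / fps, pos))
    return samples

# The same previs path can drive both the motion base under the actress
# and the camera crane, keeping performer and lens in sync:
drogon_path = [(1001, 0.0, 2.0, 0.0), (1049, 6.0, 3.5, -4.0)]
for timestamp, position in sample_move(drogon_path)[:3]:
    print(round(timestamp, 2), [round(c, 2) for c in position])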

BATTLE OF THE BASTARDS/JON SNOW ONE’R SHOT:

“Our visualization team collaborated with virtually every department to support the production shoot for the Season 6 ‘Battle of the Bastards’ sequences set near Winterfell. Working from storyboards and with director Miguel Sapochnik, we began by visualizing layouts for the battlefield, including huge armies that we previs’d using groups of low-res cached previs models with varied animation cycles. We then outlined the key action beats, using specific battle formations and wide shots, crane shots, vehicle-mounted shots, etc., to help establish the geography for viewers and ensure hero characters remained visible in the crowd. The previs also informed the visual effects and crowd setups needed for each plate.”

“The famous ‘one’r shot,’ which stays on Jon Snow in the thick of the fighting through a series of obstacles and close calls, required particularly close-crafted choreography. Using previs and detailed techvis, we mapped out each pass that would be required for the multi-layered final composite, with color-coded diagrams and QuickTimes to show what would be needed for the practical cameras, stunt elements, horsemaster shots and CG effects. We calculated camera distances, greenscreen requirements and positions for actors and giant Wun Wun across the entire scene.”
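Camera-distance and greenscreen calculations of the kind Blok mentions fall out of basic lens geometry. Here is a minimal Python sketch of the flavor of that techvis math; the sensor width and lens values are generic examples, not the production’s actual camera package.

import math

def horizontal_fov_deg(focal_mm, sensor_width_mm=24.0):
    # Horizontal field of view for a given lens and sensor
    # (24 mm is a generic Super 35-style width; substitute your camera's spec).
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

def distance_to_frame(subject_width_m, focal_mm, sensor_width_mm=24.0):
    # Camera distance at which a subject of the given width fills the frame –
    # equivalently, how wide a greenscreen must be at a given distance.
    half_fov = math.radians(horizontal_fov_deg(focal_mm, sensor_width_mm)) / 2
    return (subject_width_m / 2) / math.tan(half_fov)

# How far back must the camera sit to hold a 5 m greenscreen on a 35 mm lens?
print(round(distance_to_frame(5.0, 35.0), 2), "meters")  # ~7.29 m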

All images copyright HBO and courtesy of The Third Floor, Inc.



SFX

JOHN FRAZIER: SORCERER OF HAVOC
By CHRIS MCGOWAN

Photos courtesy of John Frazier

TOP: Even after you win an Oscar (one of three), there are household chores to attend to.


While he is most renowned for the spectacular havoc he wreaks on movie sets, special effects veteran John Frazier both creates and destroys. Yes, he sends cars spinning through the air and is responsible for the biggest explosion ever filmed with live actors. He worked on the plane crash in Cast Away, tipped a battleship for Michael Bay’s Pearl Harbor and flipped frigates in this year’s Pirates of the Caribbean: Dead Men Tell No Tales. Yet he also fabricates memorable movie props in his Sun Valley, Calif., shop – sculpting futuristic cars, Transformers robots, Black Hawk helicopters, vintage locomotives and whatever else directors want out of foam and fiberglass and other materials. When blockbuster movies need a touch of fantastic realism on a massive scale, Frazier’s firm Fxperts gets the call. Frazier has been a special effects supervisor for the Pirates of the Caribbean, Spider-Man and Transformers franchises, as well as Speed, Twister, The Perfect Storm, Pearl Harbor, Armageddon, The Lone Ranger, and other spectacular films that have pushed the effects envelope. He has coordinated special effects on 115 movies to date, won three Academy Awards and garnered 10 additional nominations for his contributions. He shared an Oscar in 2004 for the special effects on Spider-Man 2 and shared the Academy’s Technical Achievement Award in 2014 for the design and development of the Pneumatic Car Flipper, a device that has become the industry standard for tossing cars in disaster scenes. He shared another Academy Award with Mark Noel for inventing the NAC Servo Winch System to fly cars and heavy props through the air on wires. Frazier is the go-to guy for either building or blowing things up for directors like Michael Bay and Clint Eastwood, with relationships that have lasted decades. His effects wizardry will be on display this year with the newest Pirates and Transformers movies. “It has been good. I have been blessed for 52 years now,” says Frazier. “When visual effects came along, everyone said, ‘You guys are going to be dinosaurs.’ That may happen someday, but not in the next 10 years.” His five decades in film and television started by happenstance. In the early 1960s, Frazier was just another teenager in L.A.’s sprawling San Fernando Valley, dreaming of “football, surfing and girls.” Lots of neighbors worked in the film industry, but he had no connection to it through family or friends and no aspirations in that area. The closest he got to the world of special effects was visiting Disneyland on his high school’s Grad Night and being mesmerized by the Enchanted Tiki Room with its animatronic birds. “I looked at the talking parrots and thought, ‘That’s what I want to do.’ Abraham Lincoln was cool too [Great Moments with Mr. Lincoln].” However, he seemed destined for a different path. “There was plenty of good work around in Southern California. My family background is construction and that’s what I wanted to do.” He enrolled in Los Angeles Trade Tech to study high-rise construction and freeway design. While he was there, producer Jack Shafton hired him to work as a carpenter and handyman at his house on weekends. The year was 1963. Shafton created puppet characters for television shows and commercials and was also involved with The Haunted House nightclub at Hollywood and Vine.

and Vine. Frazier built some bars for the club and worked on “little mechanical creatures” that were part of the club’s decor. Recognizing the young man’s abilities, Shafton got him a full-time job at NBC. “I didn’t know what special effects were. In those days you didn’t even get screen credit.” When he reported to work the first day, along with a group of other new employees, he was told, ‘You five work for that guy, and you five work for those other guys,’ and I was the odd man out. They said, ‘You go with those special effects guys.’ That’s how I started out – by accident.” He thrived there, and a year into his job his boss left and Frazier was running the department – at the age of 20. “It was a lot of fun, incredible.” It was the golden age of live television and Frazier worked with Bob Hope, Johnny Carson, Dean Martin, Elvis Presley, Laugh-In and The Gong Show, providing whatever was needed for shows or skits. He even helped create the iconic Pillsbury Doughboy, who first appeared in 1965. “At NBC we didn’t do special effects on the scale of features. Bob Hope called me in once and said, ‘I want you to see this thing we did years ago.’” They watched an old skit with a gun. Frazier recalls that Hope said, “’Look at the gun, it’s going to blow apart. Okay, John I need you to make that for me for tonight,’” Frazier remembers. “I said, ‘Mr. Hope, with all due respect, there’s probably 300 man hours in making that gun do that.’ “He said, ‘Okay, hire 300 guys and it’ll take you one hour.’” Another time, shortly before the first Apollo moon landing, Hope called Frazier and asked, “‘What do you know about the lunar rover?’ I said, ‘Not much.’ There were no pictures published of it. It was all classified. Hope asked, “’If I show you a picture, will you make it for me?’” In a scene right out of Men in Black, two government men in sunglasses arrived. “They had a tube and they pulled blue prints out of the tube. ‘You can’t take pictures. You can take a measurement but you can’t write anything down.’ It was basically a dune buggy with a parabolic reflector. They put the blueprints back in the tube and left. These were Secret Service guys or something. I made it for him and it looked just like the one on the moon.” Frazier worked in live TV for eight years and transitioned to movies with Wes Craven’s The Hills Have Eyes (1977). Joe Lombardi [special effects for The Godfather and Apocalypse Now] had suggested that Frazier should make the leap to feature films. “He said, ‘Why don’t you come play with us?’ Variety shows were dying out,” recalls Frazier, who worked with Lombardi on Apocalypse Now. In the ‘80s, Frazier coordinated special effects for everything from Airplane! and War of the Roses to Pee Wee’s Big Adventure and Ferris Bueller’s Day Off. He scaled up his inventiveness working with directors like Clint Eastwood. For Eastwood’s In the Line of Fire (1993), Frazier co-developed “rubber glass,” a silicone rubber product that breaks or crumbles into pieces that look like broken glass or ice. It was Jan de Bont’s Speed (1994) that elevated him into the realm of high-budget action pictures. It was “my big break,” Frazier recalls. “I hooked up with [producer] Ian Bryce.” Waterworld and Twister followed, and then producer Gale Anne Hurd sought him out for

“When visual effects came along, everyone said, ‘You guys are going to be dinosaurs.’ That may happen someday, but not in the next 10 years.” —John Frazier TOP: John Frazier with “Otto Pilot” from Airplane! BOTTOM: John Frazier (standing) working on Elvis Presley’s 1973 Aloha from Hawaii special, the first show seen around the world live via satellite.


TOP LEFT: John Frazier with a hot rod manufactured at Fxperts, Inc. TOP RIGHT: John Frazier and Michael Bay in a rare relaxed moment for both. BOTTOM: Shia LaBeouf, Megan Fox, Josh Duhamel and Tyrese Gibson run for their lives in Transformers: Revenge of the Fallen, which featured the biggest explosion with live actors in movie history. (Photo copyright © 2008 Paramount Pictures. All Rights Reserved.)


“Michael Bay gave me the job and I’ve been with him for 22 years,” Frazier says. “We were in right place at right time. I got some good breaks.” Bay wanted to send vehicles flying through the air in downtown Los Angeles and Frazier had to come up with a solution. At the time, car tossing was a noisy proposition that involved an explosive charge – and people were moving back downtown in that era, to lofts and apartments, and they didn’t want to hear loud booms. Frazier decided to do it pneumatically and co-invented a “car flipper” device – a thick steel plate attached to a steel lever structure and modified hydraulic rams – that not only was quiet but enabled precise control over automotive acrobatics. It could shoot a car straight up in the air, flip it end over end, or explore even crazier possibilities. Frazier estimates that he flipped 50 or 60 cars for Transformers: Age of Extinction alone and perhaps 500 across all his films with Bay. The device became an industry standard and won a 2014 Academy Award for Frazier, Chuck Gaspar and Clay Pinney. “We didn’t know how big it would be.”

Bay also enabled Frazier’s penchant for pyrotechnics. For Transformers: Revenge of the Fallen (2009), Frazier created the biggest explosion in movie history on a set in White Sands, New Mexico. For an airstrike on a group of unruly robots, he ignited more than 1,000 gallons of gasoline and 300 sticks of dynamite as stars Shia LaBeouf, Megan Fox, Josh Duhamel and Tyrese Gibson ran for their lives. “I’ve done 15,000 explosions” to date, observes Frazier, with a hint of glee.

Frazier is also able to handle explosions of the auteur variety. “Bob Zemeckis, Sam Raimi and Clint Eastwood are mild guys and don’t scream. I’ve only seen Clint Eastwood raise his voice one time. But some people need that.” Bay is one of those. “Michael Bay can scream and yell and vent on me – I don’t have an attitude. Some people can’t get things done unless they’re yelling. I don’t know of a big action director who doesn’t scream. They all do.” Despite the histrionics, Frazier likes the way Bay challenges him. “You have to have your shit together. He knows what he wants. For me, when making the Transformers movies, there are just two people – Michael Bay and me.”

Frazier is also a gimbal guru; he used an outsized one to roll a full-scale replica of the USS Oklahoma battleship for Bay’s Pearl Harbor. “No one has done them bigger,” he notes. “And we have a good track record for safety.” He has also employed giant gimbals for the various Pirates of the Caribbean movies.

Frazier’s shop creations, mechanical effects and explosions bring verisimilitude to the screen and compete with digital effects, yet he acknowledges that CGI has helped his career by making certain types of movies possible. “If not for CGI, all these scripts would still be on the shelves.” Marvel’s focus would still be comic books. “CGI has generated work for all of us. Spider-Man has to be digital and without CGI he’d be another superhero on the shelf. For Twister, [Executive Producer] Steven Spielberg said, ‘Show me a tornado and I’ll green light your picture.’” “CGI has gotten so good in the last 15 years that when they say what I did was all CGI and I say, ‘No, we did it all,’ I take that as a compliment. But 15 years ago it was an insult.”

Frazier feels that movies with CGI need footage from the real world as well. “You need real stuff. The individual [director] will say, ‘Whatever you can give us live, we’ll take it. It gives us depth.’ There’s something about it being live that takes the edge off it. You just can’t get it in CGI.”

“CGI has generated work for all of us. Spider-Man has to be digital and without CGI he’d be another superhero on the shelf. For Twister, [Executive Producer] Steven Spielberg said, ‘Show me a tornado and I’ll green light your picture.’” —John Frazier

TOP: For an airstrike on unruly robots in Transformers: Revenge of the Fallen, Frazier ignited more than 1,000 gallons of gasoline and 300 sticks of dynamite – creating the biggest of about 15,000 explosions he’s conducted to date. (Photo copyright © 2008 Paramount Pictures. All Rights Reserved.)


TOP: Bumblebee, a robot from the Transformers franchise, was constructed in John Frazier’s shop. (Photo copyright © 2017 Paramount Pictures. All Rights Reserved.)

TOP LEFT: A replica of the U.S.S. Oklahoma atop a giant gimbal was rolled in Michael Bay’s Pearl Harbor. (Photo copyright © 2001 Touchstone Pictures. All Rights Reserved.) TOP RIGHT: John Frazier’s car flipper sent vehicles flying in the urban destruction scenes of Michael Bay’s Armageddon. (Photo copyright © 2005 Touchstone Pictures/Courtesy Everett Collection. All Rights Reserved.) BOTTOM: For the runaway train in 2010’s Unstoppable, director Tony Scott contacted Frazier and his company, Fxperts, to orchestrate a two-locomotive derailing sequence. It featured redesigned, lightweight versions of a conventional diesel electric locomotive. Frazier’s team took eight weeks to build two 50,000-lb. locomotive action props, each powered by a Freightliner FL70 engine. (Photo copyright © 2010 Twentieth Century Fox. All Rights Reserved.)


BOTTOM: The fabled city of Alamut in Prince of Persia: The Sands of Time was created as a miniature. (Photo copyright © 2010 Disney Enterprises/ Jerry Bruckheimer Inc. All Rights Reserved.)

Since so many big-budget action directors keep knocking on Frazier’s door to give him more work, it would seem he knows what he’s talking about. And he still has one of the most fun and unique jobs in the world. “In the morning I can’t wait to get to work. I just can’t get enough of the business,” he says. “If I could, I would make all the movies.”



VR TRENDS

VFX ARTISTRY PUSHING VR STORYTELLING TO NEW PLACES By DEBRA KAUFMAN

TOP: Bud Myrick, Executive VP/VR Supervisor, FuseVR

“A lot of the skill sets that VR uses are a hybrid of VFX and gaming. So it’s a natural progression to a certain extent for VFX companies. The trick is to take what you already know in VFX and leap into how to apply it not just to a screen space but a world space.” —Bud Myrick, Executive VP, FuseFX


Virtual reality is poised to become a dynamic entertainment sector, and VFX houses are leading the way. According to research firm MarketsandMarkets, virtual reality will be worth $3.9 billion by 2022. The firm also found that North America will be the “hottest” market due to the number of VR companies in the U.S. and the amount of R&D devoted to this space.

VR filmmaking is poised to be a major segment. Major studios such as DreamWorks, Fox and Warner Bros. have spun off VR divisions, are about to, or have been investing in the technology, which will apply to live-action, animated and documentary projects. A-list directors, such as Doug Liman, are experimenting with VR film projects, and VR projects are becoming part of film festivals such as Sundance and South by Southwest (SXSW). Documentaries are predicted to be a significant VR category, and some are also predicting that virtual reality TV shows will be commonplace by 2020-2022 with the advent of VR channels on cable.

Visual effects facilities, which create virtual characters and worlds every day, are ideally suited to create VR content, and many of them have already done so, sometimes spinning off dedicated units. Any number of companies are now creating virtual reality elements and experiences, investigating where the demand is coming from and expanding the kind of work they do, because they believe the future is strong for this nascent sector.

FuseFX just spun off a virtual reality division, dubbed FuseVR, led by Rhythm & Hues veterans Bud Myrick as VR Supervisor and John Heller as VR Creative Director. FuseVR just completed its first virtual reality job, Buzz Aldrin: Cycling Pathways to Mars, which debuted at SXSW, the annual film and interactive media festival that takes place in Austin, Texas. The project is available on the Steam store for the HTC Vive headset, and is playing at the National Air and Space Museum in Washington, D.C. The job came to FuseVR from VR production company 8i, which, says Myrick, did the holographic capture of former astronaut Buzz Aldrin. “They brought us on because they had seen our space work and wanted our visual skill set,” he says. FuseVR created digital sets and environments, including the Martian surface, built from photogrammetry data captured in Morocco, as well as a spacecraft and a 360-degree view of the Milky Way.

Tools used included 3ds Max, Maya and Unity, the latter a game engine that provides the real-time rendering that VR’s interactivity requires. “For our TV work creating outer space, everything is rendered in V-Ray and it can take hours per frame,” he says. “With VR, everything has to be rendered in real-time, so we needed to create a new pipeline to build on our traditional way of modeling.”

Myrick points out that there’s “a fine line” between VFX and gaming. “A lot of the skill sets that VR uses are a hybrid of VFX and gaming,” he says. “So it’s a natural progression to a certain extent for VFX companies. The trick is to take what you already know in VFX and leap into how to apply it, not just to a screen space but a world space.” Although Myrick says that VFX artists do have “a leg up” in approaching VR, he reports that there’s a whole other level of complexity. “I have been doing VFX for over 30 years, and in some ways it was like starting over,” he says. “The skills I had were a great starting point, but I had to learn new ways of dealing with live action and how to understand the unwrapped 3D world in a flat plane.”
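Myrick’s render-budget point can be made concrete with a little arithmetic: a 90Hz headset such as the HTC Vive leaves roughly 11 milliseconds to produce each frame, against the hours per frame an offline renderer may spend. A minimal illustrative sketch in Python (the two-hour offline figure is a hypothetical example, not a FuseVR metric):

# Illustrative comparison of offline vs. real-time VR render budgets.
# The offline figure is a hypothetical example, not a FuseVR metric.
OFFLINE_HOURS_PER_FRAME = 2.0   # e.g., a heavy offline space render
VR_REFRESH_HZ = 90              # the HTC Vive's native refresh rate

offline_ms = OFFLINE_HOURS_PER_FRAME * 3600 * 1000  # hours -> milliseconds
budget_ms = 1000.0 / VR_REFRESH_HZ                  # per-frame budget in ms

print(f"Offline render:   {offline_ms:,.0f} ms per frame")
print(f"VR frame budget:  {budget_ms:.1f} ms per frame")
print(f"Required speedup: {offline_ms / budget_ms:,.0f}x")

At that rate there is simply no room for offline-style rendering, which is why game engines sit at the heart of VR pipelines.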

TOP and BOTTOM: For FuseVR’s Buzz Aldrin: Cycling Pathways to Mars, FuseVR created digital sets and environments, including the Martian surface, built from photogrammetry data captured in Morocco, as well as a spacecraft and a 360-degree view of the Milky Way. VR production company 8i did the holographic capture of former astronaut Buzz Aldrin.

At MPC, Tom Dillon is the Head of Virtual Reality and Immersive Content. When he joined the company four years ago, he was in charge of experiential and interactive projects. “As VR grew and we were doing projects out of New York, MPC VR was born,” he says. The first project was with advertising agency Wieden + Kennedy for a Chrysler commercial; the second was a short film, Catatonic, a horror experience set in an insane asylum, directed by Guy Shelmerdine and produced by VR production company VRSE for Gear VR.

In 2015, the VR projects began rolling in, says Dillon. “We worked with [VR director] Chris Milk on the VR version of U2’s ‘Song for Someone,’” he recalls. “Then we started working more with movie studios, starting with a marketing piece for Goosebumps, a Jack Black movie. That put us on the map.” By 2016, he says, MPC VR was a dynamic studio, and since then it has done VR projects for film marketing, advertising and fine-arts-related pieces, such as a 360-degree AR dance performance that went to the Sundance Film Festival. MPC also created a VR piece for EDM DJ Kygo, in concert with Sony Music, and a 360-degree video starring a cappella group Pentatonix, for Lego.

Dillon reports that MPC also works closely with Technicolor’s VR Experience Center. “We do post production ourselves and also in collaboration with Technicolor,” says Dillon. MPC is also involved in augmented reality, which mixes the real world with digital imagery. “VR has opened us up to other avenues,” he says. “We’re working with clients that we wouldn’t have before, and are involved in completely original content. I find that really exciting.”

At San Francisco-based INVAR Studios, which opened at the end of 2015, VFX artist Alejandro Franceschi is now a creative technologist for 360-degree projects. The studio just premiered the pilot for Rose Colored, a 15-part narrative series of 15-minute episodes. Franceschi says VFX artists are drawn to the possibilities of the medium. “I think many VFX artists are storytellers at heart, and many of them are interested in seeing how and where they can push the medium of VR forward,” he says. “They have the technological background that many traditional storytellers do not. So it’s staking a claim in a new niche and making a go at it.

“VFX artists also love a challenge,” he continues. “I think many of them are tired of making sequels, and VR is an opportunity to spread their wings and tell the stories they would like to tell.” That is true for him, as Rose Colored delves into an area that most VR does not: narrative storytelling. “Games already have an infrastructure to be profitable,” he says. “Interactive narratives don’t yet. Our chief executive, Elizabeth Koshy, and I come from films and TV and wanted to make content that spoke to our strength. We wanted to pioneer interactive video in VR.”

“VR has opened us up to other avenues. We’re working with clients that we wouldn’t have before, and are involved in completely original content. I find that really exciting.” —Tom Dillon, Head of Virtual Reality & Immersive Content, MPC


TOP: Alejandro Franceschi, VFX artist, INVAR Studios. BOTTOM: From INVAR Studios’ VR pilot, Rose-Colored, a 15-part narrative series of 15-minute episodes.


To achieve real-time rendering, The Mill New York inked a partnership with Epic and has been integrating Epic’s Unreal real-time game engine into its pipeline ever since. That’s what it used, in addition to its own tool set, to create The Human Race, which included two completely digital racing cars. The project, says Group CG Director Vince Baertsoen, was “a major leap forward” for The Mill. “Right now, everything is pre-processed and it takes days to go through the VFX pipeline,” he says. “And we are doing it in real-time now.”

At Luma Pictures, General Manager Jay Lichtman reports his company has done two VR projects for Infiniti with Crispin Porter + Bogusky for the Pebble Beach Concours d’Elegance. In the first, experienced with the Oculus Rift VR headset, the user is placed in the driver’s seat, navigating a few of the world’s most notorious roads; the second depicts the car’s design from pencil drawings to manufacture. But, notes Lichtman, these spots, which combined live action and CGI, were pre-rendered, which meant that the user couldn’t control the experience, unlike a real-time rendered project.



TOP: Jay Lichtman, GM, Luma Pictures BOTTOM: Luma has completed two VR projects for Infiniti. In the first, with the Oculus Rift VR headset, the user is placed in the driver’s seat, navigating a few of the world’s most notorious roads. The second project captures a car’s design from pencil drawings to manufacture.

“VR will ultimately grow into being a storytelling medium. But at the moment, the rules of engagement and the tools don’t allow for true storytelling.” —Jay Lichtman, GM, Luma Pictures


That highlights the fact that, because VR is a new medium, its definition is still used broadly to include 360-degree experiences that are pre-rendered and passive as well as those with “room space” – that is, the ability of the user to walk around in the VR space and interact in real-time with what is in that room. Lichtman and the Luma team are more interested in the latter. That is where, says Lichtman, “people start to see the full capabilities of VR.” To that end, Luma Pictures is working on two such experiences, fully funded by Luma, although Lichtman can’t reveal details. “I think that VR will ultimately grow into being a storytelling medium,” he says. “But at the moment, the rules of engagement and the tools don’t allow for true storytelling.”

At Zoic Labs, a sister company to VFX facility Zoic Studios, Executive Vice President Matt Thunell also defines VR as incorporating real-time rendering and room space. “With 360, the user is passive,” says Thunell. “VR is interactive, which means it’s an experience driven by a game engine. It is an interactive, gamified experience.”

Zoic Labs defines itself as “an advanced visualization company focused on the intersection of big data, narrative, design, and emerging technologies.” It provides R&D, software development and user-interface design for private companies, as well as the U.S. Department of Defense and the intelligence community. The company has created Cognitive, a proprietary data visualization platform that “aggregates and displays large datasets in a 3D-rendered ‘game environment’ through a web portal. Our current development includes VR integration with the Cognitive platform to enable neuromorphic processing capabilities,” says Thunell.

Although Zoic Studios has done several 360-degree projects for Google, Dr. Pepper, Adidas and others, Zoic Labs has yet to use its Cognitive platform for storytelling. But Thunell says the company plans to release a commercial version of it in June. Being able to see complicated data in seven or eight dimensions could be a tremendous boon to government agencies with complex problems.

If the brief history of VFX companies creating VR is any indication, digital artists will push VR storytelling to new and exciting places.

TOP: Matt Thunell, Executive VP/Executive Producer, Zoic Labs RIGHT: Poster for Buzz Aldrin: Cycling Pathways to Mars from FuseVR. BOTTOM: From Luma’s “Dream Road” ad for Infiniti.

“With 360, the user is passive. VR is interactive, which means it’s an experience driven by a game engine. It is an interactive, gamified experience.” —Matt Thunell, Executive VP, Zoic Labs
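Thunell’s passive-versus-interactive distinction comes down to whether user input feeds back into what gets rendered on each frame. The sketch below, in Python, is purely illustrative – the names are hypothetical and do not describe Zoic’s software or any real VR SDK:

import random

def play_360(frames):
    # Passive 360 video: the frame sequence is fixed in advance.
    # The viewer may look around, but cannot change what happens.
    for frame in frames:
        print("display", frame)

def run_vr(num_frames):
    # Interactive VR: input is sampled every frame and the scene is
    # re-rendered in real time, so the user drives the experience.
    position = 0.0
    for _ in range(num_frames):
        controller = random.choice([-1, 0, 1])  # stand-in for user input
        position += controller                  # input changes world state
        print(f"render scene from position {position:+.0f}")

play_360(["intro", "chase", "finale"])  # identical for every viewer
run_vr(3)                               # different on every run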



FILM

VINTAGE CREATURES REVISITED By MICHAEL GOLDMAN

TOP: Alien: Covenant (Photo copyright © 2017 20th Century Fox. All Rights Reserved.)


Some of the huge VFX movies hitting big screens this summer feature iconic creatures we have seen before. The fact that Hollywood likes to revive stories and characters from books, comics, television and older movies is nothing new, but in the modern era, putting a new twist on characters or creatures of a fantastical nature – let alone ones with historical pedigrees – requires unbelievably complex visual effects work.

During summer 2017, among the familiar creatures returning prominently to the big screen are the famous talking apes in War for the Planet of the Apes, the third movie in the modern reboot of the Planet of the Apes franchise, featuring lead digital character Caesar. We’re also getting a new imagining of The Mummy, with Sofia Boutella playing a malevolent Egyptian princess returned from the dead, this time swathed less in ancient bandages than in various CG augmentations. And Ridley Scott’s original Xenomorph returns in Alien: Covenant, as murderous as ever, but with far more advanced technical support to bring the creature’s skin and muscles to life.

Visual effects supervisors involved with reinventing those CG creatures face many challenges – and the challenges aren’t confined to cinema. Broadcast television is having a similar visual effects renaissance for iconic creatures and characters: Agents of S.H.I.E.L.D. Visual Effects Supervisor Mark Kolpack talks about that show’s success this past season resurrecting the classic comic-book character known as Ghost Rider.

APES EVOLVE

WETA Digital’s Dan Lemmon served as Visual Effects Supervisor on War for the Planet of the Apes, in partnership with Senior Visual Effects Supervisor Joe Letteri. Lemmon points out that the original 1968 film, Planet of the Apes – which used human actors in ape costumes and extensive prosthetics, and looks modest by today’s standards – earned legendary makeup designer John Chambers an honorary Academy Award for its prosthetic makeup, in an era before the Academy even offered a separate makeup award. That standard, Lemmon notes, lingers as added pressure for today’s filmmakers, “because we had big shoes to fill” when they first rebooted the franchise in 2011 with Rise of the Planet of the Apes.

“But that movie wasn’t just a reboot – it was an origin story that told how hyper-intelligent apes developed from common chimpanzees, gorillas, and orangutans,” Lemmon explains. “At that time, our story began with common apes, and so our apes had to look indistinguishable from apes our audience would be familiar with. Apes have much shorter legs and much longer arms [than humans] and obviously they generally don’t wear clothes, so we couldn’t hide anatomical differences under tunics like they did in [the original movies]. Also, our apes move quadrupedally, like real apes do, which is a real challenge for any actor in an ape suit. Most importantly, the apes in our reboot did not yet speak, so much of Rise is a silent film – Caesar speaks four words in the entire film.”

These factors led WETA to develop methods of emphasizing the physical and facial performances of CG apes to communicate with audiences.

“One of the advantages of digital characters is that you have the ability to improve them. … We have changed little details in Caesar’s face to help him better hit some of the intensity and subtlety that we see in [actor] Andy Serkis’s performance [during performance-capture sessions].” —Dan Lemmon, Visual Effects Supervisor, War for the Planet of the Apes

TOP: Alien: Covenant (Photo copyright © 2017 20th Century Fox. All Rights Reserved.)


“We adjusted our toolset to fit that particular movie and used performance capture, rather than prosthetic makeup, to achieve the most articulate and realistic apes for the story we were trying to tell,” Lemmon adds.

Fast-forward to 2017, two movies later, and WETA’s ability to continually evolve its apes has grown exponentially. As Lemmon points out, “one of the advantages of digital characters is that you have the ability to improve them.” WETA has therefore again upgraded Caesar and his pals. For the current movie, Lemmon says, “we have changed little details in Caesar’s face to help him better hit some of the intensity and subtlety that we see in [actor] Andy Serkis’s performance [during performance-capture sessions].

“We’ve rebuilt our fur system so that we can carry literally millions more hair strands on our characters, and style them in ways that are much more realistic than we could achieve before. We also took our ray-tracing engine, Manuka, and added a new physically-based camera and lighting system to it, which allows us to light and ‘photograph’ our digital apes in a way that more closely matches the on-set cinematography and is more realistic than ever before.”

MUMMY’S AUGMENTATION

In the 1930s, Boris Karloff, swathed in bloody bandages, carried the Mummy illusion with his performance. In 1999 and 2001, Arnold Vosloo played the creature in a couple of feature films co-starring Brendan Fraser, and Jet Li gave it a try in 2008. For those films, CG was introduced, but largely to give the Mummy mystical powers. The current movie is meant to be a more faithful remake of the original 1932 film, but with a female Mummy (Sofia Boutella).

Erik Nash, Visual Effects Supervisor on the project, says filmmakers decided early on they wanted a hybrid approach to the appearances of the Mummy and her undead minions – only going full CG when absolutely necessary. The idea was to get as much normal director-actor interaction as possible, using the actor’s performance as the creature’s foundation. “Our primary and default methodology was to use augmentation,” Nash explains. “With two notable exceptions, the creatures in The Mummy were created by replacing portions of the performer’s anatomy with digitally-created body parts. This approach required phenomenal amounts of character tracking, rotoscoping and roto-animation. But it resulted in creatures that were undeniably physically real, and grounded in their respective environments. [This way], our director could guide Sofia as he would any other actor, and it gave the other actors a flesh-and-blood character to play against.”

An identical approach was taken with the undead corpses who serve her. “They were reanimated using stunt performers and dancers in aged and tattered wardrobe who were photographed in the scene,” Nash says. “The costumes they wore and the performances they gave would be seen in the finished film. Moving Picture Company’s VFX team would painstakingly replace exposed heads, arms, and sometimes legs with CG extremities that were gruesomely aged, emaciated and decayed.” Nash feels the best of both worlds was achieved this way, because the end result was characters built on the performances of live actors on set, but “they had heads and limbs that were clearly not those of live human beings. Our creature designs took advantage of this approach by featuring damaged and missing tissue, yielding an appearance that could not possibly be achieved through applied prosthetics.”
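At the final step, the augmentation Nash describes comes down to a matted “over” composite: a tracked, roto-animated CG limb laid over the photographed performer. A minimal NumPy sketch of that last operation (the arrays here are toy stand-ins, not production data):

```python
import numpy as np

def over(cg_rgb, cg_alpha, plate_rgb):
    """Composite a CG element over the live-action plate.

    cg_alpha is the matte - standing in here for the roto/render
    coverage of the replaced body part."""
    a = cg_alpha[..., None]          # broadcast matte across RGB channels
    return cg_rgb * a + plate_rgb * (1.0 - a)

# Toy 4x4 frame: a CG "forearm" replaces the left half of the plate.
plate = np.full((4, 4, 3), 0.5)      # photographed performer
cg = np.full((4, 4, 3), 0.2)         # rendered, decayed replacement
matte = np.zeros((4, 4))
matte[:, :2] = 1.0
frame = over(cg, matte, plate)
```

The painstaking part Nash cites sits upstream of this one line of math: producing a matte and a CG element that track the performer exactly, frame by frame.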

OPPOSITE TOP: The Mummy (Photo copyright © 2017 Universal Pictures. All Rights Reserved.) OPPOSITE BOTTOM: The Mummy (Photo copyright © 2017 Universal Pictures. All Rights Reserved.) TOP: War for the Planet of the Apes (Photo copyright © 2017 20th Century Fox. All Rights Reserved.) BOTTOM: War for the Planet of the Apes (Photo copyright © 2017 20th Century Fox. All Rights Reserved.)


There were only two exceptions to this approach where full CG was utilized. The first was the Mummy’s initial appearance. She would later undergo a progressive transformation, but when first seen, the character “is in such an extreme state of decay, and her limbs are disjointed,” Nash explains. “So a full CG solution was mandatory. Her extreme condition and its attendant inhuman range of motion demanded bespoke rigging and key-frame animation.” The second exception was a sequence shot inside a taxicab where undead characters attack the story’s heroes. Nash says filmmakers “preferred not to be constrained by the physical presence of the undead antagonists in such tight quarters.” Therefore, they agreed to shoot the scene without an actor representing the undead characters. “In editing, the most successful performances were [cut together with] CG characters in action,” he adds.

MORE DYNAMIC, ORGANIC XENOMORPH

Visual Effects Supervisor Charley Henley describes revisiting the Xenomorph creature from the Alien franchise for the new film, Alien: Covenant, as “a daunting challenge to satisfy [director] Ridley Scott’s previous success” – meaning the original version of the creature from Scott’s famous 1979 film. Scott’s mission statement this time, Henley says, was on the one hand to return to “the first incarnation of the Xenomorph” as referenced in the 1970s-era creature designs of Swiss artist H.R. Giger, who developed the creature’s original look, and on the other hand to avoid “the limitations Ridley faced when shooting the original Alien” in terms of making the creature’s movements convincing and its small details visible to the camera.

“The process was one of trying to run the evolution of the creature in reverse, to get back to the source and then manage, with restrained control, what we could now achieve with current CG technology and techniques,” Henley explains. “During pre-production, we referenced what Ridley liked most from the original Giger drawings, and [MPC’s] art department helped conceptualize ideas for all the creatures. After much debate, it became clear that Ridley wanted the ability to shoot practically where possible, and so Creature Design and Makeup Effects Supervisor Conor O’Sullivan took over the design to prep for the shoot. The [original] Xenomorph design was too tall at eight feet six inches to work as a conventional suit, so the creature team conceived a hybrid approach inspired by Japanese Bunraku puppets.” That practical methodology allowed Scott’s team to launch into principal photography using a version of the Xenomorph to direct and interact with actors, even as the visual effects team kept refining the design in post, preparing a final CG version.

“Based on scans and texture photography of the practical puppet, MPC’s creature department started developing the final CG Xenomorph,” he explains. “The base was as per Giger’s original design, but the surface details and muscles were somewhat more elegant, using references from [wax anatomical sculptures at Italy’s Museo della Specola]. Going CG allowed for a complexity in surface movement using muscle firing systems and the latest in skin shading, while finding the right skin qualities took patient look-development with a special shader written for the creature’s cranium, which had an aspic translucent quality.” Henley adds that the biggest complexity then became making the Xenomorph more “dynamic and physical” and “more organic” than the original. Thus, “developing the physical performance character for it was the hardest thing to wrangle,” he says. “Ridley didn’t want to be limited like he had been in 1978, but it needed to keep its edge. Motion capture took us some of the way, and we also experimented with references of animals and insects, along with key framing.”

TOP TO BOTTOM IN SEQUENCE: Ghost Rider sequence from Agents of S.H.I.E.L.D. (Photo copyright © 2016 ABC/Marvel Television. All Rights Reserved.)

Erik Nash

Mark Kolpack

GHOST RIDER SETS TV ON FIRE

When ABC/Marvel’s Agents of S.H.I.E.L.D. launched a major story arc this past season featuring Ghost Rider – a well-known comic-book character who had already had a two-movie theatrical run – the show had to illustrate a man’s transformation into a demon with a flaming skull for a head. For good measure, his vehicle, a souped-up 1969 Dodge Charger dubbed the “Hell Charger,” flames up along with him.


Visual Effects Supervisor Mark Kolpack headed what was, by television standards, a massive project to bring the Ghost Rider to life in collaboration with various VFX facilities, particularly Sherman Oaks’ FuseFX, the show’s lead vendor. Kolpack spent his seasonal hiatus formulating a plan to make the character viable for the show’s storyline, financial and scheduling needs, and to satisfy longtime fans of the character. He studied original Felipe Smith comic-book drawings of the Ghost Rider, created Photoshop concepts, and analyzed the 2007 and 2011 Ghost Rider films. Eventually, he concluded the primary challenge would be dealing with the character’s inherently “dead face.” “How can an audience relate to a character that has no skin or eyes?” he wondered. “My quest became to figure out ways that [the audience] could relate emotionally to him.”

On the front end, his team needed to create various elements and lighting to mix and match with CG components. He concluded that “LEDs would play an important part” on set. Filmmakers collaborated with his team on a wide range of solutions, including “a special rig with a spinning, stand-in tire” for the vehicle; various other car elements; a balaclava hood and corresponding Ghost Rider costume configured with LED strips and tracking markers to create interactive lighting elements; various methods for spreading LED light from the character to other elements on set; a special VFX shoot to capture real fire elements, and so on.

Kolpack says FuseFX built a special in-house pipeline and labored long over a wide range of details. But regarding the aforementioned “dead face,” Kolpack says he got a cue from another Marvel character he had recently seen. “In Deadpool, I saw a squash-and-stretch technique used on his eyes to help him emote through his mask and thought it was brilliant,” he says. “So I decided to employ it on Ghost Rider. Then we also had the tiny coal eyes burn inside the orbits themselves.” An important touch came from making the character what Kolpack calls “a motorhead,” meaning bony indentations on the side of the skull that essentially acted like exhaust ports and blew out hot flames. Kolpack had the FuseFX team add a burning flameball effect for that look – based on fire images he had shot years earlier with a Vision Research Phantom Flex camera – by running fluid simulations in Houdini off those images.

Using a photogrammetry process, filmmakers also scanned actor Gabriel Luna’s head and body at 3D Scan LA, providing them with geometry and textures for rebuilding various body parts or the interior of his jacket. “The detailed head scan was done for the purpose of his burn reveal into Ghost Rider,” Kolpack elaborates. “We also shot a series of expressions to be used as morph targets so we could match the actor’s performance, and we had simultaneous beauty and polarized texture stills shot after the scanning.” Kolpack hopes Ghost Rider illustrates what is now possible for broadcast TV and, at the end of the day, “helped toward raising production values overall in [television] visual effects.”
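Kolpack doesn’t describe the rig itself, so this is only an assumption about how such a control is commonly built: in rigging, “squash and stretch” usually means volume-preserving scaling – exaggerate one axis and compensate the others so the eye still reads as flesh rather than a uniformly inflated sphere. A hypothetical helper, as a sketch:

```python
def squash_stretch_scale(stretch: float) -> tuple:
    """Volume-preserving squash and stretch along one axis.

    Scaling Y by `stretch` while dividing X and Z by sqrt(stretch)
    keeps sx * sy * sz == 1, so the shape deforms without changing
    volume - the classic cartoon-animation constraint."""
    sy = max(stretch, 1e-6)          # guard against zero/negative input
    sx = sz = 1.0 / (sy ** 0.5)
    return sx, sy, sz

# e.g. a "surprised" pop: 40% taller, correspondingly narrower
sx, sy, sz = squash_stretch_scale(1.4)
```

Driven per frame by animation controls, a constraint like this can be much of what lets coal-ember eyes emote through a skull with no skin.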



PROFILE

STEPHANE CERETTI: DRIVEN BY A LIFELONG PASSION FOR FILM By BARBARA ROBERTSON

TOP: Ceretti on location with Doctor Strange.


At their best, visual effects smoothly blend science and art, technology and storytelling, technique and artistry. Thus, it’s no surprise that visual effects supervisors at the top of their game embody the same elements. Supervisors like Stephane Ceretti.

From freezing Gotham City in Batman and Robin to bending New York City in Doctor Strange, Ceretti’s career path has taken him through some of the most creative and technically challenging visual effects: The Cell, The Matrix Reloaded and Revolutions, Batman Begins, Harry Potter and the Goblet of Fire, X-Men: First Class, Cloud Atlas and, most recently, Guardians of the Galaxy and Doctor Strange. His supervision of Marvel’s Guardians and Doctor Strange resulted in VES, BAFTA and Oscar nominations for both films. Now Ceretti is Visual Effects Supervisor for Marvel’s 2018 Ant-Man release. “Ant-Man and the Wasp,” he corrects affably. “Let’s not forget the Wasp. She’s important.”

Ceretti began work on Ant-Man and the Wasp in January. By April, he and the team were breaking down the script and thinking about which studios to cast for the effects. “Susan Pickett [Visual Effects Producer] and I see this like casting an actor,” he says. “We want to pick not only the right studios, but the right people within the studios to assemble a team tailored to the effects we’re doing. We’re making films. We’re not manufacturing a product. It’s like haute couture. If you want a specific dress, you don’t go to H&M. You go to a haute couture house. The same with visual effects. Everything on a film is tailor-made. The visual effects need to be done at the right place.”

Born in Lyon, France, the 43-year-old Ceretti made his first films when he was eight years old. Like many visual effects artists, he was inspired by Star Wars – in his case, the 1980 film Star Wars: Episode V – The Empire Strikes Back. But Lyon, where the Lumière brothers had lived, also inspired him. “I would go to talks about film at the Institut,” he says, referring to Lyon’s Institut Lumière, which holds many of the Lumières’ first cinematic inventions.

Meanwhile, at home, science and art went hand in hand. “My dad was an engineer and one of the first to use Apple computers,” he says. “We had an Apple II at home when I was six. My mom loved the arts and movies, and we were always going to see the big films. So, I had that combination of computers, engineering and film. My dad filmed us with his Super 8 camera, and my brother and I were into making our own little films. We did explosions, special effects, animation.”

Even so, Ceretti studied science in college before following his passion. “I was doing animation at home using 3D Studio, and ray-tracing images using software on an Atari that took 20 hours to render,” he says. “But I went all the way toward getting an equivalent to a bachelor’s degree in quantum mechanics before I said, ‘OK, I’m going to do what I really want to do.’” He began working summers at Lyon-based Infogrames, a large holding company that published games through various subsidiaries, and studied art and animation at Ecole Emile Cohl in Lyon. Then he made the move that started his career – to Paris, where he joined BUF Compagnie, one of the pioneers in the use of computer graphics for visual effects and animation.

“When I joined BUF there were around 15 people,” Ceretti says. “They were about to do Batman and Robin, so the timing worked out well. It was a great school for me.” “School” started with learning how to use the studio’s proprietary software, and the only time the computers were free for that training was at night. “I worked all night for three or four weeks to learn their software before I could start working during the day,” he says. “Then they mentored me. They put me on small projects, on TV commercials.”

He had joined BUF in September. Within three to four months, the studio had grown to 50 people. By the end of the year, Ceretti was working on Batman and Robin. “It was hard,” he says. “It was a learning curve for everyone. We only had 50 shots, but they were a lot of work for a company like this in 1996/1997. Poison Ivy was growing plants everywhere. Mr. Freeze was using our frozen rain to freeze the town.”

Ceretti supervised commercial work for the next few years, but when the studio took on The Cell, he became visual effects supervisor for that film. “It was scary because I had only supervised commercials, and I ended up going to LA,” Ceretti says. “But Kevin [Tod Haug, VFX Supervisor] was great and it was an interesting show in terms of the graphics and visuals we had to create.”

TOP: Poster of Benedict Cumberbatch in Doctor Strange. (Photo copyright © 2016 Marvel Studios. All Rights Reserved.) BOTTOM: Guardians of the Galaxy, Vol. 2 (Photo credit: Film Frame. Copyright © 2017 Marvel Studios. All Rights Reserved.)




TOP: Airborne on Doctor Strange. BOTTOM: Marvel’s Thor: The Dark World (Photo credit: Film Frame. Copyright © 2013 MVLFFLLC. TM & © 2013 Marvel. All Rights Reserved.)


The Cell gave him his first experience working with a director and with an American crew. “I knew I wanted to be a visual effects supervisor since I saw Empire Strikes Back; I knew I wanted to be that guy,” Ceretti says. “And this was it. It felt good.”

After The Cell, he helped with BUF’s work on S1m0ne (2002), and then BUF founder Pierre Buffin asked him to supervise the studio’s work on The Matrix sequels. “That defined my career,” Ceretti says. “I met John Gaeta, Dan Glass, the Wachowskis.” It helped that Ceretti could speak English – his mother was raised in the English-speaking part of Montreal, so they had watched English-language movies at home, and he had grown up spending holidays in Canada and on the US East Coast. But, more importantly, he was ready. “I grew into a good position at BUF faster than I could have in a big studio, but at the same time I had lots of support,” he says.

By the time he left BUF in 2008, he had supervised effects there for Alexander, Batman Begins, and Harry Potter and the Goblet of Fire, and was overall supervisor for The Prestige and Babylon A.D. He had been with the studio 12 years. “It felt like time to move,” he says. “I wanted to experiment with working in London.” He started at MPC, supervised that studio’s work on Prince of Persia, and worked there until Dan Glass asked him to start Method’s London studio.

And one movie led to another. He became an additional visual effects supervisor on Captain America: The First Avenger; the visual effects supervisor for Fox on X-Men: First Class, working with John Dykstra; visual effects supervisor on Cloud Atlas; and 2nd-unit visual effects supervisor on Thor: The Dark World. “Method loaned me out to the studios,” he says. “I was kind of independent, but still attached.”

X-Men gave him experience managing multiple vendors on a show. And Captain America provided entrée into the Marvel universe, a world he stepped firmly into with Guardians of the Galaxy, then Doctor Strange, and now Ant-Man and the Wasp. Although these are all superhero films, Ceretti is quick to praise the creative differences. “I enjoy doing these movies very much,” he says. “We aren’t doing the same thing over and over. Doctor Strange and Guardians couldn’t be more different. And, for Ant-Man and the Wasp, we have some crazy ideas I can’t talk about. It’s going to be fun.”

And he particularly enjoys working on these films as part of the production. “I like what I’m doing now,” he says. “To me, what we do as visual effects artists is to serve the story, so being closer to the director and producer is very rewarding. We’re part of the group of filmmakers. Especially at Marvel. I miss the day-to-day rounds, talking to the guys, but when I watch a film, I can see my influence. That’s very exciting.”

Although intrigued by virtual production, Ceretti plans to keep grounding that influence in the real world. “I think as the tools become more advanced, people will get even better at making films that look like they’ve been shot in a real place,” he says. “But, I think we have a better sense of storytelling and performance when we shoot in a real place, and we achieve better effects with a mixture of real life and CG.” If Ceretti weren’t a visual effects supervisor, he would probably be a director of photography or a director.

TOP: Benedict Cumberbatch in Doctor Strange. (Photo copyright © 2015 Marvel. All Rights Reserved.) BOTTOM: Ceretti on location for Doctor Strange in Kathmandu. Filming also took place in New York and London.




“I always have my camera with me,” he says. “I’m trying stuff, looking at things. I like telling stories. I’ve started directing some short films.” In the rare moments when he isn’t working, you might find him driving for hours through the desert in his RAV4 hybrid, taking photographs. Or hiking. “I work a lot, though,” he says. “Oh my god, I’m always working. There isn’t much time for fun.” “But,” he adds, “I have fun in my work, so that’s fine.” His work on Ant-Man and the Wasp has just begun. Look for the film on July 6, 2018.

TOP LEFT: A candid staircase moment on Doctor Strange. MIDDLE LEFT: At work on Doctor Strange. BOTTOM LEFT: On location in New York for Doctor Strange.


TOP LEFT: X-Men: First Class (Photo copyright © 2011 20th Century Fox/Marvel Entertainment. All Rights Reserved.) MIDDLE LEFT: The Cell gave Ceretti his first experience working with a director and an American crew. (Photo copyright © 2000 New Line Cinema. All Rights Reserved.) BOTTOM LEFT: The Matrix Reloaded (Photo copyright © 2003 Warner Bros./Village Roadshow Pictures. All Rights Reserved.) TOP RIGHT: Batman and Robin (Photo copyright © 1997 Warner Bros. All Rights Reserved.) BOTTOM RIGHT: Doctor Strange (Photo copyright © 2016 Marvel Studios. All Rights Reserved.)

TOP RIGHT: Ceretti in London on Doctor Strange in 2015. BOTTOM RIGHT: Harry Potter and the Goblet of Fire (Photo copyright © 2005 Warner Bros. Pictures. All Rights Reserved.)



GAMES

TECH LEAP SPARKS HOT SUMMER TITLES
By ANDY EDDY and WILLIE CLARK

Traditionally, summer is not a great time for gamers seeking new wares to pass the time. Every year, though, a handful of games arrives during these months anyway – perhaps to take advantage of the lack of competition, or because they aren’t positioned as blockbusters. Indeed, the summer of 2017 has brought some high-profile and recognizable names that are worthy of consideration.

INJUSTICE 2

Publisher: Warner Bros. Interactive Entertainment
Developer: NetherRealm Studios
Platforms: PlayStation 4, Xbox One
Release date: May 2017

TOP and BOTTOM: Injustice 2, which pits DC Comics characters against each other, offers a highly competitive combat experience and a new variety of player choices by enabling customization and enhancement of each combatant on the roster. A robust Gear System, in which each character earns hundreds of pieces of gear throughout the game, is a centerpiece. (Photo copyright © 2017 Warner Bros. Interactive Entertainment)

Growing up, countless kids imagined they were superheroes fighting crime and saving the world as Batman, Supergirl, Superman and Wonder Woman, taking on the worst villains, such as Bane, Harley Quinn, Joker and Poison Ivy. Then in 2013, WBIE and developer NetherRealm Studios brought us a videogame version with Injustice: Gods Among Us, enabling us to pit DC Comics’ characters against each other. This summer, gamers will have a chance to keep the fighting going with the release of a sequel, Injustice 2.

NetherRealm – which was founded by the creators of the Mortal Kombat franchise – wanted to offer extremely competitive combat, but also wanted this new version to offer some variety. The studio accomplishes that by enabling player customization and enhancement of each combatant on the roster. Steve Beran, NetherRealm’s Director of Art, explains that the “robust Gear System” is a feature unique among fighting games, though implementing it presented the studio with design and production challenges. “To be honest, [the Gear System] was the biggest art challenge we faced in this game. Each character has hundreds of pieces of gear that are earned throughout the game,” Beran says. “This was a massive undertaking and I am extremely proud of our team for nailing it so perfectly.”
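NetherRealm hasn’t published the Gear System’s internals, but the pattern Beran describes – earnable pieces per character that modify stats and appearance – maps onto a familiar loot-and-equipment structure. A hypothetical Python sketch (all names invented for illustration):

```python
import random
from dataclasses import dataclass, field

@dataclass
class GearPiece:
    slot: str                 # e.g. "helmet", "gauntlets", "cape"
    rarity: str
    stat_bonuses: dict        # stat name -> additive bonus

@dataclass
class Fighter:
    name: str
    base_stats: dict
    equipped: dict = field(default_factory=dict)   # slot -> GearPiece

    def equip(self, piece: GearPiece) -> None:
        self.equipped[piece.slot] = piece          # one piece per slot

    def effective_stats(self) -> dict:
        stats = dict(self.base_stats)
        for piece in self.equipped.values():
            for stat, bonus in piece.stat_bonuses.items():
                stats[stat] = stats.get(stat, 0) + bonus
        return stats

def random_drop(slot: str) -> GearPiece:
    # Weighted rarity roll, then a stat bonus scaled by rarity.
    rarity = random.choices(["common", "rare", "epic"], weights=[70, 25, 5])[0]
    cap = {"common": 5, "rare": 15, "epic": 30}[rarity]
    return GearPiece(slot, rarity, {"strength": random.randint(1, cap)})

batman = Fighter("Batman", {"strength": 100, "defense": 90})
batman.equip(random_drop("gauntlets"))
print(batman.effective_stats())
```

Multiply a structure like this by “hundreds of pieces” per character – each needing art, not just numbers – and the scale of the challenge Beran describes becomes clear.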

STAR TREK: BRIDGE CREW

Publisher: Ubisoft
Developer: Red Storm Entertainment
Platforms: HTC Vive, Oculus Rift, PlayStation VR
Release date: May 2017

Star Trek has been a bellwether in the entertainment industry for over 40 years, with numerous TV shows, movies and video games building a rich canon that grows with new additions every year. Now Ubisoft is chipping in its own entry to the Star Trek universe – and it’s a unique one that should raise some eyebrows. Star Trek: Bridge Crew has been specifically crafted to be experienced via one of the current VR headsets, which will fully immerse the player in learning and mastering all of the critical systems on the bridge of a new Federation starship, the U.S.S. Aegis. Your mission is to explore The Trench, a mysterious stretch of space, for a place that can be colonized by Vulcans. Of course, the constant threat of a sudden Klingon ambush always looms over the ship’s crew.

Bridge Crew will support a cooperative mode for up to four VR players, which should provide some unique challenges for interaction and collaborative crew work (and, visually, the game’s use of full-body avatars and real-time lip synching should add to the immersion). Additionally, the game’s designers have included a mode that offers procedurally-generated missions, enabling players to enjoy limitless gameplay challenges.
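Ubisoft hasn’t detailed how its procedural missions work under the hood. A common approach in multiplayer co-op, sketched hypothetically below, is to derive every mission parameter from a single shared seed, so all four clients generate an identical scenario without having to transmit it (sector, objective and hazard names here are invented):

```python
import random

OBJECTIVES = ["scan anomaly", "escort convoy", "rescue crew", "evade patrol"]
HAZARDS = ["Klingon raider wing", "ion storm", "derelict minefield"]

def generate_mission(seed: int) -> dict:
    rng = random.Random(seed)   # same seed -> same mission on every client
    return {
        "sector": f"Trench-{rng.randint(1, 99):02d}",
        "objective": rng.choice(OBJECTIVES),
        "hazard": rng.choice(HAZARDS),
        "time_limit_s": rng.randrange(300, 901, 60),
    }

# The host only needs to share one integer with the other three players.
print(generate_mission(seed=42))
```

Because the generator is deterministic, “limitless” missions cost nothing to store: every integer seed is a different, replayable scenario.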

GET EVEN

Publisher: Bandai Namco
Developer: The Farm 51
Platforms: PC, PlayStation 4, Xbox One
Release date: May 2017

While roundups like this tend to be filled with sequels and recognizable brands, there is usually one unknown entry that sneaks in out of nowhere. Bandai Namco’s Get Even certainly qualifies for that honor this summer – a sleeper that should get gamers’ attention. It’s a first-person shooter (FPS), but hardly in the mold of veteran FPS franchises such as Battlefield, Call of Duty, Gears of War or Halo, which are generally fast-paced run-and-gun affairs. Get Even has an FPS-style foundation, but blends in investigative and thriller components that require the player to be much more deliberate and to pay closer attention to the surroundings.


Get Even’s visual design has also been carefully crafted to be an integral part of this experience, with The Farm 51 studio methodically constructing the game world in elaborate detail. “We play with a sense of reality and do tricks on player’s perception, so we try to create as realistic visuals as possible,” says Wojciech Pazdur, Get Even’s Creative Director. “That is why we use photogrammetry scanning everywhere: characters, buildings, decorations and outdoors. Using 3D scan technology, we re-create the real places and objects pixel by pixel, and all the dirt stains, graffiti on the walls, wrinkles on the skin are exactly like in reality.”
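The Farm 51’s scanning pipeline is in-house, but the structure-from-motion steps behind any photogrammetry scan are standard, and open-source tools expose them directly. As a sketch only – paths are placeholders, and this stands in for, not reproduces, the studio’s pipeline – here is the skeleton driven through the COLMAP command-line tool:

```python
import subprocess
from pathlib import Path

def reconstruct(image_dir: str, workspace: str) -> None:
    """Sparse photogrammetry reconstruction from a folder of photos."""
    ws = Path(workspace)
    ws.mkdir(parents=True, exist_ok=True)
    db = str(ws / "scan.db")
    sparse = ws / "sparse"
    sparse.mkdir(exist_ok=True)

    # 1. Detect features in every photograph of the object or location.
    subprocess.run(["colmap", "feature_extractor",
                    "--database_path", db,
                    "--image_path", image_dir], check=True)
    # 2. Match features between overlapping photographs.
    subprocess.run(["colmap", "exhaustive_matcher",
                    "--database_path", db], check=True)
    # 3. Solve camera poses and triangulate a sparse point cloud,
    #    on which dense reconstruction and meshing later build.
    subprocess.run(["colmap", "mapper",
                    "--database_path", db,
                    "--image_path", image_dir,
                    "--output_path", str(sparse)], check=True)

if __name__ == "__main__":
    reconstruct("scans/wall_graffiti", "workspace/wall_graffiti")
```

The “pixel by pixel” fidelity Pazdur describes comes from the source photographs themselves: the geometry is solved from them, and their pixels become the textures.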

“We play with a sense of reality and do tricks on player’s perception, so we try to create as realistic visuals as possible. That is why we use photogrammetry scanning everywhere: characters, buildings, decorations and outdoors. Using 3D scan technology, we re-create the real places and objects pixel by pixel, and all the dirt stains, graffiti on the walls, wrinkles on the skin are exactly like in reality.” —Wojciech Pazdur, Creative Director, Get Even

TOP and BOTTOM: Get Even is a first-person shooter with investigative/thriller components that require a more deliberate pace and attentive style of play. The carefully-crafted visual design captures the game world in elaborate detail. Photogrammetry of characters, buildings, decorations and outdoors results in starkly realistic visuals, and 3D scan technology helps re-create real places and objects pixel by pixel. (Photo copyright © 2017 Bandai Namco.)

TEKKEN 7

Publisher: Bandai Namco
Developer: Bandai Namco Studios
Platforms: PC, PlayStation 4, Xbox One
Release date: June 2017

Will seven be a lucky number for Tekken fans? The seventh installment of the Tekken fighting-game franchise heads to home consoles and PC this summer, and there’s reason to believe. The game is being developed by Bandai Namco’s Tokyo studio, with the series’ director Katsuhiro Harada leading the charge. Under the hood, the game employs the state-of-the-art Unreal Engine 4 as its software foundation, and, according to Bandai Namco, it “sets a new standard in graphics quality for the series, featuring highly detailed characters and dynamic environments.”


The Tekken games have always pushed the visual envelope, but judging by the images and gameplay trailers we’ve seen of Tekken 7, the game makers followed through on that promise with a solid coating of sweet eye candy on everything, right down to detailed facial blemishes and wrinkles on character models. Particle effects are plentiful, as well, to give fight scenes some added pop.

CRASH BANDICOOT N. SANE TRILOGY

Publisher: Activision
Developer: Vicarious Visions
Platforms: PlayStation 4
Release date: June 2017

TOP and BOTTOM: Star Trek: Bridge Crew has been specifically crafted to be experienced via one of the current VR headsets, which will fully immerse the player in learning and mastering all of the critical systems on the bridge of the new Federation starship, the U.S.S. Aegis. The game will support a cooperative mode for up to four VR players, which should provide unique challenges for interaction and collaborative crew work. Visually, the game’s use of full-body avatars and real-time lip synching add to the immersion. (Photo copyright © 2017 Ubisoft.) OPPOSITE PAGE: The seventh installment of the Tekken fighting-game franchise employs the state-of-the-art Unreal Engine 4 as its software foundation and, according to Bandai Namco, “sets a new standard in graphics quality for the series, featuring highly-detailed characters and dynamic environments.” The Tekken games have always pushed the visual envelope, and so does this upgrade, down to detailed facial blemishes and wrinkles on character models. (Photo copyright © 2017 Bandai Namco.)

84 • VFXVOICE.COM SUMMER 2017

Gameplay completionists should enjoy the fact that the game has also been updated with PlayStation 4 trophy support, as well as an auto-save component. MIDDLE-EARTH: SHADOW OF WAR

Publisher: Warner Bros. Interactive Entertainment Developer: Monolith Productions Platforms: PC, PlayStation 4, Xbox One Release date: August 2017

When Crash Bandicoot first came out on Sony’s new PlayStation game console in September 1996, it was lauded for its 3D graphics at a time when most gamers were used to “2D platformers” (such as Super Mario Bros.). And then it went on to become one of the best-selling games on the system. Looking back two decades, what we all thought was incredible now looks quite dated – as you’d expect. Well, the original Crash is back … sort of. Everybody’s favorite marsupial returns, this time on the powerful PlayStation 4. And a lot has changed in game hardware and GPU power since we first saw Crash, especially where visuals are considered. The N. Sane Trilogy brings back the first three Crash games – the original Crash Bandicoot, Crash Bandicoot 2: Cortex Strikes Back and Crash Bandicoot: Warped – but with each having been fully remastered in HD, so don’t expect any of the PlayStation-era aliasing and artifacting here. Everything in the original games is there, but with a massive increase in resolution, detail and color.

Everybody knows you can’t simply walk into Mordor … but that doesn’t mean returning to visit is out of the question, apparently. In this follow up to the 2014 hit Middle-Earth: Shadows of Mordor, WBIE and its Monolith Productions studio in Seattle again take us back to J.R.R. Tolkien’s legendary locale that is Middle-Earth for more adventure and intense action. Talion and Celebrimbor return from the first game, as does the game’s Nemesis System – though in this sequel, the technology layer within the game that enables you to scope out your enemies’ strengths and weaknesses before a battle, which makes your game story unique and personalized, has been expanded from what existed in the first game. Also, while it may put you at momentary risk before you kill your foes, take a close look. Check out the sheen and detail on that skin. Monolith’s art team made sure characters and surroundings will be nearly photorealistic, which raises your immersion into this fantasy world. Before the battle for Middle-Earth begins anew, it’s best you sharpen your swords … and keep an eye on that ring finger. You wouldn’t want to lose it in a volcano.

FINAL FANTASY XII: THE ZODIAC AGE

DESTINY 2

Publisher: Square Enix Developer: Square Enix Platforms: PlayStation 4 Release date: July 2017

Publisher: Activision Developer: Bungie Platforms: PC, PlayStation 4, Xbox One Release date: September 2017

No, no, not the Age of Aquarius. This is the Zodiac Age. Square Enix is giving us the opportunity to return to the battlefields of Ivalice for some intense confrontations and combat. Try not to be too distracted by how the developers have taken the graphics from when the classic role-playing game was originally released on PlayStation 2 in 2006 (where it was one of the best-selling titles on the system) and reworked them to shine on the present-generation PlayStation 4 console. For this new HD version, every aspect of FFXII has been fully revamped: All characters and cinematic scenes have been remastered, and there have been significant improvements made to the combat system, which enables you to fully control your party in each sortie. It will also feature the improvement on the job system, so you can customize how your in-game characters level up through the journey. But the game is also going to be a treat for your ears, too: It will run in 7.1 surround sound, so you’ll feel like you’re right in the middle of the action.

Publisher Activision and development studio Bungie are again teaming on a new Destiny release, the second in the series that follows almost exactly three years from the release of the first Destiny game. Unfortunately, not much was known about Destiny 2’s story and gameplay at the time this article was being put together, but apparently, player-controlled Guardians will have their powers stripped and they’ll be forced out of The Tower safe zone. The sequel will require investigation of “mysterious, unexplored worlds” so players can “reunite humanity’s scattered heroes, stand together and fight back to reclaim our home.” Destiny 2 will be on current-gen console platforms, but also on Windows PC for the first time. However, Sony has negotiated for at least a year of timed exclusive content for PS4 players. And though gamers will look at the September 8 release date as giving them most of the summer off, they might want to reconsider their beach plans as a beta is planned for earlier in the summer.
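To make Burlington’s “motion fields and forces” idea concrete, here is a minimal sketch – in NumPy, and in no way Bungie’s engine code; every name and constant is illustrative – of particles orbiting a field center while an expanding blast wave from a nearby detonation perturbs them:

import numpy as np

rng = np.random.default_rng(0)
pos = rng.uniform(-1.0, 1.0, size=(5000, 3))    # particle positions
vel = np.zeros_like(pos)                         # particle velocities
center = np.zeros(3)                             # center of the orbital motion field
blast = np.array([0.5, 0.0, 0.0])                # hypothetical detonation point
dt = 1.0 / 60.0                                  # 60 Hz simulation step

def swirl_force(p):
    # Orbital “motion field”: force tangent to circles around the z-axis.
    axis = np.array([0.0, 0.0, 1.0])
    return 2.0 * np.cross(axis, p - center)

def blast_impulse(p, t):
    # Expanding spherical shock front radiating from the detonation point.
    d = np.linalg.norm(p - blast, axis=1, keepdims=True)
    shell = np.exp(-((d - 4.0 * t) ** 2) / (2 * 0.1 ** 2))
    return 8.0 * shell * (p - blast) / np.maximum(d, 1e-6)

for frame in range(120):                         # two seconds of motion
    t = frame * dt
    vel += (swirl_force(pos) + blast_impulse(pos, t)) * dt
    vel *= 0.99                                  # mild drag keeps orbits stable
    pos += vel * dt

print("mean orbit radius:", np.linalg.norm(pos - center, axis=1).mean())

A production system layers authoring tools, level-of-detail and GPU execution on top, but the artist-facing concept – a field that shapes motion, plus forces that disturb it – is the same.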


ANIMATION

THE EMOJI MOVIE By ED OCHS


“We did not want to make it look like the human world, but (one) with emojis. It had to be unique and funny. It also needed to look like an oppressive society, but beautiful.” —Carlos Zaragoza, Production Designer

Controversy has helped garner unintended attention for The Emoji Movie – the trailer was roundly panned on YouTube last December on grounds of general triviality – along with a big marketing boost on mobile and social media, from corporate partners, and from an all-star team of voice actors. All of that means more eyes will catch the animation work on the film this summer than might have otherwise, and that’s good for the animators at Sony Pictures Animation.

Directed and co-written by Tony Leondis, and starring the voices of T.J. Miller, Anna Faris, James Corden, Patrick Stewart, Maya Rudolph, Steven Wright, Rob Riggle, Jennifer Coolidge, Jake T. Austin and Sofia Vergara, The Emoji Movie is described by studio wordsmiths as an “app-venture,” which speaks mainly to teens and pre-teens. The story, according to Sony PR, is global in a micro way: “Three emojis (voiced by Miller, Corden and Faris) embark on an epic adventure through a smartphone to save their world from deletion.” It’s true, the stars of the movie are emojis – those mostly tiny faces offering a full range of expressions that people sometimes add to their text messages or email. It’s also true the movie takes place inside a smartphone. For the art, design, layout and VFX teams on the project, there was a digital world in that phone, a world of detailed simplicity that never existed before, and a city that had to be built from scratch.

Carlos Zaragoza, Production Designer on The Emoji Movie, faced the challenge of designing the Emoji world, and “how to translate a very flat and graphic concept – like the graphics and icons of the emoji sets – into a believable universe that the audience can relate to.”

“We did not want to make it look like the human world, but (one) with emojis,” Zaragoza says. “It had to be unique and funny. It also needed to look like an oppressive society, but beautiful. We solved it by making their city, Textopolis, look like a real place, with sidewalks, buildings and transportation, but introducing surreal and unexpected aspects. For example, the buildings are laid out on a regular grid over a flat, white ground that extends to the infinite, like the emoji set on your phone. Everything looks like a simple emoji – the characters, the sets, the graphics – but is rich in detail when needed.”

In the end, bringing the Emoji world to animated life boiled down to teamwork. “Teamwork with other departments early on in the development of the look of the movie was crucial to achieve the look we were looking for,” Zaragoza concludes, citing close work with James Williams, the Layout Supervisor on The Emoji Movie, “to make sure that the camera language and design are on the same page to tell the story,” while “David Smith, Sony Pictures Imageworks’ VFX Supervisor, helped us to find the best look for the worlds and these unique, electronic-expression emojis.”

All photos copyright © 2017 by Sony Pictures Animation, Columbia Pictures and CTMG, Inc. All Rights Reserved.


VFX VAULT

ROBOCOP CRASHES 30: ED-209 STILL A STOP-MOTION HERO By IAN FAILES

TOP LEFT: Stop-motion involved frame-by-frame manipulation of the ED-209 puppet. Tippett Studio would go on to animate even more intricate stop-motion puppets and elaborate scenes for RoboCop 2 and RoboCop 3, just as the digital revolution in visual effects was taking hold. TOP RIGHT: Phil Tippett, VES, sets up a stop-motion shot with the ED-209 puppet. As part of the character’s tight budget, the scenes were designed so that the full-scale model was in ‘powered down’ mode. From there, the stop-motion puppet would come to life, then return to the full-scale pose, enabling intercutting between live action and miniature effects. BOTTOM: Tippett articulates the ED-209 stop-motion puppet in front of a rear-projection screen.

In 1987, fierce creatures and hulking robots seen on film were still mostly the domain of either full-sized practical effects or stop-motion animated miniature puppets. When director Paul Verhoeven needed menacing enforcement droid ED-209 to wreak havoc in his future dystopia, RoboCop, which celebrates its 30th anniversary in July 2017, he capitalized on both techniques. The scenes with ED-209, while limited, have become some of the most memorable in VFX film history, partly because of the initial ferocity of the robot’s actions, and partly owing to its comical turns – navigating a staircase, and its eventual demise. Tippett Studio, led by visual effects supervisors Phil Tippett, VES, and Craig Hayes, devised the ED-209 effects. The studio built both the full-sized model of this Omni Consumer Products (OCP) weapon – a mostly static set piece that was around seven feet tall and weighed 300 pounds – and a matching stop-motion version capable of much more articulation.

RoboCop depicted a future Detroit, Michigan, that was crime-ridden and run by a mega-corporation. The company had developed a huge crime-fighting robot that turned out to have a dangerous glitch, then tried to win back the public’s favor by constructing a newer robot using the body of a slain policeman. The film starred Peter Weller and Nancy Allen. There were several sequels, and the film was rebooted in 2014 with director José Padilha and actors Joel Kinnaman, Gary Oldman and Michael Keaton. The original is still regarded as the crown of the RoboCop progression.

NICE GUNS

RoboCop screenwriter Edward Neumeier first thought up the ED-209 character after seeing a Japanese model-kit design that had arms with enormous guns. “That was the idea from the beginning,” says Tippett about his studio’s contribution to the film. “For the design of ED-209, I happened to be at a barbeque and I ran into this kid who was amazing. He was barely 20, his name was Craig Hayes, and he worked for this guy who did prototypes for military helmets. So I hired Craig to come on board and he and I would go down to Los Angeles to see director Paul Verhoeven and develop the design for the ED-209.” Verhoeven pushed the design heavily towards a non-anthropomorphic feel, with legs that executed a ‘Z’ configuration and a head that resembled a radar sensor. Tippett and Hayes also implemented a grill shape for ED-209’s mouth. “For a while, something was wrong with that, but we couldn’t figure it out,” says Tippett. “And then I realized: Hey, you know what, turn the mouth upside down because he’s smiling. And so we turned the mouth upside down and it kind of gave him a more ominous look.”

ED-209 ON THE MOVE

In the years prior to RoboCop, Tippett had pioneered a stop-motion animation technique called Go-Motion, which, via the use of motion-controlled articulators, allowed a puppet to move while the shutter of the camera was open and therefore create motion blur. The idea was to avoid the stuttery feel that sometimes came with traditional stop-motion. But it was not used on RoboCop, for two simple reasons. “Well, we couldn’t afford Go-Motion,” says Tippett. “So that was that. And, you know, in terms of believability, robots tend to lend themselves to the stop-motion process.” Instead, a relatively traditional rear-projection set-up was employed for most of the ED-209 shots in RoboCop. That way, scenes with the droid could be effectively composited in-camera, and budgeted accordingly. “We’d shoot a wedge test every day and that would go to the lab the next day, and we’d come back and make corrections,” outlines Tippett. “And depending on the level of complexity, it would be like two, maybe three, days of set-up time and then shoot. And we pretty much got everything first take, so that was from three to four days per set-up.” In terms of animating the puppet, Tippett shared duties mostly with Randy Dutra and Harry Walton. “We had to do a few rehearsal shots because nothing organic could walk like that, so we had to figure out how to make it believable. Not only was it Z-shaped but its head was driven by a big lead screw that moved its legs up and down. It was a very non-human kind of thing.”
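The blur Go-Motion produced can be understood as exposure-time integration: because the puppet moves while the shutter is open, each frame accumulates many in-between positions. A minimal digital analogue – a sketch of the principle, not Tippett’s motion-control rig – averages sub-frame renders of a moving square into one streaked frame:

# Compare one frozen exposure (stop-motion) with an exposure that
# integrates 16 in-between positions (the Go-Motion idea). NumPy only.
import numpy as np

HEIGHT, WIDTH, SUBFRAMES = 64, 64, 16

def render(x):
    # One crisp exposure: a bright 8x8 square at horizontal position x.
    frame = np.zeros((HEIGHT, WIDTH))
    frame[28:36, x:x + 8] = 1.0
    return frame

crisp = render(20)                                            # frozen pose
blurred = np.mean([render(20 + dx) for dx in range(SUBFRAMES)], axis=0)

print("crisp peak:", crisp.max(), "blurred peak:", round(float(blurred.max()), 3))

The averaged frame smears the square along its travel, which is exactly the streaking a real shutter records from a continuously moving subject – and exactly what a puppet posed motionless between exposures can never produce.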

COMIC RELIEF

Perhaps ED-209’s most memorable scene features the droid chasing RoboCop to a staircase – a structure it fails to navigate before tumbling down the stairs. Close-ups of ED-209’s head and tentative toes were stop-motion animated on a staircase set that matched the live-action plates, and the view of its fall was achieved with a miniature shot at 96 frames per second. The humor in that sequence is arguably upstaged, however, by the one in which RoboCop destroys ED-209 with a powerful gun. Full-scale pyrotechnics enabled the destruction, while stop-motion was then used to show just the remaining legs of the droid wobbling for a few seconds before collapsing. “Just before we were ready to go on that shot,” recalls Tippett, “the producer Jon Davison called me up and said, ‘Hey, the movie’s gotten too serious. Do something funny with him.’ So I found some little whirligig model parts, little spinning fans and things like that, and I just played it not as a monster, but as if he was a dysfunctional drunk. I timed the shot out, and then against a piece of black velvet I shot a smoke element that moved in concert with what I was going to do stop-motion wise, and shot that on film first and then back-wound the camera and did an in-camera kind of a trick where I double-exposed the smoke over the stop-motion.”
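The 96-frames-per-second figure follows standard miniature practice: footage overcranked at capture and projected at 24 fps plays back slowed by the ratio of the two rates, and a long-standing rule of thumb ties camera speed to the square root of the miniature’s scale (the rule is general practice, not a figure quoted by the production):

\[ \text{slow-down} = \frac{f_{\text{capture}}}{f_{\text{playback}}} = \frac{96\ \text{fps}}{24\ \text{fps}} = 4\times, \qquad f_{\text{capture}} \approx 24\sqrt{\frac{\text{full-size length}}{\text{miniature length}}}\ \text{fps} \]

At 4x, the falling puppet reads with the slower, heavier timing of an object roughly 16 times the miniature’s size, since the square root of 16 is 4.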

TOP TO BOTTOM: ED-209’s design was based on Z-shaped legs and a large, radar-like head, along with a grill mouth and large machine guns for arms. ED-209 designer Craig Hayes makes adjustments to the full-sized model. Its head could move from side to side, but major movements were left to the stop-motion puppet. The stop-motion puppet is shown set up for a boardroom sequence from the film.


TOP LEFT: Model pieces for the ED-209 puppet. TOP RIGHT: Tippett (left) and Craig Hayes consider the full-scale ED-209. BOTTOM LEFT: Craig Hayes with the full-scale ED-209. BOTTOM RIGHT: ED-209’s full-scale model was around seven feet tall and weighed 300 pounds.



REMEMBERING ROBOCOP

Tippett has fond memories of the film’s production, and of its success upon release three decades ago. “The first time we saw everything together was at the cast and crew screening in Hollywood at the Academy Theater,” he says. “We knew that we’d worked on a really terrific movie. Magazines started picking it up, and Newsweek wrote a big cover article on it. It was one of the big smash hits of the summer. You’re always lucky when you get involved in something that turns out good. Because we work on a lot of things that aren’t,” he concludes.

TOP LEFT: A sketch of the ED-209 design. TOP RIGHT: A wood-frame mock-up of ED-209. BOTTOM LEFT: Close-up on ED-209’s gun. RoboCop screenwriter Edward Neumeier had seen a Japanese model kit with huge guns for arms that inspired the droid. BOTTOM RIGHT: Tippett checks the proportions on the full-scale ED-209 model.

LEFT: Storyboards made by Tippett Studio show ED-209 being introduced for the boardroom sequence in the film.



COMPANY PROFILE

ATOMIC FICTION: RIDING HIGH ON THE CLOUD By PAULA PARISI

TOP: Allied (Photo courtesy of Atomic Fiction and copyright © 2016 Paramount Pictures. All Rights Reserved.)

Deadpool (Photos courtesy of Atomic Fiction and copyright © 2016 Twentieth Century Fox/Marvel. All Rights Reserved.)

Kevin Baillie, Co-Founder/CEO, Atomic Fiction

Ryan Tudhope, Co-Founder/Visual Effects Supervisor, Atomic Fiction


Marc Sadeghi, President of Visual Effects, Atomic Fiction

Whether it’s taking viewers to fantastic environments or creating character effects, Atomic Fiction has established itself as a go-to house that can get the job done. Since opening for business seven years ago, the company has worked with directors including George Lucas, Robert Zemeckis, J.J. Abrams, Tim Miller and Michael Bay. While compiling an impressive reel in the service of those filmmakers, the company managed to pull off a pretty neat stunt of its own: scaling up to enterprise strength while retaining a boutique feel. “We see Atomic Fiction as a framework for enabling ambition,” Visual Effects Supervisor Ryan Tudhope says. “The journey is just as valued as the destination.”

It’s been a brisk ride for co-founders Tudhope and Kevin Baillie, who launched the company on desktops in their living room in 2010. Atomic Fiction now employs more than 200 digital artists, working from studios in Oakland and Montreal on some of the year’s biggest releases: Pirates of the Caribbean: Dead Men Tell No Tales for Disney, Alien: Covenant for Fox, Paramount’s Transformers: The Last Knight, and the fall’s highly-anticipated Blade Runner 2049, set for Oct. 6 release from Warner Bros. Atomic Fiction also has a Los Angeles business office, run by President of Visual Effects Marc Sadeghi, who juggles feature-film projects with TV projects, including AMC’s The Walking Dead and Netflix’s Stranger Things.

To expand quickly, the company relied on its fourth office location: the cloud. Atomic engineered its own cloud-based rendering and project-management platform, Conductor, announcing in 2014 that it would make the technology available for third-party subscription. Companies big and small can use Conductor to cost-effectively access a nearly limitless datacenter. By the time Conductor rolled out in 2015, the system had already racked up more than four million core rendering hours on shows like Paramount’s Transformers: Age of Extinction and Teenage Mutant Ninja Turtles, Zemeckis’ The Walk for Sony, and the Fox TV series Cosmos: A Spacetime Odyssey with Neil deGrasse Tyson. With Conductor fully implemented, Atomic could scale its infrastructure as needed, drawing on the same resources as the largest studios in the world.
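Conductor’s actual client interface isn’t described here, so as a generic, hypothetical sketch of what elastic cloud rendering means in practice – every name, path and command below is illustrative, not Conductor’s API – consider a farm whose worker count is sized to the shot rather than to owned hardware:

# Fan a frame range out to a pool of workers; on a real cloud farm each
# task runs on a remote instance provisioned on demand.
from concurrent.futures import ThreadPoolExecutor
import subprocess

FRAMES = range(1001, 1101)            # shot frame range (placeholder)
SCENE = "shot_010_lighting.ass"       # baked scene description (placeholder)

def render_frame(frame):
    # Stand-in for one cloud task; locally we just echo the command.
    return subprocess.call(["echo", f"render {SCENE} frame {frame}"])

# The elastic part: the pool is sized to demand, not to machines you own.
with ThreadPoolExecutor(max_workers=32) as farm:
    results = list(farm.map(render_frame, FRAMES))

print("failed frames:", [f for f, r in zip(FRAMES, results) if r != 0])

The scheduling, dependency-tracking and metered-billing layers a service like Conductor adds on top of this pattern are what turn it into “a nearly limitless datacenter.”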

TOP TO BOTTOM: Game of Thrones, Season 5, Episode 10. CG crowd work of the “Dothraki Horde.” (Photo courtesy of Atomic Fiction and copyright © 2015 HBO. All Rights Reserved.) Allied (Photo courtesy of Atomic Fiction and copyright © 2016 Paramount Pictures. All Rights Reserved.) Boardwalk Empire (Photo courtesy of Atomic Fiction and copyright © 2014 HBO Inc. All Rights Reserved.)


Star Trek Beyond. (Photos copyright © 2016 Paramount Pictures. All Rights Reserved.)


Having brought Sadeghi aboard in 2014 with an eye toward expansion, the company was poised for explosive growth, borne out by a flagship 2016, its work showcased in multiple $100-million-plus U.S. box-office earners: Fox’s Deadpool, Universal’s The Huntsman: Winter’s War, Disney’s Rogue One, Paramount’s Star Trek Beyond and STX’s Bad Moms. Atomic spun off Conductor Technologies, and in December announced an undisclosed amount of Series A funding from investors including Autodesk, officially launching Baillie, its CEO, and board member Tudhope into the ranks of Silicon Valley entrepreneurs.

The two have known each other since their days as digitally-obsessed high school students in Seattle, where at age 16 they got their first jobs, working for Microsoft. That helped get the attention of George Lucas, who hired them at Lucasfilm on graduation, where they did previsualization for Star Wars: Episode I – The Phantom Menace. From there, they went to work at the Orphanage and then ImageMovers Digital, the studio formed by Zemeckis and later absorbed by Disney. Atomic bootstrapped through its first year with a handful of shots on Adam Sandler’s Just Go With It and Transformers: Dark of the Moon, both out in 2011, as well as heart-stopping facial deconstruction on disfigured war veteran Richard Harrow for HBO’s Boardwalk Empire. It was the combination of Zemeckis’ Flight (2012) followed by Star Trek Into Darkness (2013) that allowed Atomic Fiction to quickly elbow its way into the top tier. The propensity of A-listers to send repeat business their way, combined with an aptitude for technology, has kept them there.

“We see every show as an opportunity to up the ante for our clients, both creatively and technically,” Baillie says from Germany, where he was speaking at FMX 2017 in Stuttgart. “We’re steadfast about using the challenges of each project to improve our pipeline long-term, rather than crafting one-off solutions for each film. The all-CG shots of the tropical island town in Disney’s Pirates of the Caribbean: Dead Men Tell No Tales advanced our digital foliage and crowd workflows, while Alien: Covenant yielded big steps forward for our 2.5D character augmentation techniques.”

A lot of it comes down to trust, says Sadeghi, who worked with Tudhope and Baillie for 10 years at the Orphanage before joining Atomic. (In between, he was co-owner of the Boston-based Zero VFX, another company that invented its own cloud-based rendering solution, Zync, later sold to Google.) “It helps that we can offer Quebec tax incentives through our Montreal facility, but I think at the end of the day most studios, producers or directors are willing to pay a little more for the peace of mind that comes from knowing we’re going to deliver incredible work, by deadline.”

Throughput has been turbocharged by Conductor, Sadeghi says. “It really streamlines our process; it’s completely scalable and cuts our rendering time significantly, freeing our teams to focus on creating excellent shots without worry over whether the work can be rendered in time.” It also allows the firm to be nimble, jumping on shows quickly, without having to expand infrastructure. “We don’t need to worry about the horsepower to get things done,” Sadeghi says, sounding like a true cloud cowboy.

Deadpool. Atomic Fiction was handed an entire motorcycle chase sequence that was originally to have been shot live. (Photos courtesy of Atomic Fiction and copyright © 2016 Twentieth Century Fox/Marvel. All Rights Reserved.)


TOP LEFT and RIGHT: Deadpool (Photos courtesy of Atomic Fiction and copyright © 2016 Twentieth Century Fox/Marvel. All Rights Reserved.) BOTTOM LEFT and RIGHT: Allied (Photos courtesy of Atomic Fiction and copyright © 2016 Paramount Pictures. All Rights Reserved.)

“We’re steadfast about using the challenges of each project to improve our pipeline long-term, rather than crafting one-off solutions for each film. The all-CG shots of the tropical island town in Disney’s Pirates of the Caribbean: Dead Men Tell No Tales advanced our digital foliage and crowd workflows, while Alien: Covenant yielded big steps forward for our 2.5D character augmentation techniques.” —Kevin Baillie, Co-Founder, Atomic Fiction


The company’s in-house supervisors work closely with each show supervisor to make creative decisions and come up with solutions. There have even been times when Atomic has taken on supervisory duties for the entire film, managing the other VFX vendors, as Baillie did for Zemeckis’ The Walk and Allied. For Allied, Atomic used Technicolor’s cloud-based Pulse system to manage the live footage trafficked from the set. “We used Technicolor’s Pulse to automatically generate EXRs from camera footage shot on set, and to store the final VFX results upon completion. From there on out, the final imagery can be sent to DI for final coloring,” Baillie says, noting that the rendering was done using Conductor.
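The EXR step Baillie describes is, at its core, a conversion from camera plates into the high-dynamic-range files VFX pipelines expect. A minimal sketch of that conversion using the open-source OpenImageIO library – an assumption chosen for illustration, since Pulse’s internals aren’t public; the file paths are placeholders:

# Convert a camera plate (e.g., DPX) to a half-float, ZIP-compressed EXR.
import OpenImageIO as oiio

def plate_to_exr(src_path, dst_path):
    buf = oiio.ImageBuf(src_path)                  # read the source plate
    if buf.has_error:
        print("read failed:", buf.geterror())
        return False
    buf.specmod().attribute("compression", "zip")  # lossless and VFX-friendly
    return buf.write(dst_path, "half")             # 16-bit float pixels

plate_to_exr("A003_C012_0815.dpx", "A003_C012_0815.exr")

A production system wraps this step in watch folders, color management and shot-tracking metadata – the automation a platform like Pulse supplies.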

While good progress has been made in format standardization, with EXR, Alembic and USD, one thing that hasn’t been standardized is workflow. The 70 shots Tudhope supervised for Ghost in the Shell saw the Atomic team working closely with lead shop MPC, which created some scaling challenges. “We set new records for the number of dependencies we can handle while rendering in the cloud,” Baillie says, explaining that Atomic relied on the staff at Conductor Technologies for help. “We worked on the digital city scenes, and if we had done them ourselves there would have been 20,000-30,000 files – things like cars, holograms, buildings. MPC, because of their workflow, handed off 120,000 files. As a result of that project, now we’re equipped to handle hundreds of thousands of assets in the cloud.” Conductor is scheduled for commercial release at SIGGRAPH.

The fact that most productions continue to split work among two or more facilities has put increased emphasis on resource sharing and project management. “Our industry is becoming more intertwined,” Sadeghi says. “For the most part, the days of proprietary pipelines are over,” he predicts, as visual effects studios are required to share assets, shots and know-how for the greater good of reaching a director’s vision. “The atmosphere is highly collaborative, which is great, because we all just want to create beautiful work,” he says.

Effects shots are increasing in number, as even conventional narratives rely on their accuracy and efficiency, and opportunities for work continue to expand with virtual reality and augmented reality just coming into their own. Even concert touring is getting in on the action: This year, Atomic Fiction created stage visuals for the band Empire of the Sun to use at the Coachella Festival.

Atomic plans to grow with the industry, while always remaining true to its founding principles. “From a technology standpoint, we’ve taken our experience with big-shop CG pipelines, systems architecture, asset management, production tracking, etc. and crossbred it with a small-shop ethos to create an environment that I think is pretty unique,” Sadeghi says. Tudhope seconds that: “This idea of a ‘big-shop capability, small-shop vibe’ is core to our values. People come here for the culture, to work with friends and build something greater.”

Cloud computing, all three principals agree, is going to really change the game as to how things are built. Baillie envisions a not-too-distant future in which all processing – not just intensive rendering – is done remotely, the pixels streamed to the desktop. Looking ahead, Baillie says the new projects are “geared towards advancing the CG character pipeline we leveraged for Deadpool.” For that film, Atomic was handed an entire motorcycle chase sequence that was originally to have been shot live. “The car crash and stunts were all going to be shot for real, but when the production realized that to keep the stunts safe was going to compromise the visuals, and also that to make the background unrecognizable as a specific city would require a lot of work, they decided to just move the entire sequence into the digital realm. But the best compliment we get is that people do think it was filmed,” Baillie says.

The filmmaking pipeline is “merging more and more with visual effects to the point that it’s often not clear how the two are separate,” Sadeghi says. “In that regard, visual effects feels less like a ‘post’ phase and more like part of the filmmaking process.” A recent addition to the Atomic Fiction family, Animation Department Supervisor Marc Chu, will lead that charge in character work, jumping in on Fox’s Shane Black-directed remake, The Predator, as well as the Stranger Things series and Zemeckis’ upcoming The Women of Marwen. Regardless of where things are headed, Sadeghi says, “It all comes back to our people. And right now our talent recruitment efforts are pretty robust.”

“(Conductor) really streamlines our process; it’s completely scalable and cuts our rendering time significantly, freeing our teams to focus on creating excellent shots without worry over whether the work can be rendered in time. … We don’t need to worry about the horsepower to get things done.” —Marc Sadeghi, President of Visual Effects, Atomic Fiction

TOP and BOTTOM: Deadpool (Photos courtesy of Atomic Fiction and copyright © 2016 Twentieth Century Fox/Marvel. All Rights Reserved.)



COMPANY PROFILE

TOP LEFT and RIGHT: Deadpool (Photos courtesy of Atomic Fiction and copyright © 2016 Twentieth Century Fox/Marvel. All Rights Reserved.) BOTTOM LEFT and RIGHT: Allied (Photos courtesy of Atomic Fiction and copyright © 2016 Paramount Pictures. All Rights Reserved.)

“We’re steadfast about using the challenges of each project to improve our pipeline long-term, rather than crafting one-off solutions for each film. The all-CG shots of the tropical island town in Disney’s Pirates of the Caribbean: Dead Men Tell No Tales advanced our digital foliage and crowd workflows, while Alien: Covenant yielded big steps forward for our 2.5D character augmentation techniques.” —Kevin Baillie, Co-Founder, Atomic Fiction




FILM

ANIMATING KEY ASSETS

“THE MOST AWESOME TITLE SEQUENCE OF ALL TIME” By ED OCHS

TOP LEFT: Rocket (Photo credit: Film Frame. Copyright © 2017 Marvel Studios. All Rights Reserved.) TOP RIGHT: Baby Groot (Photo credit: Film Frame. Copyright © 2017 Marvel Studios. All Rights Reserved.) BELOW: Rocket and Groot (Photo credit: Film Frame. Copyright © 2017 Marvel Studios. All Rights Reserved.)


The mission was clear: the opening sequence for Guardians of the Galaxy Vol. 2 had to be one for the ages. That’s what it said in the script. Nothing short of mind-blowing would do. From the first frame of the film to the end of the opening credits, Framestore’s VFX team faced an onslaught of non-stop, complex challenges and shot-by-shot detail, knowing the final result had to be spectacular and seamless.

“The set was made of gold, which we replaced entirely in every shot,” says Jonathan Fawkner, VFX Supervisor. “Every plate element, every furry creature, every ‘scattery’ skin alien, every explosion, weapon-blast energy effect and all debris needed to be reflected in the set. There was nowhere to hide any of the usual compositing tricks. Every element needed to be traced against the set and reflected in it, and the sheer number of elements and layers made it a very challenging sequence to comp and to render.”

Right off the bat, Framestore’s army of resources was called into action. With the heavy workload and personnel demands came the challenge of maintaining quality while ensuring smooth handoffs. Marvel also had its requirements to factor in. “There was a single five-minute-long shot on which the titles were to be placed alongside Baby Groot dancing. Obviously, we could not put over 4,000 frames with a single individual artist,” Fawkner says. “A huge amount of care needed to be taken to make sure that the work is cleverly shared and artists hand off in a sensible way. Knowing the way that Marvel wants to keep options open to the end, we had to be careful about layering to offer the most options, but the burden of that falls on comp.”

The high-octane flurry of live action, CG, animation, effects and colors fulfilled the director’s mandate for a super-charged opening that never quits. It also confirmed that a mark of great VFX lies in the undetectable unification of diverse and complex elements. “We had 160 individual layers for any one shot, and when you consider that reflections are all de-focused separately from the surface they are reflected in, the complexity is hugely magnified. And that is to say nothing of the animation, the creature FX of fur and skin, the destruction and battle effects, the continuous camera, and all the rainbow colors we would cram into every frame to make it ‘the most awesome title sequence of all time’, which was, frankly, the only guide we had from the script.”
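For readers unfamiliar with deep layer stacks, the sketch below shows the basic mechanics of compositing pre-rendered layers back to front with a premultiplied “over” operation, using OpenImageIO’s Python bindings. The layer names and file paths are hypothetical, and a production comp like the one described above adds per-layer defocus, grades and holdouts that are omitted here.

```python
import OpenImageIO as oiio

# Back-to-front order: the set render first, FX and characters on top.
# These paths are illustrative placeholders, not real production files.
layer_paths = [
    "renders/gold_set.exr",
    "renders/reflections.exr",
    "renders/debris_fx.exr",
    "renders/baby_groot.exr",
]

comp = oiio.ImageBuf(layer_paths[0])
for path in layer_paths[1:]:
    layer = oiio.ImageBuf(path)
    # Premultiplied A-over-B: each successive layer goes over the stack.
    comp = oiio.ImageBufAlgo.over(layer, comp)

if comp.has_error:
    raise RuntimeError(comp.geterror())
comp.write("comp/shot010_comp_v001.exr")
```

With 160 layers per shot, the same loop simply runs over a much longer list; the real cost is in what each layer carries, not in the stacking itself.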

The opening sequence was just as challenging from the animation point of view. “We had to figure out how we were going to come up with a performance that’s coherent with Baby Groot’s character, but also make it funny and entertaining,” says Arslan Elver, Animation Supervisor. “The video reference of James Gunn dancing was fantastic as a starting point. We then pushed it further to be able to accommodate crazy camera movements and distances to cover, which was a challenge for such a small character. We split the sequence into 11 pieces for one continuous camera. The Abilisk also proved itself a tough character to animate with its eight tentacles.”

Elver explains another big challenge: “We used an animation-driven approach where we had control over hundreds of spaceships within a swarm. It was tricky at times to make sure we got interesting silhouettes out of the sovereign omni-crafts. Also, animating spaceships and making sure they moved in a dynamic but realistic way was tricky to adapt to.”

Entrusted with the fate of the studio’s hugely popular animation-driven characters, Elver acknowledges an obligation to preserve their arc through consistency, and he feels the team met and exceeded that bar. “As the company that created the assets of Baby Groot and Rocket, our overall challenge was to keep the performances of these two main characters – show-stealers, if you will – consistent. I think we managed to keep the performance solid across our sequences by carefully studying the footage we were given of Bradley Cooper and Sean Gunn, and by shooting our own reference inspired by these performances. We are proud of our work being 100% keyframe animation.”

The opening sequence called for cascades of effects. “The environment for our opening sequence included a tumultuous, roiling storm,” Fawkner adds. “We played it as if shot overcranked, emboldened by the license afforded us by being set on an alien planet. But plate photography was not an option. The camera soared up toward the low cloud base at times, and the shots were very long, meaning we needed 4,000 frames of high-resolution, evolving and traveling 3D clouds. We developed a system that enabled us to model clouds as low-resolution geo, and to art-direct the placement and density, animate their shapes and translate them across the sky, all while lighting them with a physically accurate blue sky and bright sun, voxelizing at render time. Lightning bolts were volumetric, meaning we could obscure and scatter their light and illuminate beautiful cloud structures within an ever-changing, evolving storm that could fly over our heads or that we could fly right through.”

Guardians’ top unreal characters continue to grow more real, in greater detail. “The solid performances of our two main CG characters were based on a hugely collaborative team effort,” Elver reports. “I think without using mocap or without labeling things as mocap, you can still get great performances out of animation. We worked hard with Rocket to update his fur, shapes, shaders and cloth sim. He looks so real, you want to reach out and touch him.”
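As a toy illustration of the voxelization idea Fawkner describes – low-resolution proxy shapes rasterized into an art-directable density volume – here is a NumPy sketch that fills a grid from sphere proxies with a smooth falloff. It is a simplified stand-in for Framestore’s system, which voxelized modeled cloud geometry at render time; the proxy list, resolution and falloff are all assumptions for the example.

```python
import numpy as np

RES = 64  # grid resolution per axis (production grids are far larger)

# Art-directable proxies: (center xyz in grid units, radius, density mult).
proxies = [
    ((20.0, 40.0, 32.0), 12.0, 0.8),
    ((44.0, 44.0, 30.0), 16.0, 1.0),
]

# Voxel-center coordinates for the whole grid.
x, y, z = np.meshgrid(*[np.arange(RES) + 0.5] * 3, indexing="ij")
density = np.zeros((RES, RES, RES), dtype=np.float32)

for (cx, cy, cz), radius, mult in proxies:
    dist = np.sqrt((x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2)
    # Smooth falloff from the proxy surface toward its center.
    falloff = np.clip(1.0 - dist / radius, 0.0, 1.0)
    density = np.maximum(density, mult * falloff)

print("occupied voxels:", int((density > 0.01).sum()))
```

Animating the proxy centers and radii over 4,000 frames is what turns a static grid like this into an evolving, traveling storm.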

TOP to BOTTOM: Baby Groot, finger on the button. (Photo credit: Film Frame. Copyright © 2017 Marvel Studios. All Rights Reserved.) The all-gold set. (Photo credit: Film Frame. Copyright © 2017 Marvel Studios. All Rights Reserved.) Arslan Elver, Animation Supervisor. Jonathan Fawkner, VFX Supervisor.





[ VES SECTION PROFILE: BAY AREA ]

Original VES Section Celebrates Vibrant, Thriving VFX Community By NAOMI GOLDMAN


As the Visual Effects Society celebrates its milestone 20th anniversary, much of its growth and diversification is due to its network of Sections, whose members lend their expertise and entrepreneurship to benefit visual effects practitioners in their region – while advancing the Society and industry worldwide. The Bay Area Section in Northern California is the VES’s oldest Section, founded in 2006 and now boasting almost 500 members. It is enormously vibrant thanks to the area’s rich history in visual effects and the impassioned leadership at its helm.

Often called “Digital Hollywood,” the Bay Area is the birthplace of some of the most recognizable brands in the visual effects industry, including Industrial Light & Magic, Pixar, Tippett Studio, Telltale Games and Atomic Fiction, and, as home to Silicon Valley, the region leads the way in technological innovation.

“Another element lending to the success of the dynamic Section is the desire for and commitment to community,” says Bay Area Section Chair Lisa Cooke. “A lot of people are working in a freelance capacity and are dispersed, and so the Bay Area Section has taken on the responsibility and opportunity to bring together the VFX community through our events, outreach and education. There was a real hunger for this centralized hub to exchange ideas, network and build relationships, and we’re immensely proud to play that role of convener.”

The Bay Area Section enjoys strong support from area studios, local corporations and colleges. That business support has been elevated this year with the launch of the Section’s Diamond Sponsorship Program, in which companies sign on as sponsors for an entire calendar year and all of its programs. The 2017 Diamond Sponsors – Lytro, ASG Advanced Systems Group and Quantum – serve as integral players in the Bay Area VES community. The Section also nurtures mutually beneficial relationships with other guilds and associations in the region that partner on events, share resources and reciprocal discounts, and foster valuable cross-craft

All photos courtesy of Jason Andrew Photography. OPPOSITE TOP: VES Bay Area Section Board and Summit Keynote Speaker Lynwen Brennan, EVP and General Manager, Lucasfilm, ILM and Skywalker Sound, flanked by VFX colleagues. From left to right: David Valentin, Scott Smith, Laurie Blavin, Jewels Rottiers, Lisa Cooke, Brennan, Kim Lavery, VES, Maggie Oh, Lance Thornton and Corey Rosen. OPPOSITE BOTTOM LEFT: Engaged attendees at the Summit’s interactive roundtable session. OPPOSITE BOTTOM RIGHT: Keynote speaker Rob Legato, ASC, addressing the Summit audience. TOP: The excited capacity Summit crowd at Pixar Animation Studios.

connections, including The American Society of Cinematographers (ASC), Society of Motion Picture & Television Engineers (SMPTE), Women in Animation, the Producers Guild of America (PGA), Directors Guild of America (DGA) and SIGGRAPH. The launch of the Bay Area Summit in 2016 was core to the Section achieving its outreach and education goals. “It’s exciting to be able to bring such great minds together to exchange ideas as we look forward to the limitless possibilities of the artistic and technological innovations in our unique industry,” says Cooke. The collaborative endeavor has quickly become

BOTTOM: The VES Bay Area Summit had numerous industry supporters.






TOP LEFT: A VES 20th anniversary t-shirt grabs the spotlight. TOP RIGHT: An attendee samples a VR experience. BOTTOM: VR on display at the VES Bay Area Summit.


“A lot of people are working in a freelance capacity and are dispersed, and so the Bay Area Section has taken on the responsibility and opportunity to bring together the VFX community through our events, outreach and education.” —Lisa Cooke, Bay Area Section Chair

a highly-anticipated annual forum for VFX professionals to come together and innovate. In May 2016, the Section kicked off this ambitious undertaking and hosted its first Bay Area Summit, entitled “Film & Beyond: The Changing Landscape of Entertainment.” Held at Pixar Animation Studios, the Summit featured Karen Dufilho, Executive Producer at Google, as Keynote Speaker, and included speakers and roundtable moderators from The Foundry, ILM, Dolby, Telltale, Inc., Pixar, Jaunt, Tippett Studio and Whamix. “Highlighting new technologies in motion pictures, television, online content, games, apps, virtual reality, scientific research and ‘beyond,’” Cooke continues, “the Summit honored and celebrated the thriving visual effects community, which we are so proud of supporting here in Northern California.” The event was deemed an instant success by attendees, sponsors and the community-at-large.

Back by popular demand, the 2nd Annual Bay Area Summit convened in May 2017, once again under the direction of Summit Chair Kim Lavery, VES, and an exemplary team of hard-working professionals. Its theme, “Film & Beyond: The Next Twenty Years,” was a nod to what lies ahead for both the global community and the VES. Held again at Pixar Animation Studios, the Summit brought together local and global experts from the fields of animation, broadcast and film, VFX, video games and technology to reflect on the industry’s history and provide insights on where to go next. The Summit took pride in the region’s leadership in the many modes in which VFX can affect the world, tracing its roots in hand-made, traditional visual effects craftsmanship through its expansion into digital tools and technologies, feature-length animation and cinematic videogame content.

The Summit featured impressive keynote speakers Lynwen Brennan, EVP and General Manager, Lucasfilm, ILM and Skywalker Sound, and Robert Legato, ASC, acclaimed Visual Effects Supervisor, 2nd Unit Director and Director. The Summit also facilitated a series of interactive roundtable discussions on topics ranging from “Filming for VR,” “Modern VFX Production in the Cloud” and “The Future of Machine Learning in Entertainment” to “Interactive Characters in AR/VR,” “Merging Realms: Art, Science and Social Awareness” and “Crossover from Live-Action VFX Films to AR/VR/Multi-Media.”

The Section Board is very active, with each member taking on meaningful responsibilities and having the freedom to come up with new ways to serve the membership. Every Board member brings a unique and important talent to the Society. The team is focused on events that bring the membership together to network, socialize and learn. Its anchor events, in addition to the Summit, are the annual Summer BBQ and Holiday Party. Throughout the year, the Section also offers a full roster of screenings, a museum series, a workshop series, a speaker series and an outreach/mixer series peppered in between. Leadership is looking forward to branching out into mentorship programs and developing its VFX for Good Committee to provide members with opportunities to use their talents to give back to the community.

“This is an important time to elevate visual effects as a respected art form and critical contributor to the business – and not just have it characterized as magic. The VES is 20 years old this year, and in that time our industry has seen incredible change – always remarkable, often exciting and sometimes very challenging. Our goal is to honor the past, celebrate the present and embrace the future. We try to help our membership understand and prepare for our industry’s rapidly changing landscape, and we’re excited for all that comes next,” concludes Cooke.

TOP LEFT: Keynoter Rob Legato, ASC, is flanked by Lisa Cooke and Kim Lavery, VES. TOP RIGHT: Life-size renditions of Pixar characters inspire all kinds of behavior. BOTTOM: Buzz Lightyear and Sheriff Woody line up for their badges.





PREVIS

A VISIONEER’S GUIDE TO THE GALAXY By ED OCHS

TOP LEFT and RIGHT: The Third Floor, headed by Previs/Postvis Supervisor James Baker, created all of the previs, postvis and techvis for Guardians of the Galaxy Vol. 2, helping to visualize the action and plan technical filming. (Image copyright © 2017 Marvel and courtesy of The Third Floor, Inc.) BOTTOM LEFT and RIGHT: The Third Floor’s work for Guardians of the Galaxy Vol. 2 spanned scenes across the movie, including the opening credits, Eclector Escape, Space Chase and Final Battle. (Image copyright © 2017 Marvel and courtesy of The Third Floor, Inc.)


In the latest evolution of the expanding Marvel universe, The Third Floor created all of the previs, postvis and techvis for Guardians of the Galaxy Vol. 2. The project marks the eleventh Marvel production The Third Floor has contributed to since 2010. The Third Floor’s Supervisor, James Baker, who led the company’s earlier work on Guardians of the Galaxy, headed the visualization team, collaborating with director James Gunn, Visual Effects Supervisor Christopher Townsend, Director of Photography Henry Braham, BSC, Editors Craig Wood and Fred Raskin, and Supervising Art Director Ramsey Avery. Previs was used to map the action and plan filming across the movie for sequences like the opening credits, Space Chase and Final Battle, and especially for the shots with completely digital characters.

“James (Gunn) embraces the previs and postvis process and uses it to inform the production shoot and the final vendor work,” Baker states. “He has a very clear idea of what he wants with each shot; in fact he thumbnails the entire movie. Sometimes these go to storyboard artists and then to us, but we also worked straight from his thumbnails and a pitch. It’s great to work with a director who has such a clear vision.”

Communications were constant, constructive and rapid-fire. “There was a lot of exchange between James, Chris Townsend and myself in visualizing the sequences; it was very interactive,” Baker recounts. “A lot of times in post during the editing process, James would call us over for shots he would want to add for a re-shoot or a pure visual effects shot. We would turn out something quickly so they could see if it would work in the cut.”

Having two fully CG characters, Rocket and Baby Groot, in the movie gave previs artists the opportunity to creatively contribute to the development of moments and details that carried through into the final film. Artists referenced character performances done by Sean Gunn on set for Rocket, with postvis artists closely following this acting when temping the CG character into live plates. “Something different about this project was having two fully CG main characters, Rocket and Baby Groot,” Baker notes. “It was a lot of fun for our previs artists to be able to help shape and develop performances that are part of the film.”

One of the most challenging shots to visualize was the film’s opener. The Third Floor’s Steve Lo led efforts to previs the long, continuous credits shot and adjust to creative changes as the sequence evolved over multiple months. “Probably the most challenging shot of the film for us was the Baby Groot opening credit sequence,” Baker explains. “One of our artists, Steve Lo, was responsible for visualizing most of that shot and did an incredible job. With these kinds of shots – long takes – any changes have a domino effect and they end up being quite involved and taking quite a while. We worked on and off on this one shot for quite a number of months, but I think the end results are worth it; it’s so much fun.”

The Space Chase presented unique challenges for depicting ships and motion. The Third Floor’s team focused on ways to sequence the shots and introduce camera framings that would produce the right sense of speed and scale. “A lot of times, action sequences have the action taking place near a large ship that gives you some scale and lets you show the speed of the action by the proximity to the bigger object,” Baker says. “That wasn’t the case for this scene, so we used a combination of editing and camera work to get that sense of speed in the shots.”

Techvis helped the filmmakers and departments figure out shooting approaches for flying character rigs and sweeping orbit shots. The film also features some tricky shots executed with stunt rigs – for example, Baker points to when Quill flies around the Abilisk, with the camera needing to operate within a closely defined space. Baker says his previs team also created techvis schematics and QuickTimes to help develop how the sweeping camera moves in the Final Battle could successfully be implemented. In certain cases, previs camera data was exported to the Spydercam rigging crew to help set up the physical shot. Visualization also extended to look development for environments like the Quantum asteroid field, where The Third Floor’s previs artists completed numerous concept iterations in tandem with the filmmakers.
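The article doesn’t specify the format in which previs camera data reached the Spydercam crew, so the sketch below is only a generic illustration of that kind of handoff: sampling a hypothetical orbiting camera move per frame and writing position and pan/tilt angles to a CSV that a rigging team could read into their own tools.

```python
import csv
import math

FPS = 24.0

def sample_camera(frame):
    """Hypothetical sweeping orbit: position plus pan/tilt per frame."""
    t = frame / FPS
    px = 15.0 * math.cos(0.4 * t)
    py = 6.0 + 0.5 * t                 # slow crane up during the orbit
    pz = 15.0 * math.sin(0.4 * t)
    pan = math.degrees(math.atan2(-px, -pz))   # aim back at the origin
    tilt = math.degrees(math.atan2(-py, math.hypot(px, pz)))
    return px, py, pz, pan, tilt

with open("final_battle_orbit.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["frame", "x", "y", "z", "pan_deg", "tilt_deg"])
    for frame in range(1, 241):  # a 10-second move at 24 fps
        writer.writerow([frame] + [round(v, 4) for v in sample_camera(frame)])
```

In practice the move would be exported from the previs scene itself rather than generated procedurally; the point is simply that a per-frame table is enough for a rig crew to verify the move stays within the cable system’s safe envelope.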

TOP LEFT and RIGHT: Visualization work by The Third Floor for Guardians of the Galaxy Vol. 2 provided the opportunity to help develop character moments and details for Groot and Rocket, which would later be realized as final fully CG effects. (Image copyright © 2017 Marvel and courtesy of The Third Floor, Inc.) BOTTOM LEFT and RIGHT: As with its work for the first Guardians movie, The Third Floor helped visualize ideas and layouts for a range of dramatic action in space. (Image copyright © 2017 Marvel and courtesy of The Third Floor, Inc.) Previs and postvis images by The Third Floor, Inc. Copyright © 2017 Marvel. Photos from Guardians of the Galaxy Vol. 2 by Film Frame. Copyright © 2017 Marvel Studios. All Rights Reserved.





[ THE VES HANDBOOK ]

Working with the Director and Producer By SCOTT SQUIRES, VES

“The supervisor needs to be confident and part salesperson, as with any job interview. One of the first questions will likely be how they can do a particular type of effect for a sequence in the film. The exact answer to this may be very dependent on learning other details of the project, but they will want to know the supervisor has answers and can work with them on determining the best solution for their needs. The supervisor will want to determine the scope of the work.” —Scott Squires, VES

The VES Handbook of Visual Effects: Industry Standard VFX Practices and Procedures (Focal Press), edited by Susan Zwerman and Jeffrey A. Okun, VES, has become a ‘go-to’ staple: a reference for professionals, a guide for students and a window for lay people into the magic of visual effects. Here is an excerpt from the second edition.

Before meeting with the director and the producer, the VFX Supervisor or VFX Producer should prepare by obtaining as much information about the project as possible. Is a script available? Who are the key people involved? What were their other projects? What is the project based on? Has the project been greenlit? A supervisor may be working for a visual effects company or may be freelance. Becoming familiar with past projects and any additional source material will make it easier to discuss the project requirements using these as references. If a script is available, it is best to read through it and make notes and questions in terms of the director’s vision and how the various sequences will be approached. The VFX Supervisor will have to consider some of the techniques required to accomplish what is described in the script.

DEMO REEL

Before the meeting, a demo reel and resume/credit list should be submitted if there is time. The supervisor should bring the same reel and multiple copies of the resume to the meeting. The following information applies to other visual effects artists as well. The demo reel should include only the best work and should be on a DVD that is no longer than 5 minutes (2 to 3 minutes may be a more useful maximum). The DVD case cover should include the artist’s name, position, and phone number. The inside flap of the DVD case should list the scenes or films in order and what the artist did or what their job role was. The DVD itself should also include the name and contact information since it may become separated from the case.

A supervisor can usually just include the finished shots. A visual effects artist should include the before and after versions of key shots. Normally, this is done by showing the finished shot, then the original plate, and then the finished shot again. It’s not necessary to show a before and after for every shot. Customize it based on your job type (i.e., an animator may want to show an animation test for a shot). The DVD should include the name (and possibly contact info) on the main menu. Avoid showing running footage in the DVD menu screen so that when the reviewers see the shots, the images are full screen. Consider using movie music as a basic soundtrack, which the reviewers can listen to or not. Any music that is considered grating should be avoided. Do not take credit for work you did not do and do not falsify anything on your resume. The facts will be uncovered during the project and will make things very difficult moving forward.

THE MEETING

The meeting with the director and producer is both a job interview and, it is hoped, a meeting of the minds. They will want to determine if the VFX Supervisor can provide the creative and technical expertise needed for the project and whether they feel they can work with this person for six months to two years, depending on the scope of the project. Does the director feel that he or she can speak in creative film terms and not be caught up in the technical requirements? Does the producer feel that the supervisor has the experience and organizational skills to oversee other artists and companies? They will also be evaluating how passionate the supervisor is about the project.

The supervisor needs to be confident and part salesperson, as with any job interview. One of the first questions will likely be how they can do a particular type of effect for a sequence in the film. The exact answer to this may be very dependent on learning other details of the project, but they will want to know the supervisor has answers and can work with them on determining the best solution for their needs. The supervisor will want to determine the scope of the work, the general look and approach the director is planning, and as many details as are reasonable in an hour meeting. The supervisor needs to evaluate the director and producer and the project as a whole to determine whether it is a project that the supervisor wants to commit to for the next year or two. There is no guarantee when the next potential project will be offered to the supervisor, so that will have to be considered as well.

“It is important for the director to look at visual effects as a required art, not something to be feared or ignored. The supervisor should be looked on as a key part of the creative production team, the same as the director of photography. The director’s and producer’s support of the supervisor will go a long way toward making the filming of the visual effects elements easier and more productive.” —Scott Squires, VES

MOVING FORWARD

Once the supervisor is selected, the first issue will be determining the true scope of the work with the director, producer, and VFX Producer. A detailed breakdown will have to be done for budgeting, and this budget will have to be adjusted as storyboards and previs are completed. These breakdowns may be sent out to multiple visual effects companies to bid. The supervisor should work with production to make use of as much pre-production time as possible. There will never be enough pre-production time, so it is important to schedule storyboards, previs, designs, and tests to be done. The supervisor needs to be able to communicate clearly to both the director and to others what will be required and how they can strike the right balance of creative design, time, and budget to accomplish the visual effects. The supervisor may have to do mock-ups and work with other artists who can produce designs and mock-ups to try to refine the specifics.

If the director has worked with visual effects before, then the supervisor will have to explain any differences from other approaches that were used on the director’s previous projects. If the director has not done visual effects, then the supervisor will have to explain the basics of the process and what the director will need to know (without getting tied up in the technical details). The supervisor needs to support the director with creative suggestions on shot design, creature design, and other visual effects creative issues. The approach many directors take for their visual effects shots differs from that taken for their other shots, but it is important to design the shots to match as if the objects and scene actually existed during the shoot.

The supervisor will work with the director and producer to determine the best approach needed for the visual effects. This includes how to dovetail the visual effects into the design and execution of the film to achieve the best results within the budget. Planning how the elements will need to be filmed during production will be one of many issues the supervisor will have to convey to the director and the key production departments. When the director and supervisor work together well during preproduction, they develop a shorthand for communication, and the supervisor can use their understanding of the direction to guide decisions in production and post-production. It is important for the director to look at visual effects as a required art, not something to be feared or ignored. The supervisor should be looked on as a key part of the creative production team, the same as the director of photography. The director’s and producer’s support of the supervisor will go a long way toward making the filming of the visual effects elements easier and more productive. The first assistant director and the production manager will take their cues from the director and producer, and if plate photography is rushed to the point of compromise, then the finished visual effects will be compromised as well.




effects before, then the supervisor will have to explain any differences from other approaches that were used on the director’s previous projects. If the director has not done visual effects, then the supervisor will have to explain the basics of the process and what the director will need to know (without getting tied up in the technical details). The supervisor needs to support the director with creative suggestions on shot design, creature design, and other visual effects creative issues. The approach many directors take for their visual effects shots differs from that taken for their other shots, but it is important to design the shots to match as if the objects and scene actually existed during the shoot. The supervisor will work with the director and producer to determine the best approach needed for the visual effects. This includes how to dovetail the visual effects into the design and execution of the film to achieve the best results within the budget. Planning how the elements will need to be filmed during production will be one of many issues the supervisor will have to convey to the director and the key production departments. When the director and supervisor work together well during preproduction, they develop a shorthand for communication, and the supervisor can use their understanding of the direction to guide decisions in production and post-production. It is important for the director to look at visual effects as a required art, not something to be feared or ignored. The supervisor should be looked on as a key part of the creative production team, the same as the director of photography. The director’s and producer’s support of the supervisor will go a long way toward making the filming of the visual effects elements easier and more productive. The first assistant director and the production manager will take their cues from the director and producer, and if plate photography is rushed to the point of compromise, then the finished visual effects will be compromised as well.

SUMMER 2017

VFXVOICE.COM • 107


[ V-ART ]

TyRuben Ellingson

V-Art showcases the talents of worldwide VFX professionals as they create original illustrations, creatures, models and worlds. If you would like to contribute to V-Art, send images and captions to publisher@vfxvoice.com.

Concept designer and artist TyRuben Ellingson began his career as a visual effects art director at Industrial Light & Magic, making significant contributions to Jurassic Park, Star Wars: Episode IV - A New Hope (Special Edition) and Disclosure. He went on to collaborate with directors including Guillermo del Toro and James Cameron. Ellingson is currently Chair and Assistant Professor in the Department of Communication Arts at Virginia Commonwealth University.

TOP to BOTTOM: Top plan view of the Scorpion twin-rotor attack aircraft for James Cameron’s Avatar. “This was fairly early in production,” says Ellingson, “and these renderings were about the actual scale, functionality, and silhouette of the vehicles – as opposed to surface detail and paint schema.”

An exploded view of the AMP suit shoulder construction as seen in Avatar. The design was built and rendered in SketchUp.

Ellingson’s first assignment from director Guillermo del Toro on Pacific Rim was to design massive landing gear for an incredibly large helicopter, along with a fold-down set of stairs. “The landing gear was going to crush cars, and then all these dudes from the Pacific Rim group were going to stream down the stairs,” the artist explains. “The entire scene was cut, but I always liked the way the landing gear came out.”

TOP: A comparison of Ellingson’s initial study sketch for the excavator that appears in the opening of Avatar (seen when Jake is flying down to the surface in a shuttle) and the final SketchUp version that was used in the film.

TOP RIGHT: Design for a chain weapon that appears at the beginning of the David S. Goyer film Blade: Trinity.

BOTTOM RIGHT: TyRuben Ellingson

BOTTOM: Electronic pistol design as used by Hannibal King in Blade: Trinity. “At the time, it was really outlandish that it had a little disc drive in it – little did we know,” notes Ellingson.



[ VES NEWS ]

VES Panel Explores State of TV VFX; The VES at FMX in Stuttgart

Magic on the Small Screen: VFX for Episodic

On June 1, the VES presented “Magic on the Small Screen: VFX for Episodic,” an overview of episodic VFX at the Frank G. Wells Theater on The Walt Disney Studios lot in Los Angeles. The panel of pros touched on such topics as the current production landscape for scripted series and the trajectory of the business. The event was presented jointly through the efforts of the VES New York and LA Sections.

Speakers and panelists included: Chris Brancato (keynote speaker), Co-Creator of Narcos; Lauren F. Ellis (moderator), Visual Effects Producer and head of the Los Angeles office of The Molecule; Armen Kevorkian, VFX Supervisor and Executive Creative Director, Deluxe’s Encore VFX; Christina Murguia, CG Supervisor, Zoic Studios; Tammy Sutton, Visual Effects Producer/Supervisor; Peter Davis, Senior VFX Coordinator, Stargate Studios; Paul Rabwin, Executive Producer, Post Production, ABC Studios; and Eli Dolleman, Senior VFX/Post Executive, Amazon Studios. Sponsors included The Molecule, Encore, Annex Pro, Lenovo, Zoic, Sassoon Film Design and Blackmagic Design.


As evidence of the growth of the episodic industry, audience members learned that between 2009 and mid-2017 the number of scripted episodic series had grown dramatically, particularly due to online services. Approximately 520 scripted episodic shows are predicted across all platforms in 2017, many of them infused with VFX.

“We are in a beautiful time for episodics and for the VFX community,” stated Andrew Bly, Co-Founder/CFO of The Molecule, who introduced the panel. “There are nothing but endless possibilities for us on the horizon right now. Now the question is ‘how’ will we use visual effects on our show, not ‘if’ we will use visual effects.”

Chris Brancato told attendees that VFX have become more commonplace in television and that VFX houses are now often able to present ideas to improve storytelling. The story is still king, added Brancato, but VFX can supplement story very dramatically. In addition to Narcos, Brancato has written or produced over 200 hours of television, including such series as The X-Files, Law & Order: Criminal Intent and Hannibal. He also wrote and co-produced MGM’s popular gangster feature Hoodlum and the sci-fi thriller Species II.

“Today,” said Brancato, “I know if I write it, you guys can figure out how to do it. I don’t hold back on the writing anymore. Your excellence has pretty much allowed me to have carte blanche in letting our imaginations run wild. We are in a 500-channel universe. We have to strive for imagination and world building. Because of the advancement of technologies we get to see shows like Game of Thrones. Visual effects can help build and supplement those worlds. Audiences are also coming to expect more. But you guys have to turn around visual effects for episodics on an insane schedule and you do it. Visual effects can make you feel something. Added to a story, that packs a double wallop.” Brancato’s main message to the VFX community relative to storytellers was “Don’t be afraid to tell us what you can do. I can’t tell you how many times a visual effects person has said something to me which does drive what I write. Often times I find that visual effects people, after they have read a script, will not suggest to go smaller on an effect, but to go for a bigger and better shot. You guys support creators and, at the same time, you are creators yourselves. You teach us what is possible.” In terms of how the industry has changed during the last 10 years, Sutton

said that today there is no let-up in supplying content to clients because of the constant year-round, 24/7 streaming. Years ago the cycle was more September to May, with a summer slowdown.

Rabwin added that the technology ‘curve’ of acceptability has gotten so much faster. “I can’t predict where we are going to be in a year, never mind five. We are also starting to see VR work its way into mainstream entertainment. I think VR will get out of the headset world and be a shared experience. Right now VR is very isolating, but it will come out of that shell. Things are happening in months, not years.”

Dolleman said he thinks Game of Thrones “redefined” viewers’ expectations of television visual effects. “That to me felt like the tipping point to when people would expect feature-grade effects as to what they would watch on a daily basis,” he said. “But viewers still expect a pace that they are accustomed to in television. They don’t want to wait 10 months for another episode. They want to continue to binge. This is what is feeding into the release of the 500-plus series.” He also added that the ability to pinpoint audience tastes via streaming has propelled more diversified genre programs. “The appetite for certain types of content has been identified so now they are being serviced,” Dolleman said.

Davis said the Internet was largely responsible for driving the landscape to 500 shows. “The more people have easier access, the more access they want. You give people one thing and they want the next thing, including more streaming content and more mobile content.”

In terms of deadlines for VFX and episodics, Kevorkian said, “It is really the relationship you have with your clients and getting as much information on what is coming up early on. From lock to air is about two to three weeks, but if we can know what is coming up we can get a head start on things.” Communication is the main element in turning around shows with VFX on tight deadlines, added Kevorkian. At the same time, he said, there should ideally be ‘less cooks in the kitchen,’ with fewer people making decisions. “If everyone has a say in what it’s going to look like, everyone loses. It’s about establishing the relationship and having one person letting us know. That is how we are able to make things happen.”

“Communication is huge,” said Murguia, and other panel members agreed, reiterating that it is probably the number one ingredient in meeting TV’s tight deadlines.

—Jim McCullaugh

The VES at FMX

For almost a decade, the VES has been a proud partner of FMX, Europe’s most influential conference dedicated to digital visual arts, technologies and business. Every year, VFX specialists, animation experts, and games and transmedia talents come to FMX to present cutting-edge achievements, state-of-the-art tools and pipelines, fascinating real-time technologies, spectacular immersive experiences and innovative business models.

VES Executive Director Eric Roth led two sessions at this year’s “Beyond the Screen”-themed conference in Stuttgart, Germany. Roth co-chaired the annual CEO Summit with Weta’s Dave Gougé, an intimate conversation with the heads of two dozen companies focused on the global state of visual effects. Unlike previous years, the primarily non-U.S.-based companies represented noted a more robust and positive business outlook, with slated projects and staffing on the rise. Roth also had the privilege of interviewing Academy of Motion Picture Arts and Sciences President Cheryl Boone Isaacs. In their discussion, Boone Isaacs shared her insights into the myriad issues and challenges she has worked to address in her tenure, including diversity and the Best Picture issues at this year’s Academy Awards ceremony.

While in Europe, Roth met with the leadership of VES’s newest affiliate, the Germany Section, which represents a diverse cross-section of VFX talent in the region. He also spent time with the London Section’s new leadership, who are approaching the challenges and opportunities in the field with great enthusiasm. VES Board Chair Mike Chambers also spent time with the London Section on his last visit, and both Roth and Chambers will continue to make face-to-face visits to VES Sections across the globe.

OPPOSITE LEFT: Panelists for the “Magic on the Small Screen: VFX for Episodic” presentation included, from left: Eli Dolleman, Amazon Studios; Peter Davis, Stargate Studios; Christina Murguia, Zoic Studios; Armen Kevorkian, Deluxe’s Encore VFX; Tammy Sutton, VFX Producer/Supervisor; and Paul Rabwin, ABC Studios. MIDDLE: Lauren F. Ellis of The Molecule moderated the panel. RIGHT: The audience listens to the “Magic on the Small Screen” presentation. All photos by Brent Armstrong



[ FINAL FRAME ]

This original illustration is from Andrew Whitehurst, a British visual effects artist who received an Academy Award for Best Visual Effects for his work on the film Ex Machina. He shared the award with Sara Bennett, Paul Norris and Mark Williams Ardington. He has also worked on Troy (2004), Charlie and the Chocolate Factory (2005) and Harry Potter and the Order of the Phoenix (2007).
