VFXVOICE.COM FALL 2018
THE WORLD OF THEME PARK VFX • CG SHARK ATTACK • LEGION: MIND GAMES • FOCUS: AUSTRALIA/NEW ZEALAND VFX • STUNTVIS: ACTION PLANS • VIDEO GAMES’ VFX “ROCK STARS” • PROFILE: JOHN NELSON
[ EXECUTIVE NOTE ]
Welcome to the Fall issue of VFX Voice! Awards season is here, and we’re excited that it’s time to honor the creators of visual imagery that drive billions in revenue for film and television. Congratulations to all of the Emmy Award nominees and winners for your outstanding achievements in visual effects. Like you, we’re eager to see how the next few months unfold with numerous VFX-driven films closing out the year…and then, the envelope please!

VFX Voice is all about shining a light on visual effects artists and innovators. So buckle up as we take you on a deep dive into the world of VFX that only the VES can provide. In this issue we explore the intersection of VFX and theme parks and the future of VFX-infused attractions. We go behind the scenes of hot properties including TV fan favorite Legion and big-screen titles Jurassic World: Fallen Kingdom and The House with a Clock in Its Walls. We share a salute to gaming legends, introduce you to stuntvis (stunt pros on the cutting edge of effects and stunt choreography), report on VFX happenings in Australia and New Zealand, delve into the craft of CG movie sharks, and much more.

Thank you for being a vital part of the global VFX Voice community. We’re proud to be the definitive authority on all things VFX. Keep visiting VFXVoice.com for exclusive online stories between issues. You can also get VFX Voice and VES updates by following us on Twitter at @VFXSociety.
Mike Chambers, Chair, VES Board of Directors
Eric Roth, VES Executive Director
[ CONTENTS ]

FEATURES
8 FILM: SHARKS! What makes audiences bite at giant CG sharks?
14 FILM: THE HOUSE WITH A CLOCK IN ITS WALLS Scary family entertainment with an effects-sharp edge.
24 FILM: STUNTVIS Stuntvis artists, VFX supervisors plot fights in advance.
30 TV: LEGION Wildly creative mind-bender is VFX supervisor’s dream job.
38 PROFILE: JOHN NELSON Shaping visions from Gladiator to Blade Runner 2049.
46 COVER: THEME PARKS AND VFX Effects explosion driving a global revolution in immersive rides.
58 INDUSTRY ROUNDTABLE Experts discuss theme park trends, issues and what’s to come.
64 FOCUS: AUSTRALIA/NEW ZEALAND Who’s who in the growing VFX industry down under.
74 FILM: VFX PROBLEM SOLVING VFX supervisors run a gauntlet of hypothetical scenarios.
80 FILM: JURASSIC WORLD: FALLEN KINGDOM A gyrosphere races on rails to escape a volcano and dinosaurs.
84 VFX VAULT: WHAT DREAMS MAY COME How the ambitious 1998 film advanced the art of optical flow.
88 GAMES: GUEST COLUMNIST DAVID “DJ” JOHNSON DJ’s behind-the-scenes “rock stars” of video game VFX.

DEPARTMENTS
2 EXECUTIVE NOTE
91 VES SECTION SPOTLIGHT: AUSTRALIA
92 VES NEWS
94 VFX CAREERS
96 FINAL FRAME: DISNEYLAND

ON THE COVER: King Kong 360 3-D ride at Universal Studios Hollywood (Image courtesy of Universal Studios Hollywood)
FALL 2018 • VOL. 2, NO. 4
VFXVOICE
Visit us online at vfxvoice.com
PUBLISHER Jim McCullaugh publisher@vfxvoice.com
VISUAL EFFECTS SOCIETY Eric Roth, Executive Director VES BOARD OF DIRECTORS
EDITOR Ed Ochs editor@vfxvoice.com
CREATIVE Alpanian Design Group alan@alpanian.com
ADVERTISING advertising@vfxvoice.com; Maria Lopez mmlopezmarketing@earthlink.net
MEDIA media@vfxvoice.com
CIRCULATION circulation@vfxvoice.com
OFFICERS
Mike Chambers, Chair
Jeffrey A. Okun, VES, 1st Vice Chair
Kim Lavery, VES, 2nd Vice Chair
Dan Schrecker, Treasurer
Rita Cahill, Secretary

DIRECTORS
Jeff Barnes, Brooke Breton, Kathryn Brillhart, Emma Clifton Perry, Bob Coleman, Lisa Cooke, Dayne Cowan, Kim Davidson, Richard Edlund, VES, Bryan Grill, Pam Hogarth, Joel Hynek, Jeff Kleiser, Suresh Kondareddy, Brooke Lyndon-Stanford, Tim McGovern, Kevin Rafferty, Scott Ross, Tim Sassoon, Lisa Sepp-Wilson, David Tanaka, Bill Taylor, VES, Richard Winn Taylor II, VES, Joe Weidenbach, Susan Zwerman
CONTRIBUTING WRITERS Ian Failes Naomi Goldman Trevor Hogg David “DJ” Johnson Chris McGowan Barbara Robertson
ALTERNATES Andrew Bly, Charlie Iturriaga Christian Kubsch, Daniel Rosen Katie Stetson, Bill Villarreal
ADVISORY COMMITTEE Rob Bredow Mike Chambers Neil Corbould Debbie Denise Paul Franklin David Johnson Jim Morris, VES Dennis Muren, ASC, VES Sam Nicholson, ASC Eric Roth
Visual Effects Society 5805 Sepulveda Blvd., Suite 620 Sherman Oaks, CA 91411 Phone: (818) 981-7861 visualeffectssociety.com
Tom Atkin, Founder Allen Battino, VES Logo Design
VES STAFF Nancy Ward, Program & Development Director Chris McKittrick, Director of Operations Ben Schneider, Director of Membership Services Jeff Casper, Manager of Media & Graphics Colleen Kelly, Office Manager Callie C. Miller, Global Coordinator Jennifer Cabrera, Administrative Assistant P.J. Schumacher, Controller Naomi Goldman, Public Relations
Follow us on social media
VFX Voice is published quarterly by the Visual Effects Society.
Subscriptions: U.S. $50; Canada/Mexico $60; all other foreign countries $70 a year. See vfxvoice.com
Advertising: Rate card upon request from publisher@vfxvoice.com or advertising@vfxvoice.com
Comments: Write us at comments@vfxvoice.com
Postmaster: Send address change to VES, 5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411.
Copyright © 2018 The Visual Effects Society. Printed in the U.S.A.
FILM
SHARK! THE ART OF THE CG PREDATOR By IAN FAILES
Is Hollywood obsessed with shark movies? Jon Turteltaub’s The Meg, which features a 70-foot-long monster, is the latest in a string of recent films that have seen hapless beach-goers, sea adventurers and divers meet their match against the ultimate ocean predator. While Steven Spielberg’s Jaws utilized a not-always-working practical shark to menace the characters and audiences, most modern shark movies depend on CG. But what makes a CG shark work? VFX Voice asked visual effects supervisors from several recent shark-related productions how they went about the task.

SHARK EXPERTS
TOP: Jonas Taylor (Jason Statham) battles a 70-foot-long Megalodon ‘underwater’ in The Meg. Director Jon Turteltaub wanted something that looked prehistoric, so the Meg was designed with a mouth that was less turned up at the sides in order to be scarier. Kon-Tiki images copyright © 2012 Nordisk Film. The Shallows images copyright © 2016 Sony Pictures. 47 Meters Down images copyright © 2017 Entertainment Studios Motion Pictures. Pirates of the Caribbean: Dead Men Tell No Tales images copyright © 2017 Walt Disney Pictures. The Meg images copyright © 2018 Warner Bros. Entertainment Inc., Gravity Pictures Film Production Company, and Apelles Entertainment Inc.
In visual effects, some studios tend to become particularly well known for a certain type of effects work. After the release of Joachim Rønning and Espen Sandberg’s Norwegian feature film Kon-Tiki in 2012, Swedish studio Important Looking Pirates (ILP) quite possibly assumed that mantle in relation to CG sharks. Here, they produced sharks both in and out of the water and incorporated them into more of a documentary-style story than a horror film or thriller. “The shots were for a very important sequence during an emotional peak of the movie,” says ILP Visual Effects Supervisor Niklas Jacobson. “They needed to look absolutely convincing in order to not take the audience out of the drama.”

Jacobson says ILP had almost no character animation experience prior to Kon-Tiki, but launched into an ‘epic’ research project by reviewing shark footage from television series such as Planet Earth. The ultimate result – which saw much of the shark action actually out of the water – was so successful that ILP was asked to create digital sharks for more projects, including commercials and other feature films such as Jaume Collet-Serra’s The Shallows, in which a great white terrorizes an injured surfer stuck on a rocky outcrop.

Although ILP’s experience in these two projects and others has been in crafting photoreal digital sharks, Jacobson says this process is greatly enhanced whenever any practical stand-in can be utilized on set, both for performance and water interaction. “If you can run a gray ball or a rubber shark or anything else under the water surface to get a feel for how the caustics play and how the depth fog falls off, that is a nice bonus,” Jacobson states. He adds that then it is a matter of studying all the reference material possible. “Key to realistic motion is great references of shark behavior and movement,” he notes. “The closer references you find to what you are trying to accomplish, the less brute genius mad skills you need to possess.”

TOP LEFT: Important Looking Pirates’ first experience with CG sharks was for the 2012 film Kon-Tiki. TOP RIGHT: Kon-Tiki saw Important Looking Pirates quickly ramp up on how to model and animate sharks and integrate them in and outside the water, plus produce realistic water simulations. BOTTOM RIGHT: A lighting reference turntable for Important Looking Pirates’ CG shark in The Shallows.

“Key to realistic motion is great references of shark behavior and movement. The closer references you find to what you are trying to accomplish, the less brute genius mad skills you need to possess.” —Niklas Jacobson, Visual Effects Supervisor, ILP

CAGE-FIGHTING

In Johannes Roberts’ low-budget horror film 47 Meters Down, two sisters get trapped on the sea floor while cage diving only to be stalked by several great whites. UK-based Outpost handled the CG sharks. Reference was their starting point, but an initial challenge also included inserting the digital assets into water tank footage. “The water had a specific murky look with lots of small particles floating about and some volume rays coming from the surface, and it was challenging to set a CG shark within this environment,” outlines Outpost Senior Visual Effects Supervisor Marcin Kolendo. “Our 3D department did a great job of creating a realistic shark asset. Every day I saw tons of reference footage on their screens, ranging from the Blue Planet-type of professional documentaries to lots of amateur footage, especially cage-diving among sharks.”

As ILP had done for Kon-Tiki and The Shallows, Outpost also had to craft believable water simulations for their shark scenes as the creatures pierced the surface. The studio also spent a significant amount of time establishing the right look of shark skin, especially when the sharks would surface above the water. “Glistening in the sun, wet, yet rubbery and strong; there is always a lot you can do in comp, but when the CG skin shaders are wrong it will never look good on the screen, regardless of the amount of work from the 2D department,” says Kolendo.

Like many shark movies, 47 Meters Down portrays its sharks in a slightly exaggerated way, i.e. with more aggression. But Outpost worked with the director to retain as much of the ‘classic’ aspects of shark behavior as possible while still creating the feeling of terror and panic for a horror film. Says Kolendo: “We looked at how the jaws behave when a shark attacks, what the gills are doing when it turns in anger, how fast is the tail moving to propel the massive body, and then we added that extra few percent to exaggerate the action, creating a shark that is not only believable, but also terrifying.”

TOP LEFT AND RIGHT: The great white in The Shallows gets snagged on part of the reef. This shows the extent of CG modeling required for the shark, reef and water interaction, and the final rendered shot. BOTTOM LEFT AND RIGHT: The shark in The Shallows eventually meets its doom after being impaled on the sea floor. Live-action tank footage of actress Blake Lively was combined with the final CG shark and sea-floor crashing effects. OPPOSITE BOTTOM: 47 Meters Down quickly goes from a cage-diving-with-sharks adventure to a survival horror film, with bloody results.

MEGA-CHALLENGE

Given its shark is 70 feet long, believability in The Meg is already pushed to the limits. Still, that did not deter Production Visual Effects Supervisor Adrian De Wet from closely referencing real shark behavior and using that as a leaping-off point for the film’s Megalodon. “The creature is described in the books as looking like a huge albino great white,” says De Wet. “However, that was really not what the director Jon Turteltaub wanted at all. He wanted something that looked prehistoric. And while it makes evolutionary sense for it to have ended up albino and blind after countless millennia in the total darkness of 10km deep, Jon felt that that didn’t work for his vision. Instead he wanted it to look gnarled, textured, aggressive, moody, and dark – most definitely not a great white.”

In addition, De Wet notes that real great white sharks can almost look too ‘smiley’ when they appear head-on, as if with a big grin. So the Meg was designed with a mouth that was less turned up at the sides in order to be scarier. The shape and proportion of the giant shark had to be addressed as well. “When we modeled the Meg using correct shark proportions, it looked too sleek and thin, with a too-small dorsal fin,” comments De Wet. “So we had to adjust proportions to Jon’s taste, which meant a fatter shark, with smaller eyes and a larger dorsal fin. There were many iterations of the Meg body shape, from long and sleek to shorter and fatter. Too much of a bulbous shape destroys scale and makes it look like a tuna fish or a giant guppy. Too lean and sleek makes it take on the appearance of an eel.”

Sharks of the Ghostly Kind

The CG sharks in Joachim Rønning and Espen Sandberg’s Pirates of the Caribbean: Dead Men Tell No Tales were certainly intended to be photoreal, but not so real per se. Instead, these were ‘ghost sharks’ commanded by Captain Armando Salazar (Javier Bardem) as part of his undead pirate crew. Tasked with the shark visual effects work was MPC, led by Visual Effects Supervisor Patrick Ledda.

“We took existing concept work from the production art department and developed it further to achieve the desired look, paying particular attention to what real shark anatomy looks like, but also taking creative freedom when needed,” says Ledda. “For instance, we gave them a rib cage so we could show some broken ribs even though sharks don’t have this bone structure. We looked at a lot of references of rotten fish and dead animals, which wasn’t the most appealing, but it provided a lot of inspiration for our modelers and texturing artists.”

The sharks were also given broken spines and fins, which allowed animators to have the sharks perform erratic, jagged and inelegant movements. “We wanted them to swim as if they were zombie sharks,” adds Ledda. At one point the sharks take on a rowboat holding Captain Jack Sparrow (Johnny Depp) and Henry Turner (Brenton Thwaites), biting at the craft and jumping out of the water, a task made more difficult by the fact that Depp and Thwaites were shot practically on a stage.

“Creating realistic shark motion, although challenging, is quite objective,” adds Ledda. “There is a lot of reference available for different types of sharks. In our case, it was a much more subjective approach. However, it’s still important to adhere to physical characteristics such as weight, speed and so on in order to make them believable. When interacting with characters, this became quite challenging because they needed to react to certain performance cues while trying to be plausible at the same time.”

TOP: A close-up on one of MPC’s ghost sharks for Pirates of the Caribbean: Dead Men Tell No Tales. MIDDLE: The ghost sharks circle the rowing boat. BOTTOM: Bluescreen footage of the actors served as the original plate for this shot of a CG ghost shark jumping from the water.
“We looked at how the jaws behave when a shark attacks, what the gills are doing when it turns in anger, how fast is the tail moving to propel the massive body, and then we added that extra few percent to exaggerate the action, creating a shark that is not only believable, but also terrifying.” —Marcin Kolendo, Senior Visual Effects Supervisor, Outpost
TOP: Production VFX Supervisor Adrian De Wet spent months with teams of effects artists refining the shape and proportions of the Meg.
The Meg’s water sequences were filmed in two tanks built in Kumeu, New Zealand. One was relatively shallow but with a large surface area, and contained a high 200-foot greenscreen on one side for water surface shots. The second, smaller tank – a deep-water tank – was circular, covered by a ‘blue sky’ roof and lined with blue pool liner.

“This tank was for underwater action shots, such as when Jonas (Jason Statham) battles the Meg underwater in the third act of the film,” says De Wet. “We built a Meg buck for use underwater when Jonas interacts with it. For instance, for the beats where Jonas stabs the Meg in the eye, I wanted a physical, life-sized replica of the Meg’s face, or at least the part that Jason has to hold onto and climb up. So when we locked down the design of the Meg, we went to a fabricator and they made a full-sized half face. However, we soon discovered that it could not be pulled around underwater at any considerable speed because of the massive water resistance and drag forces involved.

“Our solution was to build a rig out of metal wire-frame, which was lightweight and allowed the water to pass through it as it was being pulled along, while preventing Jason from intersecting with the Meg’s face. This allowed Jason to fight the Meg, holding on to a violently shaking Meg rig while climbing it, stabbing it in the eye, all the time underwater, resulting in a much better look than if it were shot dry-for-wet.”

The Meg arrives at a point in film history where sharks are some of the top villains on our screens, with visual effects artists able to create them as beautiful ocean creatures or menacing beasts of the deep, depending on the story required. “For our film,” says De Wet, “we invented a ‘Hollywood’ version of the Megalodon that probably couldn’t exist. We even gave it eight gills, knowing that extant sharks have either five or seven, as something for the shark geeks to point out. But, you know, it’s a movie.”
FILM
INSIDE THE HOUSE WITH A CLOCK IN ITS WALLS By TREVOR HOGG
TOP LEFT: Cinematographer Rogier Stoffers collaborated with director Eli Roth on Death Wish and The House with a Clock in Its Walls, which were both released in 2018. TOP RIGHT: Eli Roth with his young lead actor Owen Vaccaro, who portrays the apprentice sorcerer Lewis Barnavelt.
Venturing away from Cabin Fever, Hostel and The Green Inferno, filmmaker Eli Roth has entered the realm of family entertainment with his adaptation of the Gothic horror children’s novel by John Bellairs, The House with a Clock in Its Walls. Originally published in 1973, the story revolves around orphaned Lewis Barnavelt (Owen Vaccaro), who is sent to live with his uncle Jonathan Barnavelt (Jack Black), and their search for a magical clock that has the power to destroy the world.

“I’m a kind and gentle person,” notes Roth. “I don’t know how I can get kinder and gentler! The intention was always to make a fantastic scary kids’ movie. I’ve had any body part at my disposal in all of my previous movies and certainly love doing a delicious kill, but there are a lot of other ideas that I have. I wanted to do something that was much more in the world of Tim Burton, Terry Gilliam and Guillermo del Toro. Or something like a wild Peter Jackson movie. I’ve had such an amazing experience with Amblin Entertainment; it’s the first time I’ve gotten the resources to do that.”

Limitations are important to filmmakers. “If you look at the sequence in Get Out when Chris Washington (Daniel Kaluuya) goes to the Sunken Place, they didn’t have money for a set so they put him on wires, and it’s iconic,” notes Roth. “There are many movies that prove that more money doesn’t make them better. I watch some of these giant superhero films and am shocked at how bad the effects are because they’re shooting them up to the last minute and have 50 different visual effects houses. Whereas if you watch Ex Machina or Arrival, the effects are perfect because they are focused.”

Arrival led to a BAFTA Award nomination for Louis Morin, who serves as the Visual Effects Supervisor on The House with a Clock in Its Walls. “Louis also did Sicario, and I didn’t know that the Juarez sequence is 55% or 60% computer generated because it looks flawless. You’re not paying attention to any of it. I want everything to look as organic and realistic as possible; that takes time and focus.”
“When I came in, the budget was fairly low, but once they saw the potential it was decided to invest in visual effects to enhance the movie,” remarks Louis Morin, who has worked on approximately 800 visual effects shots with Rodeo FX, Hybride, Folks VFX, Alchemy 24 and Mavericks VFX. A shot list was discussed with Cinematographer Rogier Stoffers and Roth, but there was still flexibility to respond to spontaneous ideas on set. “There’s a huge transition from one character to another that we wanted to develop in a unique way. Inspired by practical effects done in the past, we wanted to create something never seen. As always I want nobody to think that CG was involved.” Another example is the Jack-o’-lantern characters. Originally, the Jack-o’-lanterns were to be shot practically but were replaced with digital versions. “We got a note from Spielberg saying that the pumpkins were going to have to move. We came up with an idea that the pumpkins are on vines that act as if they’re snakes. I’ve never seen that before. It’s based on what a pumpkin is and what we can do with it.” Once Roth got involved with the project, he sat down with screenwriter Eric Kripke to incorporate some fantasy set pieces. Recalls Roth, “There was a whole thing with a room full of automatons in the script, and I said to Eric, ‘Once the house is taken over by the evil warlock the house should backfire. All of the things that were there to protect you, like the pumpkins, turn evil.’ I wanted a nightmare sequence of these automatons coming to life,” he explains. “The automatons are so damn creepy even in real life. We had them move in this robotic way and it all looked so real.” “There’s this whole thing about how pumpkins ward off evil,” Roth adds. “They turn evil, start fighting, and vomit pumpkin guts. All of those things Spielberg loved too. He said, ‘This pumpkin scene should be iconic. Make it terrifying. 
Have real danger.’ I have dealt with visual effects before but never had the experience of designing a sequence from scratch that is so dependent on them. You have to trust that it’s all going to look amazing.” Creature Effects Inc. Founder Mark Rappaport produced practical automatons and Jack-o’-lanterns. “Eli Roth and [Production Designer] Jon Hutman sent us designs for the automatons with specific images in mind. We scaled up their renderings and dissected the images so we could reproduce accurate 3D automatons. Our automatons needed to be puppeteered, so that required us to create joints we could lock in place or loosen to puppeteer with a rod. The designs sent us scouring junk shops that still sell old equipment from the 1950s, ‘60s and ‘70s. We machined and scavenged parts. The automatons had the appearance of complex mechanisms, but in actuality the final build was simple without much movement. The movements would later be added with VFX.” 3D printing was utilized for the heads and some of the mechanical-looking parts. “We never have enough time anymore to build, and 3D printing is getting quicker and easier. I have the Eden 260 V and a filament printer we built for quick parts to help with the servo motors.” “The initial pumpkins were our designs based on input from Eli,” remarks Rappaport. “I would show many different iterations of the pumpkins and Eli would comment on each with written
TOP: Eli Roth converses on set with Jack Black and Cate Blanchett. BOTTOM: Cate Blanchett plays the neighborly and powerful witch Florence Zimmermann.
FALL 2018 VFXVOICE.COM • 15
FILM
INSIDE THE HOUSE WITH A CLOCK IN ITS WALLS By TREVOR HOGG
TOP LEFT: Cinematographer Rogier Stoffers collaborated with director Eli Roth on Death Wish and The House with a Clock in Its Walls, which were both released in 2018. TOP RIGHT: Eli Roth with his young lead actor Owen Vaccaro, who portrays the apprentice sorcerer Lewis Barnavelt.
TOP: Jonathan Barnavelt (Jack Black) listens for the whereabouts of a clock with the ability to bring about the end of the world. BOTTOM: Jack Black is cast in the role of a mediocre warlock and uncle to Lewis.
notes such as, ‘Steven Spielberg’s request: “Please make scarier.”’ We designed the pumpkins starting with scary and amusing, then scary and gross, then scarier and grosser, and finally ‘scarier not gross with teeth.’ We were fortunate that Halloween was only a month away. We drove out to the growers who sell to the pumpkin patches. The pumpkin growers have a few very large pumpkins that they grow and sell to specialty clients only. These pumpkins are huge and weigh 60 to 80 pounds. They became our drawing boards, allowing us to sculpt right on these oversized Jack-o’-lanterns.” The limited availability of Cate Blanchett, who portrays the neighborly witch Mrs. Zimmermann, shortened the production schedule. “When we would normally spend months prepping, we were doing it in six to eight weeks,” remarks Roth. “As many scenes as we could delay until the end of the shoot, we did. We started right away with two or three visual effects sequences. Sometimes you have to jump into the fire. I was shot-listing and storyboarding during prep so we could capture the scenes in the amount of time that we had.” He adds that working with child actors Owen Vaccaro and Sunny Suljic was not a problem. “They’ve grown up with movies with visual effects and understand it. It was a lot of fun for them. What was enjoyable was watching Cate Blanchett. I would say, ‘Now the pumpkin is biting your head off. How many times did Woody Allen give you that direction?’” The majority of the action takes place in a house situated in Atlanta that was augmented with some stage work captured at Atlanta Metro Studios. “We shot as much practical as we could,” states Roth. “When you look at the Clock Room, there were set extensions, but we built them. I have never been on a set that literally moved. Everywhere you step something is moving. It was like a Buster Keaton contraption. I underestimated how incredibly challenging it was to film on that set, but am tremendously proud of the sequence. 
Also, visually this looks like none of my other movies. It has a fantastic fairy-tale quality to it. Rarely do you have something in your head and it all comes together perfectly, as was the case with the automaton attack. Every shot we did was used. It is like a nightmare out of the Brothers Quay [Street of Crocodiles]
come to life.” Practically executing the Clock Room was a monumental task for Special Effects Supervisor Russell Tyrrell. “Videos of the internal gears helped me to see the different speeds and directions they moved. It was insightful, in particular, for the escapement gear and its movement.” Safety was a major concern. “How can we build a room full of gears that mesh together, then ask the cast and crew to work on and around it without grinding themselves into hamburger meat? The smaller gear clusters under 10 feet in diameter were put into motion with stepper motors. We programmed the torque setting for the motors so they would stop immediately if anything came into contact with the gears. “The biggest challenge was the 10-foot and 14.5-foot gears that the main action took place on,” continues Tyrrell. “These were put into motion with a hydraulic unit along with gear reduction. These were challenging because they had to move slowly and it took a lot of force to initiate the movement. As a result of their size and weight, we also had both spotters and dead man’s switches right next to the gears to start and stop them. In order to support the weight of the cast and crew they had to be made of steel. The pedestal bearings were rated for 240,000 pounds so we wouldn’t have any wobble in the gears when people would walk near the edge. On top of all that, we had hydraulic arms with gears on them that came in toward Lewis, along with a 10-foot gear that descended from the sky; all were orchestrated together to create that impending doom effect. This barely touches on the array of mechanical elements we had in motion on this set.” “Everything is alive in the house,” notes Morin. “Believability is in the details. You look at a mirror and it becomes a monitor. You look at a stained glass window and it starts to move. The books and a La-Z-Boy are alive. What is the possible movement of a La-Z-Boy? The La-Z-Boy is like a dog. What can be the eyes? The buttons on the La-Z-Boy. 
The armrests can be arms. The footrest is where it can smile and talk. We create as we go and play with it. Eli is good at that. He comes up with ideas and reacts. There’s a continuous flow of creativity that goes back and forth between the visual effects and the director.
“There’s a huge transition from one character to another that we wanted to develop in a unique way. Inspired by practical effects done in the past, we wanted to create something never seen; as always I want nobody to think that CG was involved.” —Louis Morin, Visual Effects Supervisor TOP: Florence Zimmermann, along with Lewis and Jonathan Barnavelt, teams up against an evil warlock played by Kyle MacLachlan. BOTTOM: A signature sequence features the Jack-o’-lanterns turning evil against those they were meant to protect.
TOP: Everything in the house is alive and moving, such as the stained glass situated behind Lewis. BOTTOM: In an effort to impress his new friend Tarby Corrigan (Sunny Suljic), Lewis Barnavelt inadvertently resurrects Isaac Izard, the previous owner of the house. OPPOSITE TOP: Originally, principal photography was to take place in Pittsburgh, but the production was shifted to Atlanta. OPPOSITE BOTTOM: Dealing with visual effects was nothing new for Cate Blanchett and Jack Black, who previously worked on Thor: Ragnarok and Jumanji: Welcome to the Jungle, respectively.
“There are simulations throughout the movie. A big simulation is when hundreds of books attack Lewis. Then there’s a liquid pumpkin that crystallizes. My motto is that less is more. It’s there but not in your face. Everything is art directed in the frame.” A frequent collaborator of Louis Morin is Rodeo FX Visual Effects Supervisor Alexandre Lafortune, who handled the animation of the automatons, garden griffin, Clock Room and Baby Jonathan. “We looked at watches, clocks and browsed a wide variety of gears, some of which had to have steampunk style, while others had to be on a large scale,” Lafortune recounts. “The Clock Room was a big part of the puzzle. We had to build up many types of gears running together, linking together. It had to look like a perfectly functional mechanism, but doesn’t work in reality. There’s one shot that has 450 pieces of gears and pipes; it was quite heavy in terms of geometry. Baby Jonathan is a funny-looking asset—it’s quite disturbing. Putting Jack Black’s head on a baby’s body is not your typical day! We had to match the asset to a puppet more than to the real Jack Black.” Software used by Rodeo FX includes Katana, Maya, ZBrush, Houdini, Nuke and Arnold. “The griffin required four months of
work for the vegetation, twigs, and leaves,” explains Lafortune. “We had an on-set version which was a plastic vegetation sculpture; it was the correct scale, but we had to modify the proportions to enhance the lion aspect and adapt it to be able to move. We underestimated the griffin at the beginning. It’s been a huge challenge to loop the griffin through all the software, because all of the simulations and shading had to follow; this is the most time-consuming part of it. The griffin is the first asset we started on this show, and it will be the last we’ll finish. For the automatons, we created them in the most accurate way possible according to the on-set ones. Since the request was not to be 1:1 with the on-set asset, we made six different versions. We took time in the shading, look development and modeling. Even though they are not heroes, you see automatons from up-close and alive.” Hybride looked after the Snakespeare and rat character transformations under the supervision of Joseph Kasparian and François Lambert. “We did a lot of research from old movies to develop the general look of some sequences,” explains Kasparian. “Here are two examples of style we had to create: old movie look [black and white with high contrast and low dynamic range], and
nickelodeon look [sepia, scratches, low dynamic range, high level of vignette and film jitter]. For the human transformation, we had to do something between the cartoonish style of The Mask and the brutal physics of werewolf movies. For the stained-glass magical animation, we even started looking at what was done by ILM in 1985 on Young Sherlock Holmes.” Reference photography of a real purple snake was utilized for Snakespeare. “From there, we decided to do a pass of design directly on a master shot. We went into modeling instead of drawing to define the size and number of tentacles, making sure that every decision would be achievable up to the final stage without any cheat. The client chose five tentacles and decided never to see the body or head of the creature. Seeing only the tentacles in all the different locations helped us with the composition as we could cheat the size and length if needed.” “Clearly, one of the biggest challenges was to do the human transformations,” notes Kasparian. “We had to mix live plates with CG to pass from one actor to another as smoothly as possible. Everything needed to be stylized without being cartoony. The three actresses had various shapes, different wardrobes and jewelry. The design of the blending process had to go through
“I would show many different iterations of the pumpkins and Eli would comment on each with written notes such as, ‘Steven Spielberg’s request: “Please make scarier.”’ We designed the pumpkins starting with scary and amusing, then scary and gross, then scarier and grosser, and finally ‘scarier not gross with teeth.’” —Mark Rappaport, Founder, Creature Effects Inc. TOP: Automatons were created with 3D printing and parts taken from equipment produced in the 1950s, 1960s and 1970s. BOTTOM: Jonathan Barnavelt attempts to cast a spell.
multiple iterations before getting something that worked. One of the other challenges we encountered was the design of a sequence with Time running backwards, provoking a stormy red sky with a moon eclipse. Whenever lightning struck a human, he would become younger. Many plates mixing with many CG elements were required to build the shots.” Mummified Isaac and La-Z-Boy came to cinematic life courtesy of Mavericks VFX CEO and Visual Effects Supervisor Brendan Taylor. “We were having a hard time figuring out how to rig the La-Z-Boy and what the stress on the fabric would do, specifically with the eyes. We ended up finding a sort of match chair at a flea market. We ripped the back off and tried shaping the brow and cheek, looking at what that did to the fabric. After this exercise, we opted for blend shapes as it gave us the most flexibility. Creatively, the biggest challenge was nailing the character of the La-Z-Boy. We settled on an impish, snobby dog. Our instincts are always pushing us to be subtle; however, Louis and Eli kept reminding us that this was a magical character in a kids’ movie and we needed to see its emotions. Once the shackles of reality were removed, we started to see how far it could be pushed.”
LEFT TO RIGHT, TOP TO BOTTOM: Louis Morin, Mark Rappaport, Russell Tyrrell, Alexandre Lafortune, Joseph Kasparian
“Clearly, one of the biggest challenges was to do the human transformations. We had to mix live plates with CG to pass from one actor to another as smoothly as possible. Everything needed to be stylized without being cartoony. The three actresses had various shapes, different wardrobes and jewelry. The design of the blending process had to go through multiple iterations before getting something that worked.” —Joseph Kasparian, Visual Effects Supervisor
All of the animation and lighting was done in Maya, V-Ray was used for rendering, effects were created in Houdini and compositing produced in Nuke. “With the mummified hand, it was more of an execution challenge,” reveals Taylor. “We basically needed to recreate a photoreal hand [including skeleton, ligaments, muscles and skin].” The graveyard scene was initially planned to be done practically. “After a few preview screenings, they came up with a [thoroughly gross] idea that the hand should be transforming from a mummified hand, crawling with bugs and maggots, into the human hand that is in the shot. Once we dug into the shot, a few things became apparent. The match move needed to be perfect. The model needed to perfectly match the practical hand, but there was no scan of the hand. In the end we opted for sculpting the last few frames of the hand to match the plate. The timing of the hand coming out of the grave didn’t allow a lot of time for a slow, elegant transition. It needed to be sped up, but not so fast that the audience wouldn’t quite register what was happening.” The signature pumpkin attack was developed and executed by Folks VFX President and Visual Effects Supervisor Sébastien Bergeron. “We looked into a lot of vegetables, from the field to their most rotten state. We researched carved pumpkins and famous horror characters and studied their facial features. We also had to watch a lot of vomiting scenes and references as to what made them disgusting or funny. Finally, we studied snakes for the animation part.” It was difficult to find the right balance in the movement and facial expressions of the Jack-o’-lanterns to keep the audience both scared and emotionally involved in the story. “We had all the effects simulation for the vomiting and pumpkin explosions, combined with a precise action to animate,” Bergeron says. “It was a heavy process that required a lot of back and forth between specialized artists. 
We had to specifically adapt our pipeline and tools so the creative process remained. While reviewing animation, we were all brainstorming about what kind of ‘gang’ the pumpkins were? Who was their leader? How would they laugh, burp and communicate among themselves? Some great ideas came to life during those fun creative meetings.” “It’s not safe, it’s scary,” Roth says. “That’s where people who have seen my other films will go, ‘This is for sure Eli’s version of a children’s movie.’ I wanted something that you can bring the whole family to, and parents who like scary movies will be like, ‘This is a great one to get our kids into it.’ Then you always have Jack Black and Cate Blanchett to make sure that you’re all right.” Spielberg has been supportive. “He said to me,” recalls Roth. “‘You’ve made a proper Amblin movie. It’s completely of its own, yet in the tradition of those other movies.’ All I’ve ever wanted was to make a movie that, if video stores still existed, would be on the shelf next to Back to the Future and Gremlins. Hopefully, other people will feel the same way.”
FILM
Previs and its often-related disciplines of techvis and postvis are steps in the filmmaking process that greatly help in planning complex visual effects-heavy sequences. But there’s another step used regularly in such films that does not always get talked about: stuntvis. Stuntvis – or stunt visualization – is usually some kind of video reference undertaken as part of the design of a project’s stunts by a stunt coordinator or action choreographer prior to anything being filmed on set. It’s a blueprint for a major action set piece or a fight sequence, and one that often segues into previs itself or becomes part of the ongoing edit. To learn more about where stuntvis sits in the filmmaking pipeline, VFX Voice spoke to some stuntvis artists and visual effects supervisors about their recent experiences on major films and television shows with this nuanced discipline.
STUNTVIS AMPS UP THE ACTION By IAN FAILES
TOP LEFT: Chris Clements. TOP RIGHT: Yung Lee.
WHAT IS STUNTVIS?
Imagine a script calls for a major superhero fistfight between a hero and villain to take place on a freeway overpass. Quite often the stunts team will not have access to the final shooting location, so instead they might imagine the sequence in a rehearsal space and then edit the footage together. That edit, especially for a superhero showdown, will typically include some key placeholder visual effects elements, with the final product a visualization of how the stunt coordinator or action choreographer sees the fight playing out. The sets and costumes are not important, just the main beats.

“Stuntvis has multiple purposes,” outlines Chris Clements, a stuntvis artist with credits on Netflix’s Daredevil and The Punisher television series and Pacific Rim Uprising. “It’s used to sell the director and producers on the vision, to work out certain beats before a shoot to make sure they play correctly to camera and, most importantly, to act as a souped-up storyboard that can be referenced by multiple departments.”

“It also gives the post visual effects team a great reference for what the stunt department is thinking the final product should look like,” adds Clements. “It’s not uncommon for the final sequence to be nearly identical to the stuntvis, from the camera angles down to little dust impacts that need to be digitally added.”

Clements recently worked with action consultant Philip J. Silvera on two large action sequences in Pacific Rim Uprising, helping to imagine how the frenetic fighting styles of the robot Jaegers would look. “Those sequences required a little bit of everything,” says Clements. “We added in miniature crowds, mech blades composited onto the stunt doubles, missiles, helicopters and a ton of destruction!”
“It’s not uncommon for the final sequence to be nearly identical to the stuntvis, from the camera angles down to little dust impacts that need to be digitally added.” —Chris Clements, Stuntvis Artist
Silvera is a strong proponent of stuntvis. He also used it to present his fight ideas on Deadpool and its sequel Deadpool 2, and for the Daredevil series. On this Netflix show, Clements was provided with edits of the planned action sequences to add effects elements. “Typically, this would involve gunshots, blood hits and dust hits to help sell the action,” he explains. “The stunt coordinators I work with are very story-driven, so it becomes important to highlight certain weapons or injuries or portions of a location in the stuntvis. The fun really starts for me when I need to figure out how to achieve a difficult effect in a believable, time-efficient manner. It becomes a tug of war between time and quality.” “On season 2 of Daredevil, we worked on a huge stairwell fight that was sort of a sequel to the famous hallway fight in season 1,” continues Clements. “Daredevil fights with a chain as he makes his way down a stairwell and through about 30 thugs. It was extremely challenging figuring out how to make that chain look semi-realistic and hit all of the story beats in a day or two. I ended up creating several chain models in various positions, and I would transition from model to model depending on what Daredevil was doing. It’s not uncommon for me to have to find or create assets that we may not have access to. Sometimes, we get lucky and I can get some plates of the props or stuntmen on wires that I can composite into the stuntvis.”
TOP AND BOTTOM: A scene from season 2 of Daredevil sees the titular character face off against The Punisher. Fight scenes were choreographed by Philip J. Silvera and regularly involved stuntvis augmentations. (Image copyright © 2016 Netflix)
“I prefer to call stuntvis ‘action design,’ as it involves a lot more than stunts and requires the involvement of many key departments. The entire process includes the development of scene blocking, choreography, stunt testing and rigging, set design, camera placement and movement, lighting setup, VFX breakdown, editing, and prop and weapon conceptualization.” —Yung Lee, Stuntvis Artist
TOP: A final shot from the opening taxi cab chase in Kingsman: The Golden Circle. (Image copyright © 2017 20th Century Fox)
A LARGE PART OF THE PROCESS
“I prefer to call stuntvis ‘action design,’ as it involves a lot more than stunts and requires the involvement of many key departments,” says stuntvis artist Yung Lee, who has worked on films including Kingsman: The Golden Circle and Solo: A Star Wars Story. “The entire process includes the development of scene blocking, choreography, stunt testing and rigging, set design, camera placement and movement, lighting setup, VFX breakdown, editing, and prop and weapon conceptualization.” Lee collaborates regularly with Stunt Coordinator Brad Allan’s Action Design Inc. team, which specializes in designing high-end fight and stunt choreography. “My role involves editing the action sequences from the start of storyboard animatics to the stuntvis edits and finally to on-set editing during the production shoot, which I then pass onto editorial,” says Lee. “One of my key roles is to design/shoot/direct concept action scenes involving a lot of VFX elements. So if we have a piece in a fight scene involving a heavy VFX action beat, I will usually take charge in helping to flesh that concept out. Probably one of the most fun parts of my role is doing the stunt-camera tests – things like getting rigged up on a wire, and being dropped 70 feet down to the ground while falling with the performer and trying to keep him in frame the whole time!” On Kingsman: The Golden Circle, Lee worked closely with fight coordinator Guillermo Grispo to help imagine the diner showdown sequence, a long fight between several characters that plays out in stitched ‘oner’ shots, something the team called ‘hyper-cam’ that was first tested using an iPhone (Imageworks crafted the final visual effects work). “I was responsible for making all the hidden
cuts in that scene work,” notes Lee, “while Guillermo would adjust choreography accordingly to help the stitch work better. Together we created the first pass of that sequence as Brad oversaw the performances. Then we had our on-set camera operator, Chris Cowan, come on board as an additional action designer to help spice up some of the camera moves for the sequence in the later passes.” The Golden Circle also features an elaborate taxi cab chase through the streets of London, full of fast hand-to-hand combat and some seemingly impossible camera moves as the cars race
TOP: Stuntvis helped make parts of the conveyex raid in Solo possible. (Image copyright © 2018 Lucasfilm Ltd.) BOTTOM: Stuntvis incorporating stunt performers was generated for some of the robot Jaeger battles in Pacific Rim Uprising. (Image copyright © 2018 Universal Pictures)
through the streets. (The Third Floor also provided previs of the action, while Framestore delivered the final VFX.) The Action Design Inc. team imagined the taxi chase with a real taxi mock-up and some partial, even cardboard, set pieces and stunt performers in a rehearsal space, which stuntvis then used to help stitch shots, add weapon effects and even sparks to the shots.

TOOLS OF THE TRADE

Stuntvis is regularly delivered on tight deadlines, and often requires quick iterations and constant collaboration between stuntvis artists and editors. For that reason, the usual tool of choice is Adobe After Effects. Clements says he also utilizes Video Copilot’s Element 3D to drop CG models – typically weapons – into shots. “I also have a ridiculous library of blood hits and muzzle flash assets that I’ve compiled over the years,” he adds. “One of my favorite stuntvis pieces was in season 2 of Daredevil when the Punisher was locked up and fought virtually every inmate in the prison. I think I went through every blood asset that I had for that one. It was also fun coming up with a way for the blood to soak into the Punisher’s white jumpsuit to create the Punisher skull logo there.”

Lee is also an advocate for Adobe After Effects, which he uses hand in hand with Adobe Premiere for editing, oftentimes delivering quick on-set composites. A recent addition to his arsenal, however, has been game engines. “I decided to use some Grand Theft Auto V footage I shot in-game in our taxi stuntvis for The Golden Circle,” says Lee. “The game’s graphics are great. I was playing it at the time, and used the editor in the game to quickly and easily block out a shot, which I then comped our taxi into in After Effects. Then in Solo, working with Brad Allan again, I used Unreal Engine to add speeders, skiffs, and the AT-hauler to a scene. It never took very long, generally 20 minutes to an hour for each shot once I got used to it, and the renders came out clean and quick in only a minute or two.

“The ability to quickly create 3D animation with real-time rendering is just extremely useful on set,” continues Lee. “I had used it for moments in the speeder chase [in Solo] as I would animate and render out Moloch’s speeder into some shots very quickly and easily for Brad to see the rough idea of the final shot. Props to ILM for supplying me with the assets on demand. I believe it is going to play a big role in the future of filmmaking.”

“The ability to quickly create 3D animation with real-time rendering is extremely useful on set. ... I believe it is going to play a big role in the future of filmmaking.” —Yung Lee, Stuntvis Artist

Supes on Stuntvis

Visual effects supervisors Rob Bredow (Solo: A Star Wars Story), Dan Glass (Deadpool 2) and Angus Bickerton (Kingsman: The Golden Circle) share how stuntvis became a crucial part of making major VFX shots possible in their films.

Rob Bredow: “For the opening speeder chase, stuntvis involved the stunts team taking test versions of the special effects cars out onto a giant empty tarmac, trying a bunch of different stunts and then cutting that together. I would work closely with Brad Allan, who was the second unit director and the stunt coordinator, and Yung Lee to conceptualize and edit and try different ideas in stuntvis. In some cases, Yung would take the art department’s concept art and comp that quickly as a background to get a sense of where the spaceport was and how it was all going to fit together. Editorial used that as a slug and then, of course, visual effects took it from there.”

Dan Glass: “The director, David Leitch, is a former stunt coordinator, and he and his team at 87Eleven are used to doing a lot of action design. All of the fights went through a stuntvis process of the stunt teams working in a gymnasium or a created space with pads and boxes that mimicked the ultimate stage or location work. Then they would piece all of that together with some crude effects, and those are used as the templates both for Dave to comment on and ultimately for us to plan to and to shoot from. We were even using the stuntvis to cut into the convoy scenes. We used a combination of traditional previs and stuntvis and plates to help piece together the story.”

Angus Bickerton: “The stunt team, led by Brad Allan and Guillermo Grispo, with Yung Lee, love kinetic choreography, but in their stuntvis you always know where you are and what’s happening. That’s one thing that director Matthew Vaughn has a big mantra about as well. He hates just ‘screen chaos’. He wants to know where everything is. He needs to know all the blocking. They came up with hugely inventive fight choreography and inventive camera choreography. It’s amazing what you can do with an iPhone in a warehouse with some cardboard boxes. Along with the previs, the stuntvis really informed our exterior and greenscreen interior taxi shoot.”

“It’s amazing what you can do with an iPhone in a warehouse with some cardboard boxes.” —Angus Bickerton, Visual Effects Supervisor

OPPOSITE TOP: A final shot from the convoy sequence in Deadpool 2, in which Deadpool and Cable fight on a moving prisoner transport truck. (Image copyright © 2018 20th Century Fox) OPPOSITE BOTTOM: A scene from the opening speeder chase on Corellia in Solo. (Image copyright © 2018 Lucasfilm Ltd.) BOTTOM: Director Matthew Vaughn and actor Taron Egerton discuss a scene from the taxi cab chase in Kingsman: The Golden Circle. (Image copyright © 2017 20th Century Fox)
TV
THE UNBRIDLED CREATIVITY OF LEGION By TREVOR HOGG
TOP: Lou Pecora. OPPOSITE TOP: A full digital double was built of Dan Stevens as an older homeless man who gets chopped in half for a 60-frame shot. OPPOSITE BOTTOM: The White Space narrative segments required everything to be rotoscoped so that the background could be digitally replaced with a uniform shade of white.
Even though Zoic Studios VFX Supervisor Lou Pecora has worked on X-Men projects before, such as X-Men: Days of Future Past and X-Men: Apocalypse, the FX show Legion is something else entirely. In Legion, David Haller (Dan Stevens) is a psychiatric patient who learns that he is the son of a powerful mutant and must purge his mind of a malevolent presence known as Amahl Farouk/Shadow King (Navid Negahban).

“It’s so not a superhero show,” explains Pecora. “It’s more like what would happen if somebody really did have a mutant power. They would think that they were crazy and so would everybody else around them. Dan Stevens plays David Haller so sweetly that you feel for him and are cheering for him the whole time. By the end of this season we see David feel like he got burned, hurt and betrayed. [Creator and Showrunner] Noah Hawley is basically telling a villain’s backstory.”

A shift occurred in the visual effects team during production of the second season of the series. “I got on the show as they were shooting the end of episode nine and about to start shooting episode 10,” says Pecora. “I took over supervising all of the visual effects and inherited a few vendors that turned out to be fantastic, like Crafty Apes, Muse VFX and Chicken Bone VFX. It’s a California tax credit show, so we had to use Californian companies as much as possible, which is the reverse of the pressure I’ve been under for the last 10 years. Then we had Floyd County Productions [based in Atlanta] who did some killer hand-drawn animation.”

Over a period of five months, 1,300 visual effects shots were produced. “Because we got started later on Legion than was ideal, previs wasn’t utilized as much as I would have liked,” reveals Pecora. “However, Muse VFX was able to do animatics for the creature in episode six.” Not many assets or effects were carried over from season one.
“There were a few specific effects like these little particulates in the air, a specific energy wave and the way that people get thrown around violently that we referenced from the first season.

“But the thing I found to be the most difficult to deal with is the lack of reuse and amortization of assets and effects. We build these assets and look development effects for one shot or a few shots. You get very little reuse on Legion. For instance, we built a full digital double of Dan Stevens as an older homeless guy that had to hold up medium in frame. He gets chopped in half for a 60-frame shot. That’s it. All that work for 60 frames. There is certainly no getting bored of the ‘same old effect’ all the time.”

“The writing on this show is solid,” notes Pecora. “Noah has a strong vision for where it needs to go, which keeps the curveballs down to a minimum. But sometimes he’ll see something that is getting close to being realized and he’ll say, ‘It would be cool if it was x, y or z.’ Or, ‘This isn’t working. What can we do to supplement it so it is working?’ I’ll give you an example. There’s this submarine that drives around the desert selling donuts. Initially, it looked too plain. They rebuilt the submarine and re-shot the sequence, but there was still something missing. I know that Noah is a fan of the Beatles, so we put this periscope stack on the top to make it look vaguely like the Yellow Submarine. We showed it to Noah, and he loved it. A stack of CG periscopes that is in just six shots.”
“I’ve always felt that this show is like David Lynch directing a ‘70s Saturday morning kids show with Wes Anderson as the production designer and Stanley Kubrick as the DP. There are lot of those types of visuals styles and cues.” —Lou Pecora, VFX Supervisor, Zoic Studios
FALL 2018 VFXVOICE.COM • 31
8/28/18 4:34 PM
TV
THE UNBRIDLED CREATIVITY OF LEGION By TREVOR HOGG
TOP: Lou Pecora OPPOSITE TOP: A full digital double was built of Dan Stevens as an older homeless man who gets chopped in half for a 60-frame shot. OPPOSITE BOTTOM: The White Space narrative segments required everything to be rotoscoped so that the background could be digitally replaced with a uniform shade of white.
30 • VFXVOICE.COM FALL 2018
Even though Zoic Studios VFX Supervisor Lou Pecora has worked on X-Men projects before, such as X-Men: Days of Future Past and X-Men: Apocalypse, the FX show Legion is something else entirely. In Legion, David Haller (Dan Stevens) is a psychiatric patient who learns that he is the son of a powerful mutant and purges his mind of a malevolent presence known as Amahl Farouk/Shadow King (Navid Negahban). “It’s so not a superhero show,” explains Pecora. “It’s more like what would happen if somebody really did have a mutant power. They would think that they were crazy and so would everybody else around them. Dan Stevens plays David Haller so sweetly that you feel for him and are cheering for him the whole time. By the end of this season we see David feel like he got burned, hurt and betrayed. [Creator and Showrunner] Noah Hawley is basically telling a villain’s backstory.” A shift occurred in the visual effects team during production of the second season of the series. “I got on the show as they were shooting the end of episode nine and about to start shooting episode 10,” says Pecora. “I took over supervising all of the visual effects and inherited a few vendors that turned out to be fantastic, like Crafty Apes, Muse VFX and Chicken Bone VFX. It’s a California tax credit show, so we had to use Californian companies as much as possible, which is the reverse of the pressure I’ve been under for the last 10 years. Then we had Floyd County Productions [based in Atlanta], who did some killer hand-drawn animation.” Over a period of five months, 1,300 visual effects shots were produced. “Because we got started later on Legion than was ideal, previs wasn’t utilized as much as I would have liked,” reveals Pecora. “However, Muse VFX was able to do animatics for the creature in episode six.” Not many assets or effects were carried over from season one.
“There were a few specific effects like these little particulates in the air, a specific energy wave and the way that people get thrown around violently that we referenced from the first season. But the thing I found to be the most difficult to deal with was the lack of reuse and amortization of assets and effects. We build these assets and look-development effects for one shot or a few shots. You get very little reuse on Legion. For instance, we built a full digital double of Dan Stevens as an older homeless guy that had to hold up medium in frame. He gets chopped in half for a 60-frame shot. That’s it. All that work for 60 frames. There is certainly no getting bored of the ‘same old effect’ all the time.” “The writing on this show is solid,” notes Pecora. “Noah has a strong vision for where it needs to go, which keeps the curveballs down to a minimum. But sometimes he’ll see something that is getting close to being realized and he’ll say, ‘It would be cool if it was x, y or z.’ Or, ‘This isn’t working. What can we do to supplement it so it is working?’ I’ll give you an example. There’s this submarine that drives around the desert selling donuts. Initially, it looked too plain. They rebuilt the submarine and re-shot the sequence, but there was still something missing. I know that Noah is a fan of the Beatles, so we put this periscope stack on the top to make it look vaguely like the Yellow Submarine. We showed it to Noah, and he loved it. A stack of CG periscopes that is in just six shots.”
“I’ve always felt that this show is like David Lynch directing a ’70s Saturday morning kids show with Wes Anderson as the production designer and Stanley Kubrick as the DP. There are a lot of those types of visual styles and cues.” —Lou Pecora, VFX Supervisor, Zoic Studios
TOP: A blizzard takes place in a Jackson Pollock-inspired painting, with David Haller (Dan Stevens) looking for Sydney Barrett (Rachel Keller), who sits inside an igloo. BOTTOM: Lollipops of different sizes and a toy carousel found on eBay were used to create an episode intro that opens from an aerial perspective.
Key in developing the look of Legion are Production Designer Michael Wylie, Art Director Nick Ralbovsky and Concept Artist Laurent Ben-Mimoun. “The nice thing about this show is the way this team works and that everyone is of the same mind,” states Pecora. “That was proven when we were sitting in meetings talking about these laser weapons that pop up out of the ground. Keith Gordon was directing this episode and asked, ‘What does this weapon look like?’ Nathaniel Halpern, one of the Executive Producers and writers on the show, leaned back, rubbed his chin and replied, ‘If someone was going to design a gun to kill the Beatles… it should look like that.’ Wylie got up and drew a little picture on the whiteboard. ‘Like this?’ he asked. ‘Yeah, like that,’ Nathaniel said. We all snapped pictures of it, and when we showed up on set these things popped out of the ground that looked just like the whiteboard drawing that Wylie did.” “Stanley Kubrick is a big influence for both Noah and Polly Morgan, the DP,” observes Pecora. “I’ve always felt that this show is like David Lynch directing a ’70s Saturday morning kids show with Wes Anderson as the production designer and Stanley Kubrick as the DP. There are a lot of those types of visual styles and cues. This is a dream show for someone trying to get familiar with how things work with gear on set. On a typical show you have a set of primes to use for the shoot, and maybe on an extravagant show you’ll have 15 to 20 lenses that you’ll use through the course of a two-hour feature. In one episode alone on Legion we used 56 lenses plus split and strip diopters, and all kinds of trick camera gear like a Squishy Lens, which I had never seen in person before.” Before Pecora joined Legion, the delusion creature had already been designed. “The infant form was well on its way by the time I got involved. The grown-up form was developed as we worked on the episode. It’s a surreal creature which is its own thing apart from the infant version. There wasn’t a lot of research involved. It’s a delusion monster that comes from an egg. Part of its morphology is based on a chicken, but it certainly surpasses one. It has wider legs, suckers and claws, and a more developed head. When the creature pops out of the back of Ptonomy Wallace [Jeremie Harris] you get a nice overhead shot where its legs spread out and plant on the ground. That was an homage to The Thing, directed by John Carpenter, which is one of my all-time favorite movies.” Another interesting task was producing a singing mouse for episode six, which sings “Slave to Love” by Bryan Ferry. “Muse VFX has some really good animators, lighters and compositors,” remarks Pecora. “The direction from [editor] Regis Kimble kept the animation on track as far as what he was looking for. We kept thinking the mouse looked a bit like a marsupial because a [real] mouse can’t stand up and sing. There had to be some liberties taken in that regard. A very different-looking CG mouse goes into Amahl Farouk’s cell in the finale, and he talks to it. The mouse then goes into Sydney Barrett’s (Rachel Keller) bedroom and whispers into her ear. We shot HDRIs, but I believe they’re only good for a first pass of lighting. You need to get in there and do the diligence to make the creatures and characters look artistically solid.” Legion’s creators prefer the use of old-school stage magic over digital effects whenever possible. “We built a lot of stuff,” notes Pecora. “In episode 10 there’s this atomic structure that pops up around David and traps him. When they talked about building this
TOP: A submarine driving around the desert selling donuts was considered to be too plain. BOTTOM: Periscope stacks were added, inspired by the Beatles’ classic animated film Yellow Submarine.
“There’s this submarine that drives around the desert selling donuts. Initially, it looked too plain. They rebuilt the submarine and re-shot the sequence, but there was still something missing. I know that Noah is a fan of the Beatles, so we put this periscope stack on the top to make it look vaguely like the Yellow Submarine. We showed it to Noah, and he loved it. A stack of CG periscopes that is in just six shots.” —Lou Pecora, VFX Supervisor, Zoic Studios
“The ultimate grandfather of all challenges on this project was the schedule. We’re running against airdates. When you go into a sound mix for an episode that you haven’t got one final visual effects shot in, it can be a nail-biting experience. I can’t thank the sound team enough for their undying patience with us!” —Lou Pecora, VFX Supervisor, Zoic Studios
TOP: A practical atomic-structured cell was constructed and lit with LED lights while Dan Stevens was supported by wirework.
thing of steel, putting LED ribbon lights on the inside and outside doing chase patterns, I thought, ‘We’re going to have to replace it with CG and repaint parts of David.’ They looked at me and said, ‘You’re not going to have to replace anything. It’s going into the show like this.’ You know how many times I’ve replaced things that I’ve heard are going into a show as built? We took a lot of pictures of Dan in case we had to rebuild him. But we didn’t need it in the end. That’s the first time ever for me that something which is supposed to be a high-tech visual effect ended up going into the show as it was built. I’m still pinching myself about that.” A desert battle takes place in the season two finale between David Haller and Amahl Farouk. “We went to the Polsa Rosa Ranch in Acton, California, and shot this whole sequence,” explains Pecora. “Polly was insistent that she didn’t want to shoot it on bluescreen; she wanted the natural light even though we knew that the sky was going to be replaced with storm clouds and giant animations. There are so many benefits to doing it for real as opposed to on a bluescreen. Mark Byers’ awesome special effects team built this track and a platform on a cart. They strapped Dan and Navid on a pair of bicycle seats and floated them across the ground. We had these giant cranes orbiting with Libra Head stabilizers keeping jitter and shake at bay. Camera operator Mitch Dubin was able to do these creative camera moves that looked really cool. You’re not going to get that on a bluescreen stage. I went out there the next day with a crew to shoot clean plates of the desert with all of the tracks removed to be used in the clean-up.” In addition to lots of paint and roto, extensive simulations were required for this sequence. “The clouds had to look real and natural but move in unnatural ways,” explains Pecora. “A supercell normally spins and rotates, but our version approaches and expands. We found that when you’re far away, you’ve got to put more movement in the clouds, and when you’re close up you need to put less movement; otherwise, they look like time-lapse photography.” Creative input was welcomed from everyone on the production team. “We had this situation where Sydney Barrett is walking through this art gallery and there’s a framed greenscreen on the wall. She’s in this Egon Schiele exhibit where David keeps finding her. ‘What is this greenscreen supposed to be?’ I asked editor and producer Regis Kimble. He told me, ‘We are supposed to find Syd sitting inside an igloo in an abstract painting.’ That was the direction that I had to act on. We decided to make our own Jackson Pollock painting. We push into it and all of the layers of the blobs of paint become individual panes that you go through. You whip through the paint and find moving snow which transitions into a snowstorm, then an igloo and finally Syd sitting inside. Compositor Brian Harris really owned that shot.” “For another episode, we got to come up with the intro,” reveals Pecora. “The opening shot had to find its way to Lenny Busker (Aubrey Plaza), David and Oliver Bird (Jemaine Clement) riding on a carousel. Getting to [the carousel] was a little undeveloped when I started on the shot, so I mocked up this idea of a stack of giant lollipops with a carousel in the middle that are all spinning against each other. The idea went over well with show editor Todd Desrosiers, Regis and Noah, so we bought a bunch of lollipops of different sizes and found a toy carousel on eBay for $30. We shot
TOP: The wires were digitally painted out and the lighting made more luminescent to emphasize the levitational abilities of the imprisoned David Haller.
“In one episode alone on Legion we used 56 lenses plus split and strip diopters, and all kinds of trick camera gear like a Squishy Lens, which I had never seen in person before.” —Lou Pecora, VFX Supervisor, Zoic Studios
TOP: Live-action plates were shot at Polsa Rosa Ranch for the dramatic showdown between David Haller and Amahl Farouk (Navid Negahban). BOTTOM: Previs artist Lochlon Johnston was critical in helping to choreograph the battle in the sky sequences.
the pieces individually on a Lazy Susan and comped them together. We showed Noah, and he said, ‘Cool. Can we plant it in the ground somewhere and have cars and traffic driving around it, like the size of the Colosseum or the Pentagon?’ In the end we have this giant lollipop from an aerial perspective and then you’re suddenly into the shot. That is an example of how collaborative the team is on this show, and it is absolutely a blast to be part of.” Complicating matters, Legion is delivered in 4K and high dynamic range, which was especially demanding for the educational segments narrated by Jon Hamm. “That was an unanticipated challenge,” states Pecora. “It’s simply called The White Space. There are all of these different scenes that play out, from a girl on a cellphone, to a bunch of cheerleaders with a twitch disorder that is somehow communicable, to a Salem-style witch hanging, to Farouk fixing his car, which turns into a time machine. The set consists of white curtains with white floors and white walls. We were supposed to ‘just’ clean up the white. Not a problem. However, as soon as you throw the White Space on a high-dynamic-range monitor, you see things that you don’t even see on DPXs for a film-style QC. We ended up having to roto every single thing in the White Space. We replaced the many different shades of white with a uniform one. We did the Red Space in episode six, and that was not much easier for some reason.” “Oddly enough, the weirdest little things are the big challenges,” notes Pecora. “There’s a scene where this video plays out in these pools of water. Rudimentary aquariums were built on top of picnic tables and filled with this blue liquid. I was trying to figure out what that video effect looks like without making this cheesy scene that you’ve seen a million times, with chroma shifting on the edges and distortion or water ripple. We had to come up with something clever in a hurry. The ultimate grandfather of all challenges on this project was the schedule. We’re running against airdates. When you go into a sound mix for an episode that you haven’t got one final visual effects shot in, it can be a nail-biting experience. I can’t thank the sound team enough for their undying patience with us!” In spite of the many challenges and the merciless schedule, working on Legion has been a career highlight for Pecora. “If I could describe the whole process in two words, it would be ‘unbridled creativity.’ Legion is a show that has everyone in all departments firing on all creative cylinders.” He adds, “There are times when you get near to the end of a show and are counting the days for it to be over, but that was not the case with Legion. This was the best time I’ve ever had on anything I’ve worked on in 25 years. I was missing Legion before it was even over!”
PROFILE
VFX SUPERVISOR JOHN NELSON: LOOKING THROUGH THE LENS FOR PERFECTION By BARBARA ROBERTSON
All images courtesy of John Nelson. TOP: John Nelson in 1982 at Robert Abel & Associates.
One foggy night in June, John Nelson was driving near Santa Monica when he spotted a soccer game in the distance. “I pulled over and started taking pictures of the soccer players in the fog, backlit by the lights,” he says. “That’s me with a camera. I’m constantly taking pictures. Thousands. Wherever I am. The camera is part of my body. When I look through the lens, I’m thinking but not thinking. I find the composition and go with it.” He didn’t know then that those photos would help him land the job as overall Visual Effects Supervisor for Blade Runner 2049, which in turn would lead to an Oscar and a BAFTA Award for visual effects, among other honors. “At the end of working on Point Blank, the producers asked me if I’d be interested in their next movie, a sequel to Blade Runner,” Nelson says. “I said, ‘Are you kidding me?’ To offer that to a visual effects supervisor who loves science fiction – it’s the Holy Grail. I had to meet Denis [Villeneuve, director]. When I did, I showed him [the photos of] the soccer players that I had shot at night. We hit it off, and we made the movie.” Foggy photos aside, Nelson also brought to the table a 35-year award-winning career as a feature film visual effects supervisor, a career during which he has received three Oscar nominations and an Oscar for supervising Ridley Scott’s Gladiator. Raised in a Detroit suburb, Nelson was obsessed with cinema and movies as a child. He even worked as a head usher in a movie theater as a teenager. “I was pulled to cinema and movies like a moth to the flame,” he says. But when he enrolled at the University of Michigan, he was encouraged to enter as a pre-med student by his father, a research chemist in the auto industry, and his mother, a nurse. After two years, he told his father he wanted to change majors and study filmmaking. “He said, ‘You’ll never make a dime. I’ll not pay for it,’” Nelson says. 
Undaunted, Nelson applied for and received a full scholarship and, because the university didn’t have a film degree, crafted a program himself with art, speech and television production classes. He was on his way. Nelson became an apprentice to a cameraman in the university’s television center after graduating, and began making 16mm films. “I submitted the films to festivals and won some awards, so I decided to go to California and look for a job,” he says. When he arrived, one of the people he called was Douglas Trumbull, VES. He somehow convinced Trumbull’s assistant to put the call through. “I talked to him for 15 or 20 minutes, and he was very helpful,” Nelson says. “He asked me if I was in the camera union. When I told him I wasn’t, he said I should talk to Robert Abel.” That was in 1979. Robert Abel & Associates was an innovative and award-winning production studio specializing in television commercials, and a pioneer in the use of computer graphics. When Nelson called, they happened to need a cameraman. They told him if he could show up the next day and load a particular camera, they could use him. He did, and was hired.
“I moved from Detroit to L.A. with my girlfriend, now my wife, and became a cameraman on the night shift,” Nelson says. “After two, two and a half years, I said to Bob Abel, ‘You know, I find this computer stuff interesting.’ It was just the beginning of vector graphics coming out of aerospace and architecture. Bob was a quick decision maker. He said, ‘OK. Here’s a job due in two weeks. We’ll see how you do.’” It was a commercial for American Airlines. Nelson became the technical director, and the commercial was a success. After that, he worked on hundreds of TV commercials at Abel, sharing a cubicle with John Hughes, who would found Rhythm & Hues, and working on award-winning commercials with Steve Beck. In 1987, after eight years at Abel as technical director, animator and cameraman, Nelson received a job offer that took him to Germany for two years. “I was asked to help set up the company Mental Images,” he says. Mental Images was a Berlin-based software company that created the Mental Ray rendering software. The company also had a computer animation division that used the software for production. Visual effects supervisors John Andrew Berton Jr. and Stefen Fangmeier also worked in the animation division at that time. “My wife and I lived in Europe for two years,” Nelson says. “It was a wonderful experience, but a bit of a culture shock. I was used to the way we made decisions in L.A., which was pretty quick. One of the wonderful things about the people in Germany is that they’re careful, meticulous and very deliberate. I worked my ass off.” When he came back to the U.S., he took a job at Industrial Light & Magic, first in the commercials division and then in the feature film division. “I went from being a commercial director to an animator and
“I was pulled to cinema and movies like a moth to the flame.” —John Nelson TOP: Ridley Scott, Giannina Facio and John Nelson at the 2018 BAFTA Awards. BOTTOM: From left: Richard Hoover, Paul Lambert, Gerd Nefzer and John Nelson at the 2018 Academy Awards.
“I was pulled to cinema and movies like a moth to the flame.” —John Nelson TOP: Ridley Scott, Giannina Facio and John Nelson at the 2018 BAFTA Awards. BOTTOM: From left: Richard Hoover, Paul Lambert, Gerd Nefzer and John Nelson at the 2018 Academy Awards.
PROFILE
TOP: Nelson says, “What goes on in my mind goes on the white board and is fixed asap.” BOTTOM: Shooting boards for Blade Runner 2049 on set.
technical director, but it was for features,” Nelson says. The first feature was the visual effects Oscar-winning Terminator 2, supervised by Dennis Muren, VES, ASC, which was released in 1991. Among the other animators working on the film was Stephen Rosenbaum, who competed with Nelson for an Oscar this year. “I worked with maybe 15 or 16 people, many of whom have gone on to become visual effects supervisors,” Nelson says. “We all did just about everything – model, animate, light, render, composite. I loved working on that show.” He worked on several shots for T2, including, notably, the iconic scene in which the shotgunned head of the chrome terminator splits open and reseals. After T2, Nelson moved back down to Los Angeles, where he spent a brief time at Rhythm & Hues. He then joined Sony Pictures Imageworks, where he stayed for several years working as a visual effects supervisor on, as he puts it, whatever they had, including In the Line of Fire, The Pelican Brief and Anaconda. “I learned a lot, but after City of Angels it was time to move on,” he says. “So I interviewed with Ridley Scott for Gladiator. It came down to Scott Anderson, Mike Fink or me. Scott [Anderson] and I had met at ILM – he and I had been hired for T2 on the same day. He was offered Paul Verhoeven’s Hollow Man, so I ended up doing Gladiator and winning the Oscar along with Tim Burke and Rob Harvey of Mill Film, as well as Special Effects Supervisor Neil Corbould.” “There were some brilliant people at Mill Film: Tim, Rob, Grahame Andrew, Ben Morris of ILM, who I was up against for the Oscar this year, and others,” Nelson says. “But it was a small house used to doing commercials. So I helped the process of trying to get a large film done.” The studio’s recreation of actor Oliver Reed – after he died during production – by mapping a CG mask of Reed’s face onto a body double received much publicity, but the complex work involved in creating the opening battle shot is just as interesting to Nelson. 
“It’s a big pan from left to right that we shot in VistaVision late in the day,” he says. “We had 300, maybe 500 soldiers and could afford to build only one or two catapults. So we shot the left side, middle, and right side with locked off plates, shooting quickly. Then, we sewed all these into one big pan cell, took out the lens
“In all three of these movies, Gladiator, I, Robot and Iron Man, there is a big idea that visual effects has to answer. For Gladiator, you have to believe in the overwhelming technical superiority of Rome. … For I, Robot, you have to believe a robot can think and emote. And for Iron Man, you have to believe a thousand-pound suit can fly. ... Every shot had to reinforce that big idea.” —John Nelson
distortion, created a virtual movie inside, and then added lens distortion and a bunch of CG effects. It looks like a big moving shot. When one catapult fires, the CG takes it all the way into the forest where they hook up with Neil Corbould’s propane explosions – also shot locked off in tiles. I wanted to use many 2D photographic effects in the first act of the film to acclimate the audience with photographic realism before we introduced 3D CGI when we got to Rome.” After Gladiator, Nelson took time off to be with his ailing father. After his dad passed, he was a visual effects supervisor for K-19: The Widowmaker, a visual effects supervisor at CFX for The Matrix Reloaded and The Matrix Revolutions, and did a stint as visual effects supervisor on The Sorcerer’s Apprentice. “I also did some second unit stuff for The Sorcerer’s Apprentice, which got me in the DGA,” he says. Nelson’s next big films as overall Visual Effects Supervisor were
TOP: John Nelson and Joe Wehmeyer on set shooting HDRIs on Blade Runner 2049. BOTTOM: Nelson on set on Blade Runner 2049 lining up the women for “The Merge.”
“Denis [Villeneuve] said he wondered if it was a good idea to do a sequel [to Blade Runner] because the first was so impressive and bold. But he was certainly the man to do it. He is a brilliant painter of cinematic visuals in the same way Ridley [Scott] is. Two master painters looking at the same original material.” —John Nelson
“What gets me going is when I have a hard problem and I can marshal all the technique at my disposal and all I’ve learned from my experience to come up with an elegant, creative solution that gives a director what he wants.” —John Nelson TOP LEFT: Nelson on Blade Runner 2049: “This is one of the first pictures I showed Denis Villeneuve when I interviewed for the job.” TOP RIGHT: Nelson, Ridley Scott and Production Designer Arthur Max at the Napoleonic fort used to make the Colosseum for Gladiator. (Photo: John Mathieson). BOTTOM: Still of scout of Bourne Woods set on Gladiator. Nelson says, “This view was eventually done by sewing three VistaVision plates together, adding a ton of CG to them and then doing a pan across a stitched plate.”
I, Robot and Iron Man, and he received Oscar nominations for both films. “In all three of these movies, Gladiator, I, Robot and Iron Man, there is a big idea that visual effects has to answer,” he says. “For Gladiator, you have to believe in the overwhelming technical superiority of Rome. We’ve learned so much from the Romans in terms of architecture, government, economics, and even sporting venues. Every shot had to reinforce that big idea.” “For I, Robot, you have to believe a robot can think and emote,” he continues. “And for Iron Man, you have to believe a thousand-pound suit can fly. Tony Stark is not from another planet, he’s not magically defying the rules of physics. He’s a smart rich guy who built a thousand-pound suit with enough thrust to put him in the air.” Following Iron Man, Nelson was visual effects supervisor for a while on both World War Z and Blackhat, but neither project went well. After the highs of the previous two films, Nelson had hit a low period. He decided to take a year off. “My wife describes it as the best year in 20,” he says. “I think one of the several reasons we’ve made it together is that she has worked in the business (at Boss Film) and knows what it’s like.” “Filmmaking is a really, really hard job because it’s creative,
technical, political, and certainly economic as well,” he adds. “You try to balance all those things together. When people leave movies, they often say it’s for creative reasons, but usually it’s just personalities. I try to be as open and collaborative as I can be, but sometimes, for whatever reason, it just doesn’t work out. When it doesn’t, you learn from the experience and move on. Fortunately, most of the shows I work on turn out well, and sometimes, they work out wonderfully.” And that brings us to Blade Runner 2049. After the year off, Nelson decided to work on a relatively low-budget movie called Point Break. “I called up some friends and said that this is a small movie, but it could be cool,” he says. “We did 1,300 VFX shots for $15 million. It was absurdly inexpensive for that many good-looking shots. The producers were Alcon Entertainment, and at the end they asked me if I would be interested in their next movie: a sequel to Blade Runner.” “It was an incredible trip,” Nelson says. “It was so difficult. It consumed my life and my being for two years. Denis [Villeneuve] said he wondered if it was a good idea to do a sequel because the first was so impressive and bold. But he was certainly the man to do it. He is a brilliant painter of cinematic visuals in the same way Ridley [Scott] is. Two master painters looking at the same original material. We tried to make a movie with an analog feel. Part of my job is to understand what a director wants, so I didn’t show Denis anything that didn’t look analog. I think we made a better film by reining the VFX in. It felt intuitive. I’m far too close to this movie to be objective, but people say it feels like I went to a place where we can feel the connection to the old place. 
And that felt like a high compliment.” Now, with a second Oscar facing his first on the mantel in the living room, Nelson plans to take a little time off to enjoy his accomplishments and his family – his wife and son Miles, who also worked on Blade Runner. “I promised my wife that after two movies in a row, I would give her six months,” he says. “There are a couple projects out there that I would love to do, but we will see.” As for those Oscars: “Every day we put them in different poses,” Nelson says. “I have all these action figures and wooden pose men.
TOP LEFT: Says Nelson, “To populate the Colosseum, we shot extras against green wearing blue fabric so we could pull both the outline of the extra and the color of their wardrobe.” TOP RIGHT: Nelson on the set of Iron Man with Robert Downey Jr. and Shane Mahan. BOTTOM: Nelson, Robert Downey Jr. and Victoria Alonso on the last day of Iron Man.
“People think visual effects is about tons and tons of detail. It’s not about that. It’s about how it makes you feel emotionally. That’s what it is for me – to find that emotional perfect zone that gives the director what he wants, and to find inside what I think is special. To craft that and make it intuitive.” —John Nelson
TOP: Nelson says, “My best shot from Terminator 2.” BOTTOM: Nelson in his office on Point Break. “To get it done requires the calendar with Post-its. Things can move, but deadlines remain the same.”
Recently I had a pose man put his arm around new Oscar as if he was introducing him to old Oscar.” “I feel very blessed,” he adds. “I’m honored to work in the movie business, which I think is a privilege. People in this business really love what they’re doing. It’s not a paycheck, it’s a passion. What gets me going is when I have a hard problem and I can marshal all the technique at my disposal and all I’ve learned from my experience to come up with an elegant, creative solution that gives a director what he wants. People think visual effects is about tons and tons of detail. It’s not about that. It’s about how it makes you feel emotionally. That’s what it is for me – to find that emotional perfect zone that gives the director what he wants, and to find inside what I think is special. To craft that and make it intuitive.” Nelson is always looking through the lens for that “special” quality. On his personal website are two strong images: a striking, abstract photograph with bright streaks forming patterns and perhaps a face against a dark background, and an image from Blade Runner 2049 of holographic Joi in a fuchsia fog. Of the photograph, he says, “I shot a bunch of Christmas lights in my front yard and they look like sparks. I’m taking photographs or movies on my iPhone wherever I go. I take these runs, two miles to the beach, two miles on the beach, then two miles back. It’s my meditative place. But if I see something, I look through the lens, move through this intuitive zone to find the perfect spot, get it and move on. I do the same thing with a viewfinder. Coming from camera has been very good for me.”
TOP LEFT: Nelson and Deborah Gaydos at the 2018 BAFTA Awards. TOP RIGHT: Extras, Nelson and Roger Deakins shoot a test on Blade Runner 2049. BOTTOM RIGHT: Nelson walking 18% gray and shiny balls with Macbeth chart and real zombie reference through the shot for lighting reference on World War Z.
COVER
THE WORLD OF THEME PARKS AND VFX By CHRIS McGOWAN
TOP: Harry Potter and the Forbidden Journey at The Wizarding World of Harry Potter at Universal Studios Florida. (Image courtesy of Universal Studios Florida.) BOTTOM: Avatar: Flight of Passage at Disney’s Animal Kingdom. (Image courtesy of the Walt Disney Company)
Not so long ago, in a galaxy not so far away, the first guests entered the Star Wars universe with the help of motion simulation and a first-person-perspective film created by George Lucas, Dennis Muren, VES and Industrial Light & Magic (ILM). Star Tours debuted in 1987 at Disneyland and was a groundbreaking ancestor of recent theme park attractions like Avatar: Flight of Passage, King Kong 360 3-D and Pirates of the Caribbean: Battle for the Sunken Treasure, rides that incorporate ultra-high frame-rate, ultra-high-resolution digital imagery, motion simulation and/or sensory stimuli. Such attractions place guests in the middle of the narrative, usually one based on a film franchise. It is an evolving form of entertainment – a generally short and intense storytelling mode – that offers media companies an additional way to leverage IP as well as create original content. This hybrid form is bringing VFX artists and theme park specialists together in ever-increasing numbers to provide visual effects and storytelling for attractions as well as the ride queues leading up to them. Theme park designers have realized that “flat rides” need to adapt to today’s audiences, which generally have a short attention span and are interested in more immersive fare. Walt Disney Imagineering (WDI) and Lightstorm Entertainment (co-founded by James Cameron) teamed with Weta Digital to create Avatar: Flight of Passage for Disney’s Animal Kingdom; Universal Creative worked with Weta on King Kong 360 3-D for Universal Studios Hollywood; WDI collaborated with ILM on Pirates of the Caribbean: Battle for the Sunken Treasure for Shanghai Disneyland Park; and Sally Corporation worked with Pure Imagination Studios on numerous Justice League: Battle for Metropolis rides for multiple Six Flags parks. Framestore, Pixomondo and The Third Floor are among the
other VFX/animation companies working with theme park ride designers. ILM has also joined forces with The Void on Star Wars: Secrets of the Empire, a location-based VR experience. The attraction is not, strictly speaking, situated in a theme park, and it clocks in at 25 minutes as opposed to the few minutes for most “dark rides,” but it hints at future possibilities both in and out of parks. Dark rides with CGI are a crucial part of today’s theme parks, which are thriving. The top 10 theme park operators drew in 475.8 million visitors in 2017, an increase of 8.6% over the previous year, according to a report from TEA (Themed Entertainment Association) and engineering firm AECOM. Walt Disney Attractions was No. 1 on the list with 150 million visitors, twice that of the No. 2 company, England-based Merlin Entertainments. Universal Parks and Resorts was next, followed by three Chinese companies – OCT Group, Fantawild and Chimelong Group – and then Six Flags Entertainment Corporation. The top 25 worldwide destinations expanded attendance by 4.7%. Disney dominates the list, with the three most popular locations and eight of the top 10. The TEA/AECOM report forecasts that China will be the largest theme park market by 2020, driven by Shanghai Disneyland (which had its first full year in 2017), Universal Studios Beijing (set to open in 2020), the new all-VR theme park, Oriental Science Fiction Valley, and other new parks. The entities that design CGI-charged rides – like WDI, Universal Creative, Sally Corporation (based in Jacksonville, Florida), Falcon’s Creative Group (Orlando, Florida), The Hettema Group (Pasadena, California), Thinkwell Group (Los Angeles), and ITEC Entertainment (Orlando, Florida) – team with VFX people and firms on projects that often require a lot of manpower and processing power. For example, Pirates of the Caribbean: Battle for
TOP: Avatar: Flight of Passage at Disney’s Animal Kingdom. (Image courtesy of the Walt Disney Company) BOTTOM: Pirates of the Caribbean: Battle for the Sunken Treasure at Shanghai Disneyland (Photo: Ryan Wendler. Image courtesy of the Walt Disney Company)
FALL 2018 VFXVOICE.COM • 47
8/23/18 12:17 PM
COVER
THE WORLD OF THEME PARKS AND VFX By CHRIS McGOWAN
TOP: Harry Potter and the Forbidden Journey at The Wizarding World of Harry Potter at Universal Studios Florida. (Image courtesy of Universal Studios Florida.) BOTTOM: Avatar: Flight of Passage at Disney’s Animal Kingdom. (Image courtesy of the Walt Disney Company)
46 • VFXVOICE.COM FALL 2018
PG 46-57 THEME PARKS.indd 46-47
Not so long ago, in a galaxy not so far away, the first guests entered the Star Wars universe with the help of motion simulation and a first-person-perspective film created by George Lucas, Dennis Muren, VES and Industrial Light & Magic (ILM). Star Tours debuted in 1987 at Disneyland and was a groundbreaking ancestor of recent theme park attractions like Avatar: Flight of Passage, King Kong 360 3-D and Pirates of the Caribbean: Battle for the Sunken Treasure, rides that incorporate ultra-high frame-rate, ultra-high-resolution digital imagery, motion simulation and/or sensory stimuli. Such attractions place guests in the middle of the narrative, usually one based on a film franchise. It is an evolving form of entertainment – a generally short and intense storytelling mode – that offers media companies an additional way to leverage IP as well as create original content. This hybrid form is bringing VFX artists and theme park specialists together in ever-increasing numbers to provide visual effects and storytelling for attractions as well as the ride queues leading up to them. Theme park designers have realized that “flat rides” need to adapt to today’s audiences, which generally have a short attention span and are interested in more immersive fare. Walt Disney Imagineering (WDI) and Lightstorm Entertainment (co-founded by James Cameron) teamed with Weta Digital to create Avatar: Flight of Passage for Disney’s Animal Kingdom; Universal Creative worked with Weta on King Kong 360 3-D for Universal Studios Hollywood; WDI collaborated with ILM on Pirates of the Caribbean: Battle for the Sunken Treasure for Shanghai Disneyland Park; and Sally Corporation worked with Pure Imagination Studios on numerous Justice League: Battle for Metropolis rides for multiple Six Flags parks. Framestore, Pixomondo and The Third Floor are among the
other VFX/animation companies working with theme park ride designers. ILM has also joined forces with The Void on Star Wars: Secrets of the Empire, a location-based VR experience. The attraction is not, strictly speaking, situated in a theme park, and it clocks in at 25 minutes as opposed to the few minutes of most “dark rides,” but it hints at future possibilities both in and out of parks.

Dark rides with CGI are a crucial part of today’s theme parks, which are thriving. The top 10 theme park operators drew 475.8 million visitors in 2017, an increase of 8.6% over the previous year, according to a report from TEA (Themed Entertainment Association) and engineering firm AECOM. Walt Disney Attractions was No. 1 on the list (with 150 million visitors, twice that of the No. 2 company, England-based Merlin Entertainments). Universal Parks and Resorts was next, followed by three Chinese companies – OCT Group, Fantawild and Chimelong Group – and then Six Flags Entertainment Corporation. The top 25 worldwide destinations expanded attendance by 4.7%. Disney dominates the list, with the three most popular locations and eight of the top 10. The TEA/AECOM report forecasts that China will be the largest theme park market by 2020, driven by Shanghai Disneyland (which had its first full year in 2017), Universal Studios Beijing (set to open in 2020), the new all-VR theme park Oriental Science Fiction Valley, and other new parks.

The entities that design CGI-charged rides – like WDI, Universal Creative, Sally Corporation (based in Jacksonville, Florida), Falcon’s Creative Group (Orlando, Florida), The Hettema Group (Pasadena, California), Thinkwell Group (Los Angeles) and ITEC Entertainment (Orlando, Florida) – team with VFX people and firms on projects that often require a lot of manpower and processing power. For example, Pirates of the Caribbean: Battle for
TOP: Avatar: Flight of Passage at Disney’s Animal Kingdom. (Image courtesy of the Walt Disney Company) BOTTOM: Pirates of the Caribbean: Battle for the Sunken Treasure at Shanghai Disneyland (Photo: Ryan Wendler. Image courtesy of the Walt Disney Company)
COVER
TOP: Justice League: Battle for Metropolis at Six Flags Parks. (Image courtesy of Sally Corporation) BOTTOM: Star Wars: Secrets of the Empire, a location-based entertainment experience that uses VR and is highly interactive, was created by ILMxLAB and The VOID. (Image courtesy of ILMxLAB and The VOID) OPPOSITE TOP: Guardians of the Galaxy – Mission: Breakout! at Disney California Adventure. (Image courtesy of the Walt Disney Company) OPPOSITE MIDDLE AND BOTTOM: The Dream of Anhui “flying theater” attraction for the Wanda Hefei Movie Park in Anhui, China, in both the 360-degree view and traditional HD. (Image courtesy of Tippett Studio)
the Sunken Treasure had more media data than one and a half feature films, according to WDI. And “an estimated 300 people worked for Weta on Avatar: Flight of Passage over the course of the project,” according to Thrain Shadbolt, Weta Digital VFX Supervisor.

“We’re definitely seeing more crossover,” says Justin Yu, Lionsgate Entertainment Group’s Director of Global Live & Location Based Entertainment. “People come to theme parks to enter and engage with worlds that are not their own. Technology is a means to that end, so VFX and video need to be seamlessly integrated into the environment and the overall experience in order to be truly effective. When they’re incorporated correctly, they’re two of the most powerful tools we have to bring new worlds to life.”

“As the theme park owners began to realize the limitations of their hardware rides, their focus for potential attractions shifted to the unlimited possibilities of media-driven experiences, and, in turn, they began hiring VFX companies who could apply the talents of their feature film artists to the creation of more immersive experiences for individual attractions,” observes David Garber, Executive Producer of Themed Entertainment for Pixomondo, which has various ongoing projects in the Middle East, Asia and the U.S.

“If [theme parks] want repeat visitors, then they need to provide a variety of exciting experiences, creating state-of-the-art rides as well as the more conventional ones,” comments Tippett Studio President Jules Roman. “They are always very careful in their overall cost analysis, and visitor throughput is most definitely a factor. Although just looking at the three-to-four-hour wait for the state-of-the-art Flight of Passage ride shows you the appetite for exciting, unique, immersive experiences is insatiable.”

“There is an explosion of activity in the theme park world with demands for high-end digital imagery and, of course, that means more VFX,” adds Roman.
“So yes, more companies are getting involved.”

Tippett Studio created the “flying theater” attraction Dream of Anhui for the Wanda Hefei Movie Park in Anhui, China. It featured 6K, 48 fps large-format imagery and sensory effects (scents and wind), and was nominated for a 2017 VES Award. The firm was founded in 1984 by animation pioneer Phil Tippett, VES, who worked on the Star Wars trilogy, Jurassic Park and other notable films.

“There’s no question that media and VFX are being used more in rides and attractions, both as an augmentation of traditional attraction types and, more significantly, a whole new generation of media-based experiences. Talented VFX creators are increasingly an important part of attraction teams,” comments industry veteran Phil Hettema. “However, it’s not just the decision to include VFX in experiential storytelling that’s game-changing. The quality and execution of the VFX make a huge difference, and we need creative talent to make that possible.”

Hettema worked with Universal Creative on landmark rides such as Back to the Future: The Ride (Universal Studios Florida) and The Amazing Adventures of Spider-Man (Universal’s Islands
of Adventure) before creating his own company, The Hettema Group, in 2002. “I was so fortunate to serve as one of the producers of that seminal attraction,” he recalls about Back to the Future. “I think one of my biggest contributions was in the early mock-up and test phases, where we really tested to understand the possibilities and constraints of that kind of large-format simulation experience and the kinds of ‘movement’ which would be most powerfully immersive. Of course, getting to work with Douglas Trumbull, VES, and his brilliant filmmaking and VFX solutions was a thrill.” [Trumbull was Special Effects Supervisor for 2001: A Space Odyssey, Blade Runner and other classics.]

“The Adventures of Spider-Man really set the bar for a new generation of attractions that use CGI to create an experience we had never seen before,” says Hettema. “Following that, the Harry Potter rides as well as Disney Shanghai’s Pirates of the Caribbean attraction and Animal Kingdom’s Avatar: Flight of Passage have really continued to push the envelope, pulling us closer and closer into a fully immersive world.” He adds that the latter was a “breakthrough ride. The ride immerses you in the Pandora world as depicted in the sensational movie. The imagery in the ride is sophisticated, detailed and full of life. It’s an example of what can be created when enormous attention is paid to the visual imagery and a top-notch VFX company is hired to create the work.”

The addition of CGI can also help bring new life to an outdated ride. A notable example is Disney California Adventure’s clever conversion of the Twilight Zone: Tower of Terror (a drop-tower dark ride) into Guardians of the Galaxy – Mission: Breakout!, which opened in 2017. Original 4K, 120 fps video of the Guardians cast syncs with the elevator’s jolting movements, an amusing narrative and lively 1970s-era pop to create a unique thrill ride.
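Rides like this lock a fixed-frame-rate video master to the vehicle's motion program, so cueing comes down to simple frame arithmetic. The sketch below is purely illustrative – it is not WDI's or Framestore's actual cueing system, and the event names and timestamps are invented – but it shows why the 120 fps figure matters for sync precision:

```python
# Hypothetical sketch of ride-media sync: map motion-profile event times
# onto frame indices of a fixed-frame-rate video master.
# (Illustrative only -- event names and timings below are invented.)

FPS = 120  # frame rate cited for the Guardians media

def frame_for_time(seconds: float, fps: int = FPS) -> int:
    """Return the index of the video frame on screen at a given ride time."""
    return int(seconds * fps)

# Invented motion-profile cue points (seconds into the ride cycle):
events = {"doors_close": 2.0, "first_drop": 8.5, "final_launch": 41.25}

cues = {name: frame_for_time(t) for name, t in events.items()}
# A drop cued at 8.5 s lands on frame 1020 of the 120 fps master.
```

At 120 fps each frame spans about 8.3 ms, versus 33 ms at a conventional 30 fps, so a motion event can be matched to the picture roughly four times more tightly.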
WDI worked with Framestore and others on Guardians, which won a 2018 THEA Award for Outstanding Achievement – Attraction Reimagining. Framestore Creative Director Ben West comments, “The ride utilizes a moving-eyepoint technique that simulates depth in a non-stereoscopic way. This allows the audience to feel a part of the action, present in the same space as the Guardians. Unlike film, every detail is seen in focus with no lensing effect. This required a greater attention to detail, using lighting and composition for a more theatrical approach to depth and staging.” He adds, “The chance to be a part of the Marvel Cinematic Universe and go on a ride with the Guardians is a dream come true for any fan. The ride is successful because it allows people to experience the thrills and spills of Guardians of the Galaxy in a seamless extension of the film world.”

The Star Wars franchise is currently represented by Star Tours: The Adventures Continue, a 2011 upgrade of the original ride that is now present in four Disney parks, as well as Star Wars: Secrets of the Empire. The latter is a location-based entertainment experience that uses VR, is highly interactive and runs far longer than a typical theme park attraction. Created by ILMxLAB (Lucasfilm’s immersive entertainment and VR laboratory) and The Void (which focuses on mixed-reality experiences and is based in Lindon, Utah), Star Wars: Secrets of the Empire was nominated in the 2018
COVER
“As the theme park owners began to realize the limitations of their hardware rides, their focus for potential attractions shifted to the unlimited possibilities of media-driven experiences, and, in turn, they began hiring VFX companies who could apply the talents of their feature film artists to the creation of more immersive experiences for individual attractions.” —David Garber, Executive Producer, Themed Entertainment, Pixomondo
BOTTOM: The Amazing Adventures of Spider-Man at Marvel Super Hero Island at Islands of Adventure at Universal Studios Florida. (Image courtesy of Universal Studios Florida)
VES Awards category of Outstanding Visual Effects in a Special Venue Project. Four participants at a time don Oculus HMDs and haptic vests and work together to retrieve intelligence, solve puzzles and, of course, blast stormtroopers. Ceiling-mounted motion-tracking cameras read the user’s movements. There are sensory stimuli (the rumble of a transport, the heat and smell of lava) and interactivity (the opportunity to hold and shoot a blaster). Star Wars: Secrets of the Empire (tickets currently cost $29.95-$36.95 per person) is situated beside theme parks (next to Disney Springs in the Walt Disney World Resort and Downtown Disney next to Disneyland), as well as in the Venetian (Las Vegas) and the Glendale Galleria (a mall in Glendale, California). Nine additional locations are in the works.

“Immersive location-based experiences represent new ways for visitors to actively engage with stories,” comments Vicki Dobbs Beck, Executive in Charge at ILMxLAB. “Fans have long wanted to step into the world of Star Wars – not just to observe, but to engage. Star Wars: Secrets of the Empire provided a way for people to experience the lava planet Mustafar – the heat, the danger and the mystery. We wanted to invite people on a shared mission in which they became a fundamental part of the story.”

She continues, “Immersive entertainment, VR in particular, makes it possible not just to see a location, but to be in that location interacting with characters. No longer are you viewing
a world on a screen. Now, the world of the story is all around you. You are at the center of your adventure and that can ultimately lead to deeper engagement and more lasting impact.”

Scale may well determine the use of VR and AR in parks. “Theme parks go to great lengths to create a fantasy in physical form – one generally unencumbered by devices like VR headsets. Theme parks may one day be a wonderful opportunity to use mixed-reality glasses to overlay digital elements such as flying spaceships that would otherwise have been impossible to create. In essence, the digital elements would ‘complete the magic.’ In the case of a hyper-reality experience like Star Wars: Secrets of the Empire, the experience is much more self-contained and can more easily be scaled outside theme parks,” comments Beck. ILMxLAB is also collaborating with Walt Disney Imagineering on a Millennium Falcon ride for the Star Wars: Galaxy’s Edge lands in Disneyland and Disney World.

About the use of VR, Curtis Hickman, Chief Creative Officer and Co-founder of The Void, comments, “Over time you will see this sort of technology being used more and more as it really does provide the solution to a market that demands more immersive experiences, both in LBE [location-based entertainment] and theme park attractions.”

The longstanding synergy of theme parks and the movies, pioneered by Disneyland, has only continued to expand. The Disney and Universal parks are the leaders in this regard, of course. And Warner Bros. Movie World (Australia’s Gold Coast), Twentieth Century Fox World (set to open in Malaysia) and multiple park zones tied to Lionsgate Entertainment Group are other examples of studios moving into theme parks.

“We are fortunate to have multiple blockbuster franchises in our portfolio, including The Hunger Games, Twilight and Saw, that have incredibly strong and supportive fan bases,” says Jenefer Brown, Lionsgate SVP of Global Live and Location Based Entertainment.
“By connecting these properties with theme parks, we can continue to engage and interact with fans, deepening their experiences beyond the screen and immersing them in the world of the films. Creating compelling and authentic attractions for a property expands the lore, builds upon our existing fan base, and generates additional revenue for the studio.”
TOP LEFT: The Amazing Adventures of Spider-Man at Marvel Super Hero Island at Islands of Adventure at Universal Studios Florida. (Image courtesy of Universal Studios Florida) TOP RIGHT AND BOTTOM: Panem Aerial Tour is a hovercraft-motion-simulator attraction that takes guests on a ride to and through the Capitol at “The World of The Hunger Games” at Motiongate Dubai. (Image courtesy of Lionsgate)
COVER
TOP: Skull Island: Reign of Kong, a ride at the Islands of Adventure theme park at Universal Studios Florida. (Image courtesy of Universal Studios Florida) BOTTOM LEFT: Star Journey is a motion-based simulator ride at Wanda Entertainment Center in Wuhan, China. (Image courtesy of Lionsgate and Pixomondo) BOTTOM RIGHT: Purple Heaven Palace in Hubei in the Air, a domed-screen flying ride, at Wanda Entertainment Center in Wuhan, China. (Image courtesy of Lionsgate and Pixomondo)
She adds, “Our film franchises lend themselves well to theme parks because their stories establish worlds fans want an opportunity to explore in real life. While our films appeal to a diverse, four-quadrant audience, it is worth noting that pre-teens, teens and young adults – key demographics for theme parks – form especially strong connections with our franchises, which make them a natural fit for themed attractions.

“VFX are the key component to the successful design of immersive media-based attractions. CGI is one of the most powerful tools we have available to us to bring environments to life in theme park attractions in a cost-effective and authentic way,” comments Brown.

Lionsgate will leave its paw print in various lands and parks: The World of The Hunger Games (Lionsgate Zone, Motiongate Dubai, UAE) in 2018; Lionsgate Entertainment World (Novotown, Hengqin, China) in 2019; Lionsgate Entertainment City (multiple locations in the U.S. and Europe) in 2019; and Lionsgate Movie World (Jeju Shinhwa World, South Korea) in 2020.

“All of our deals to date have been licensing deals with no risk to the studio. In the future we will also explore the idea of investing and increasing our upside if the opportunity makes sense,” comments Brown.
The Lionsgate lands are the latest theme park zones devoted to a particular movie (“blockbuster worlds”). This “single-franchise approach” owes much to a certain boy wizard. The Wizarding World of Harry Potter opened in 2010 at Universal’s Islands of Adventure (with an exclusive license from Warner Bros. Entertainment) and was a tipping point. Guests entered, explored and inhabited an entire park area devoted to the J.K. Rowling books, and it proved enormously successful. While there, they could take the Harry Potter and the Forbidden Journey and Harry Potter and the Escape from Gringotts rides (the latter opened in 2014).

Recently, other “themed areas” in Disney and Universal parks have opened or are coming soon, tied to Cars, Toy Story, Frozen, Shrek, Madagascar, Avatar and Star Wars, among other properties. These lands immerse guests to a certain degree, and then their CGI-infused attractions enhance participation with characters and stories.

Lionsgate’s deals are one example of the growth of CGI-charged rides in the Middle East and Asia. “Asia and the Arab Emirates seem to be the markets that are currently expanding rapidly,” says Pixomondo’s Garber, who has worked on many immersive dome experiences around the world. His company also developed two projects at Wanda Entertainment Center in Wuhan, China: Star Journey (a motion-based simulator ride) and Hubei in the Air (a domed-screen flying ride). As a basis for Hubei in the Air, drones captured 90,000 photographs of Hubei Province. “We created a program that translated the images into a 3D model of the
TOP: Gringotts bank in the Harry Potter and the Escape from Gringotts ride at “The Wizarding World of Harry Potter – Diagon Alley” at Universal Studios Florida. (Image courtesy of Universal Studios Florida) BOTTOM: Star Wars Land, coming to Disneyland and Walt Disney World in 2019, takes guests to a new planet. (Image courtesy of Walt Disney Productions and Lucasfilm)
COVER
TOP: Skull Island: Reign of Kong, a ride at the Islands of Adventure theme park at Universal Studios Florida. (Image courtesy of Universal Studios Florida) BOTTOM LEFT: Star Journey is a motion-based simulator ride at Wanda Entertainment Center in Wuhan, China. (Image courtesy of Lionsgate and Pixomondo) BOTTOM RIGHT: Purple Heaven Palace in Hubei in the Air, a domed-screen flying ride, at Wanda Entertainment Center in Wuhan, China. (Image courtesy of Lionsgate and Pixomondo)
52 • VFXVOICE.COM FALL 2018
PG 46-57 THEME PARKS.indd 53
She adds, “Our film franchises lend themselves well to theme parks because their stories establish worlds fans want an opportunity to explore in real life. While our films appeal to a diverse, four-quadrant audience, it is worth noting that pre-teens, teens and young adults – key demographics for theme parks – form especially strong connections with our franchises, which make them a natural fit for themed attractions. “VFX are the key component to the successful design of immersive media-based attractions. CGI is one of the most powerful tools we have available to us to bring environments to life in theme park attractions in a cost-effective and authentic way,” comments Brown. Lionsgate will leave its paw print in various lands and parks: The World of The Hunger Games (Lionsgate Zone, Motiongate Dubai, UAE) in 2018; Lionsgate Entertainment World (Novotown, Hengqin, China) in 2019; Lionsgate Entertainment City (multiple locations in the U.S. and Europe) in 2019; and Lionsgate Movie World (Jeju Shinwha World, South Korea) in 2020. “All of our deals to date have been licensing deals with no risk to the studio. In the future we will also explore the idea of investing and increasing our upside if the opportunity makes sense,” comments Brown.
The Lionsgate lands are the latest theme park zones devoted to a particular movie (“blockbuster worlds”). This “single-franchise approach” owes much to a certain boy wizard. The Wizarding World of Harry Potter opened in 2010 at Universal’s Islands of Adventure (with an exclusive license from Warner Bros. Entertainment) and was a tipping point. Guests entered, explored and inhabited an entire park area devoted to the J.K. Rowling books, and it proved enormously successful. While there, they could take the Harry Potter and the Forbidden Journey and Harry Potter and the Escape from Gringotts rides (the latter opened in 2014). Recently, other “themed areas” in Disney and Universal parks have opened up or are coming soon, tied to Cars, Toy Story, Frozen, Shrek, Madagascar, Avatar and Star Wars, among other properties. These lands immerse guests to a certain degree, and then their CGI-infused attractions enhance participation with characters and stories. Lionsgate’s deals are one example of the growth of CGI-charged rides in the Middle East and Asia. “Asia and the Arab Emirates seem to be the markets that are currently expanding rapidly,” says Pixomondo’s Garber, who has worked on many immersive dome experiences around the world. His company also developed two projects at Wanda Entertainment Center in Wuhan, China: Star Journey (a motion-based simulator ride) and Hubei in the Air (a domed-screen flying ride). As a basis for Hubei in the Air, drones captured 90,000 photographs of Hubei Province. “We created a program that translated the images into a 3D model of the
“If [theme parks] want repeat visitors then they need to provide a variety of exciting experiences, creating state-of-the-art rides as well as the more conventional ones. They are always very careful in their overall cost analysis, and visitor through-put is most definitely a factor. Although just looking at the three-to-four hour wait for the state-of-the-art Flight of Passage ride shows you the appetite for exciting, unique, immersive experiences is insatiable.” —Jules Roman, President, Tippett Studio
TOP AND BOTTOM: Guardians of the Galaxy – Mission: Breakout! drop ride at Disney California Adventure. (Images courtesy of the Walt Disney Company)
environment, and once built we could fly our camera anywhere within that world,” Garber explains. In the ride, the 3D assets are rendered at 6.5K and projected on a 65’ by 40’ curved dome. About the future, Garber says, “Theme parks and entertainment centers are now vehicles for storied experiences that were inseminated into the culture via audio-visual experiences that were called movies, but in reality are now simply 90-minute-long trailers created solely to market their theme park manifestations. And the brushes used by the VFX/CGI artists are the tools that have taken us there.” “I would say the sky’s the limit. I think the sound environments may have to catch up. But the more convincing and detailed the VFX the better the ride – or so it would appear. And like I mentioned, the global public is thrilled to participate,” says Roman. “I wouldn’t say that animation/CGI is the only way to create immersion, but it’s certainly an incredibly valuable aspect of creating truly immersive environments,” Hettema remarks. “What I find the most exciting these days is the blending of physical and virtual worlds in which the seams become invisible. That usually involves both traditional and cutting-edge systems and technology. “We’ve always said that if we can imagine it, we can build it,” Hettema adds. “CGI really makes that more true than ever. It really feels like there are very few limits on what we can imagine and create.”
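The domed-screen rendering Garber describes ultimately comes down to mapping view directions onto a circular fisheye frame for the dome projector. As a rough, generic illustration – this is not Pixomondo’s actual pipeline, and the “domemaster” layout and 180-degree field of view are assumptions – the mapping looks something like:

```python
import math

def domemaster_uv(direction, fov_deg=180.0):
    """Map a unit view direction to 'domemaster' fisheye UV coordinates.

    Assumes +Z points at the dome zenith; returns (u, v) in [0, 1], or
    None if the direction falls outside the dome's field of view.
    """
    x, y, z = direction
    # Angle away from the zenith: 0 at the image center, fov/2 at the rim.
    theta = math.acos(max(-1.0, min(1.0, z)))
    half_fov = math.radians(fov_deg) / 2.0
    if theta > half_fov:
        return None
    # Equidistant fisheye: radius grows linearly with the zenith angle.
    r = theta / half_fov
    phi = math.atan2(y, x)
    u = 0.5 + 0.5 * r * math.cos(phi)
    v = 0.5 + 0.5 * r * math.sin(phi)
    return (u, v)
```

Rendering for a dome then amounts to evaluating this mapping (or its inverse) per pixel at the delivery resolution – which is why dome pieces are mastered at sizes like 6.5K rather than conventional film resolutions.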
Previs and Theme Parks
Previs, or previsualization, has become indispensable for making big special-effects movies, and is also a crucial part of the process for creating theme park rides. Indeed, its history in the latter area stretches back farther than most people realize. “Theme park designers have used various types of visualization to envision attractions since the development of Disneyland,” says Chris Edwards, CEO of The Third Floor, which has provided previs for several fixed-location attractions. “This included design methods such as the architectural renderings and scale models Walt Disney commissioned in the 1960s to showcase the vision for what later became Epcot Center, all before the advent of computer-aided design (CAD).” Edwards continues, “Computer graphics advanced significantly during the 1980s and, in 1993, visual effects pioneer Douglas Trumbull, VES, was able to leverage computer graphics, including previsualization, to deliver sequences for the Secrets of the Luxor Pyramid attraction in Las Vegas. To this day, theme park designers continue to embrace the previs process as an effective way to orchestrate all aspects of a guest’s immersive experience, including architectural layout, set dressing, ride vehicle motion, animatronic characters, effects elements and any projected media. All of these elements must work together with the sound design to tell a complete story, usually in less than five minutes.” The Third Floor has contributed previs to many effects-laden major productions; in theme parks, its credits include Despicable Me Minion Mayhem, Harry Potter and the Escape from Gringotts, Skull Island: Reign of Kong, Fast and Furious: Supercharged and other attractions. “Over the years, we’ve continued to develop more and more sophisticated methods to support the design of location-based attractions using techniques and technologies from our film and games industry work,” says Edwards. “Visualizing themed attractions can be quite complicated,” comments Albert Cheng, Visualization Supervisor at The Third Floor. “Working with client creative and technical teams, we try very hard to provide accurate representations of the experience that’s being envisioned.” He points out different aspects. “You need to consider ride timing and the cycling of an experience across the physical ride movement. When a ride has an optimum timing for user experience or guest flow, you need to work with that. Visualizing in stereo always adds complexity for previs image creation as well.” Animatronics are another factor in the mix. “Where animatronics are involved, we make sure to engage with experts who know the range of motions that can be achieved with the puppets so that what is previsualized is properly represented,” says Cheng. He continues, “Much of the complexity comes from visualizing content that can seamlessly integrate, as much as possible, with the physical ride set build, while accounting for a constantly moving perspective of guests in a ride vehicle, in order to preserve the illusion of space and depth.” Screens must also be considered. Cheng comments, “When producing previs, we’ll commonly create environments made to look like they are an extension of the ride set. Knowing the specifications and locations of the screens that will ultimately be used to project the media is critical to the success of the ride experience and very integral in how we represent the previs.” He explains, “Usually we’ll work from a model or from the
OPPOSITE BOTTOM: Visualization by The Third Floor of dome-show media for “The Marvel Experience” from Hero Ventures and Marvel Entertainment. (Image courtesy of The Third Floor and copyright © Marvel)
COVER
China’s Oriental Science Fiction Valley All-VR Theme Park
characteristics of the screen that is expected to be used. There are various tricks we use to determine the distortion necessary, depending on the moving viewer’s position and the projector position. Every unique screen needs to be accounted for separately, so the projections need to be set up with different sizes and distortion parameters in order to display correctly when you are trying to simulate in previs what will be happening in the real world.” The “need for speed” is present both with previs for feature films and for theme parks. Cheng notes, “Previs, in general, is inherently set up to be a fast process because you want to explore ideas creatively. This holds true for both feature films and themed attractions. Similarly, there is often a fixed amount of time and budget so the need to work quickly is important overall. Ride projects also often involve high stakes for the related IP, so there’s a drive to produce the most creative and impactful project possible.” Virtual Reality has become an important and useful tool in creating rides for theme park experiences. “We use it as part of our standard process when working on ride or dome projections because it allows us to simulate and experience the full 360-degree view of the experience before anything is built.
TOP LEFT AND TOP RIGHT: On-ride content visualization and final ride art for Despicable Me Minion Mayhem, a simulator ride at Universal Studios Florida. (Image courtesy of The Third Floor and copyright © Universal Studios Florida)
BOTTOM LEFT: Chris Edwards
BOTTOM RIGHT: Albert Cheng
Working in stereo is also very standard practice now, whereas most feature previs is done in traditional 2D,” says Cheng. “We are very close to mimicking a ‘Holodeck’ experience, to enable clients to step inside of a simulator and visualize, and adapt the media and vehicle motion in real time,” comments Edwards. “We have a history of providing real-time design experiences for filmmakers live on set in a workflow known as virtual production. Adding the extra sensory elements for fully immersive attraction design to that process is just a matter of time. The projects we see on the horizon are going to benefit significantly from such a real-time, iterative design process. Our team is focused towards this development goal every day.” Edwards concludes, “As themed attractions get more and more technically complex, using a form of visualization to depict all aspects of the project in progress has become near imperative. From an artistic perspective, previs allows designers to tweak the experiences to a very high degree, adjusting the placement, movement, performance and lighting. Just as fans these days are probably never going to lower their expectations for immersive entertainment, so designers and engineers will continue to use more and more visualization and technology to meet and exceed those expectations.”
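Cheng’s point about computing per-screen distortion from the moving viewer’s position can be sketched in miniature. In this hedged toy example – the coordinate frame, the flat image plane and the 90-degree field of view are illustrative assumptions, not details of any real attraction or of The Third Floor’s tools – the task is to find, for a point on the physical screen, where that point should appear in the image rendered from the guest’s viewpoint, so the projector can sample the correct pixel:

```python
import math

def viewer_uv_for_screen_point(screen_point, viewer_pos, viewer_fov_deg=90.0):
    """For a point on the physical screen, find where in the viewer's
    rendered perspective image that point should appear.

    The projector then samples the viewer image at this UV so the picture
    looks undistorted from viewer_pos. Returns (u, v) in image space, or
    None if the point is behind the viewer.
    """
    # Ray from the viewer's eye to the screen point.
    dx = screen_point[0] - viewer_pos[0]
    dy = screen_point[1] - viewer_pos[1]
    dz = screen_point[2] - viewer_pos[2]
    if dy <= 0:
        return None  # behind the viewer (who faces +Y in this sketch)
    # Perspective projection onto a virtual image plane one unit ahead.
    half_extent = math.tan(math.radians(viewer_fov_deg) / 2.0)
    u = 0.5 + (dx / dy) / (2.0 * half_extent)
    v = 0.5 + (dz / dy) / (2.0 * half_extent)
    return (u, v)
```

Because the guest is in a moving vehicle, this warp must be recomputed per frame along the ride path, and separately for each screen and projector – which is exactly why every unique screen needs its own size and distortion parameters.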
Virtual reality in theme parks in the U.S. and Europe is still in its infancy, but a Chinese joint venture is betting $470 million that VR can support an entire park right now. Oriental Science Fiction Valley opened April 29, 2018 and features 35 VR attractions spread over 330 acres on the outskirts of Guiyang, the capital of China’s southwestern province of Guizhou. It is the world’s first outdoor theme park that exclusively employs virtual and augmented reality in its rides and activities. “There is fierce competition in the theme park market now,” the park’s CEO, Chen Jianli, told CGTN (China Global Television Network). “We’re trying to give customers a completely different experience by combining modern technologies such as VR and AR with traditional recreational facilities. The sci-fi part of it is important for the feelings it creates.”
“We’re trying to give customers a completely different experience by combining modern technologies such as VR and AR with traditional recreational facilities. The sci-fi part of it is important for the feelings it creates.” — Chen Jianli, CEO, Oriental Science Fiction Valley
Oriental Science Fiction Valley is a joint venture between Guizhou Province and Shuimu Animation, a subsidiary of Oriental Times Network Media. The futuristic park features virtual rollercoasters, virtual shooter games and virtual tours of the region’s scenic spots. VR-equipped bungee jumpers can also leap from the arm of a 53-meter-tall, 700-ton, robot-like statue that is vaguely Transformers-ish and is the signature decoration of the park. Virtual reality gained a toehold in global theme parks over the last few years. In many cases, VR has been used as an overlay on existing rides, although some attractions are beginning to utilize it from the ground up, as in Oriental Science Fiction Valley. VR Park Dubai, arguably the closest current competition to Oriental Science Fiction Valley, features 18 VR and AR experiences and is an indoor theme park in the Dubai Mall in Dubai, UAE. In addition, The Void and ILMxLAB have teamed to offer entertainment of a different type in Star Wars: Secrets of the Empire, a 25-minute multi-sensory and interactive VR experience available at Disney Springs, Downtown Disney and multiple other locations.
TOP LEFT: Oriental Science Fiction Valley as it neared completion in early 2018. The park opened in April 2018 and features 35 VR attractions spread over 330 acres on the outskirts of Guiyang, China. (Image courtesy of East Science Valley/Oriental Times Network Media)
TOP RIGHT, BOTTOM LEFT AND BOTTOM RIGHT: Oriental Science Fiction Valley (Images courtesy of East Science Valley/Oriental Times Network Media)
INDUSTRY ROUNDTABLE
NEW ADVENTURES AT THE INTERSECTION OF THEME PARKS AND VFX By CHRIS McGOWAN
and try to best our last “performance” if you will. Animation, special effects, show programming, lighting – everything was tweaked as we went from park to park. We also went back and applied many of those tweaks to the earlier JL rides to make sure they were all performing at the same level. There were some very unique moments in the Magic Mountain version of Justice League: Battle for Metropolis [which debuted in 2017]. We changed the queue pre-show to be more of a “batched” system, where guests are held in a series of rooms where they are told about their mission by Superman, Batman and Cyborg. We also brought in massive toroidal projection screens that wrap around you. These changes added up to a “next level” Justice League ride that is really something to experience. Every dark ride project is a challenge. There are many ways the projects can go “off the rails,” but we have been doing this for 40 years, producing over 60 dark rides during that time, so we are pretty good at keeping the team on task.
MATT AITKEN, VISUAL EFFECTS SUPERVISOR, WETA DIGITAL
VFX Voice asked industry experts about the technical demands and continuing evolution of VFX and video used in theme park rides and other fixed-location entertainment. Here they discuss the use of 3D, domed and big screens, and what’s coming in the future. They also talk about dark rides and LBEs (Location-Based Entertainments), such as Justice League: Battle for Metropolis, Avatar: Flight of Passage, King Kong 360 3-D, Panem Aerial Tour (The World of The Hunger Games), Guardians of the Galaxy – Mission: Breakout!, Dream of Anhui and Star Wars: Secrets of the Empire (a VR LBE).
RICH HILL, CREATIVE DIRECTOR, SALLY CORPORATION
TOP: Justice League: Battle for Metropolis (Image courtesy of Sally Corporation)
ABOVE: Rich Hill
“We are always going to have a balanced mix of practical vs. virtual elements in our rides, though. There is something very satisfying about seeing a practical effect, but virtual effects can be very dynamic. The real trick is showing guests something unexpected and different around every corner in a dark ride.” —Rich Hill, Creative Director, Sally Corporation
Over the past five years or so, Sally has been incorporating more and more CGI into our attractions. Designing a ride that’s all about superheroes (aka our recently opened Justice League: Battle for Metropolis attraction for Six Flags) didn’t lend itself to an army of animatronic figures. They may look great, but I wanted to see them doing dynamic things like running at 100 mph, flying across the sky and fighting villains, and those things are tough to pull off with animatronics. The Justice League Alien Invasion attraction [which opened in 2011 at Warner Bros. Movie World in Australia] was the first time we used CG gaming in one of our rides. It was very important to have digitally animated scenes due to the “superhero factor.” The dynamic motions really necessitated using CG. As far as I’m concerned, we are always going to have a balanced mix of practical vs. virtual elements in our rides, though. There is something very satisfying about seeing a practical effect, but virtual effects can be very dynamic. The real trick is showing guests something unexpected and different around every corner in a dark ride. When we were given the opportunity to create seven Justice League dark rides, we wanted to give each one something special
We faced many challenges creating King Kong: 360 3-D for Universal Studios Hollywood. On the technical side, there was the immense scale of the screens we were projecting on. These required very high-resolution imagery to avoid looking pixelated. Though the ride lasts less than two minutes, we had to render more pixels than a standard feature-length CG-animated movie. We also had to contend with designing the ride content to minimize projector cross-talk and manage stereo imagery from multiple rider perspectives so that everyone was able to experience the ride in full 3D. Creatively, the attraction had to work as a compelling narrative experience for each of the riders in the tram train: King Kong 360 3-D is a stop on the Universal Studios backlot tram tour, and guests experience the ride arranged in a line in front of 175-foot-wide screens. The story had to play out so that everyone was able to get involved in it no matter where they were sitting, which required careful stage management of the action. In 2010 when we launched this project, the idea of driving the audience into a large, custom-built ‘cinema,’ parking them up on a motion-base between two giant curved screens that filled their entire field of view no matter where they looked, and then taking them on a wild ride through Skull Island was, as far as I know, completely new. In many ways, our experience working on King Kong 360 3-D was the same as any film digital visual effects project. We used the same set of software tools and produced the work in our standard visual effects pipeline. But there are some key differences to the way we have to approach these projects. For example, we have to come up with a new way of allocating the work across the large team of artists working on the ride. In a typical film project, the work is made up of many short shots and artists can work shot by shot. But the media we are creating for these rides has no edit points and plays out as one continuous take. So we have to set up workflows to enable the team to all work concurrently on what will
TOP: King Kong 360 3-D (Image courtesy of Universal Studios Hollywood) ABOVE: Matt Aitken
“Without our innovations in deep compositing, it would have been quite difficult to deliver this project [King Kong 360 3-D].” —Matt Aitken, Visual Effects Supervisor, Weta Digital
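Deep compositing, which Aitken credits with making the project deliverable, stores multiple depth-ordered samples per pixel rather than a single flattened value, so separately rendered elements can be combined later without holdout mattes. A minimal sketch of flattening one deep pixel – a toy single-channel model for illustration, not Weta Digital’s actual implementation – looks like this:

```python
def composite_deep_pixel(samples):
    """Flatten a deep pixel to a single (color, alpha) pair.

    `samples` is a list of (depth, color, alpha) tuples, one per fragment.
    They are sorted near-to-far and combined front-to-back with the
    standard 'over' operator.
    """
    color, alpha = 0.0, 0.0
    for _, c, a in sorted(samples, key=lambda s: s[0]):
        color += (1.0 - alpha) * c * a  # accumulate, attenuated by coverage so far
        alpha += (1.0 - alpha) * a
    return color, alpha
```

Because the samples keep their depths until this final flatten, fragments from different renders interleave correctly wherever they overlap in depth – valuable when one continuous ride “shot” is split across many artists and render passes.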
INDUSTRY ROUNDTABLE
NEW ADVENTURES AT THE INTERSECTION OF THEME PARKS AND VFX By CHRIS McGOWAN
VFX Voice asked industry experts about the technical demands and continuing evolution of VFX and video used in theme park rides and other fixed-location entertainment. Here they discuss the use of 3D, domed and big screens, and what’s coming in the future. They also talk about dark rides and LBEs (Location-Based Entertainments), such as Justice League: Battle for Metropolis, Avatar: Flight of Passage, King Kong 360 3-D, Panem Aerial Tour (The World of The Hunger Games), Guardians of the Galaxy – Mission: Breakout!, Dream of Anhui and Star Wars: Secrets of the Empire (a VR LBE).

RICH HILL, CREATIVE DIRECTOR, SALLY CORPORATION

TOP: Justice League: Battle for Metropolis (Image courtesy of Sally Corporation) ABOVE: Rich Hill

“We are always going to have a balanced mix of practical vs. virtual elements in our rides, though. There is something very satisfying about seeing a practical effect, but virtual effects can be very dynamic. The real trick is showing guests something unexpected and different around every corner in a dark ride.” —Rich Hill, Creative Director, Sally Corporation

Over the past five years or so, Sally has been incorporating more and more CGI into our attractions. A ride that’s all about superheroes (namely our recently opened Justice League: Battle for Metropolis attraction for Six Flags) didn’t lend itself to an army of animatronic figures. They may look great, but I wanted to see them doing dynamic things like running at 100 mph, flying across the sky and fighting villains, and those things are tough to pull off with animatronics. The Justice League Alien Invasion attraction [which opened in 2011 at Warner Bros. Movie World in Australia] was the first time we used CG gaming in one of our rides. It was very important to have digitally animated scenes due to the “superhero factor.” The dynamic motions really necessitated using CG. As far as I’m concerned, we are always going to have a balanced mix of practical vs. virtual elements in our rides, though. There is something very satisfying about seeing a practical effect, but virtual effects can be very dynamic. The real trick is showing guests something unexpected and different around every corner in a dark ride.

When we were given the opportunity to create seven Justice League dark rides, we wanted to give each one something special and try to best our last “performance,” if you will. Animation, special effects, show programming, lighting – everything was tweaked as we went from park to park. We also went back and applied many of those tweaks to the earlier JL rides to make sure they were all performing at the same level. There were some unique moments in the Magic Mountain version of Justice League: Battle for Metropolis [which debuted in 2017]. We changed the queue pre-show to be more of a “batched” system, where guests are held in a series of rooms where they are told about their mission by Superman, Batman and Cyborg. We also brought in massive toroidal projection screens that wrap around you. These changes added up to a “next level” Justice League ride that is really something to experience. Every dark ride project is a challenge. There are many ways the projects can go “off the rails,” but we have been doing this for 40 years, producing over 60 dark rides during that time, so we are pretty good at keeping the team on task.

58 • VFXVOICE.COM FALL 2018

MATT AITKEN, VISUAL EFFECTS SUPERVISOR, WETA DIGITAL
We faced many challenges creating King Kong 360 3-D for Universal Studios Hollywood. On the technical side, there was the immense scale of the screens we were projecting on. These required very high-resolution imagery to avoid looking pixelated. Though the ride lasts less than two minutes, we had to render more pixels than a standard feature-length CG-animated movie. We also had to contend with designing the ride content to minimize projector cross-talk and manage stereo imagery from multiple rider perspectives so that everyone was able to experience the ride in full 3D. Creatively, the attraction had to work as a compelling narrative experience for each of the riders in the tram train: King Kong 360 3-D is a stop on the Universal Studios backlot tram tour, and guests experience the ride arranged in a line in front of 175-foot-wide screens. The story had to play out so that everyone was able to get involved in it no matter where they were sitting, which required careful stage management of the action. In 2010 when we launched this project, the idea of driving the audience into a large, custom-built ‘cinema,’ parking them up on a motion-base between two giant curved screens that filled their entire field of view no matter where they looked, and then taking them on a wild ride through Skull Island was, as far as I know, completely new.

In many ways, our experience working on King Kong 360 3-D was the same as any film digital visual effects project. We used the same set of software tools and produced the work in our standard visual effects pipeline. But there are some key differences in the way we have to approach these projects. For example, we have to come up with a new way of allocating the work across the large team of artists working on the ride. In a typical film project, the work is made up of many short shots and artists can work shot by shot. But the media we are creating for these rides has no edit points and plays out as one continuous take. So we have to set up workflows to enable the team to all work concurrently on what will ultimately be one big shot. While the environment is rendering, we can continue to work on the animation of the creatures, characters, props, vehicles, etc. that populate the environment and are often among the final things to get creative approval. We can render these elements as separate layers as they are approved and then composite them into the environment using our deep compositing workflows. Without our innovations in deep compositing, it would have been quite difficult to deliver this project. 3D is a key component in immersing the rider in the world of the ride. In King Kong 360 3-D, the backlot tour tram takes a detour to Skull Island, and a CG tram car with digital-double tourists actually appears in the ride. 3D helps to create the sense that what the rider is seeing beyond the windows of the physical tram they are sitting in is a fully dimensional world extending out from their location in all directions.

TOP: King Kong 360 3-D (Image courtesy of Universal Studios Hollywood) ABOVE: Matt Aitken

“Without our innovations in deep compositing, it would have been quite difficult to deliver this project [King Kong 360 3-D].” —Matt Aitken, Visual Effects Supervisor, Weta Digital

TOP: The Panem Aerial Tour at The World of The Hunger Games, Motiongate Dubai. (Image courtesy of Lionsgate) ABOVE: Gene Rogers

“AR and VR are amazing tools, but they should only be used if that is the best way to convey or immerse guests in the story you’re trying to tell.” —Gene Rogers, VP, Global Live & Location Based Entertainment, Lionsgate Entertainment Group
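The deep-compositing workflow Aitken describes rests on a simple idea: rendered samples carry depth as well as color, so separately rendered layers can be merged correctly at composite time, long after the environment render has started. Weta Digital’s actual implementation is proprietary; the following is only a minimal, hypothetical sketch of the principle, pooling per-pixel deep samples from two layers and flattening them front to back with the “over” operator:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class DeepSample:
    depth: float                      # camera-space depth of the sample
    rgb: Tuple[float, float, float]   # premultiplied color
    alpha: float                      # opacity of this sample

def flatten(samples: List[DeepSample]) -> Tuple[List[float], float]:
    """Sort a pixel's deep samples by depth and composite them
    front-to-back with the 'over' operator, yielding flat RGBA."""
    out_rgb, out_a = [0.0, 0.0, 0.0], 0.0
    for s in sorted(samples, key=lambda s: s.depth):
        w = 1.0 - out_a  # transparency remaining in front of this sample
        out_rgb = [c + w * sc for c, sc in zip(out_rgb, s.rgb)]
        out_a += w * s.alpha
    return out_rgb, out_a

# Merging independently rendered layers is just pooling their samples,
# so an approved creature layer can drop into the environment at any time:
env  = [DeepSample(10.0, (0.0, 0.2, 0.0), 1.0)]   # opaque backdrop
kong = [DeepSample(4.0,  (0.3, 0.2, 0.1), 0.5)]   # semi-transparent element in front
rgb, a = flatten(env + kong)
```

Because depth ordering is resolved per sample rather than per layer, no hold-out mattes need to be re-rendered when a late-approved element arrives, which is the scheduling win Aitken points to.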
GENE ROGERS, VP, GLOBAL LIVE & LOCATION BASED ENTERTAINMENT, LIONSGATE ENTERTAINMENT GROUP

The Panem Aerial Tour is a hovercraft motion-simulator attraction that takes guests on an exhilarating, in-world ride to and through the Capitol [The World of The Hunger Games, Motiongate Dubai]. Guests are immersed in a 3D media tunnel created by two 6.2-meter x 24-meter screens that curve to envelop the ride vehicle and provide optimum sightlines for all guests. Nine Digital Projection HighLite laser-based projectors create seamless, stunning 3D images of custom ride media created for the attraction, which includes original cast members from The Hunger Games film series reprising their roles. In any attraction with multiple projectors, alignment becomes critical. Any slight misalignment will contribute greatly to the guest’s perception of the media. 3D adds a depth that is missed when not present. For the Panem Aerial Tour, the key set-up of the ride is being able to see vast expanses of Panem in the air and feel like you’re flying with other hovercraft right next to you. 3D is very effective in selling that reality. As screen resolution, processing power and speed, and accessibility continue to improve, there are exciting implications for the types of experiences we will soon be able to create. Software advances for animation and rigging also drastically reduce timelines. AR and VR are amazing tools, but they should only be used if that is the best way to convey or immerse guests in the story you’re trying to tell.

AMY JUPITER, VFX EXECUTIVE PRODUCER, WDI (WALT DISNEY IMAGINEERING)

TOP: Guardians of the Galaxy – Mission: Breakout! (Image courtesy of Walt Disney Productions) ABOVE: Amy Jupiter

“Add into this mix [of large-format films] the multiple other data streams that come along with an attraction that must be synchronized together, like audio tracks, control systems, ride systems, and in-theater effects, and the complexity increases exponentially.” —Amy Jupiter, VFX Executive Producer, Walt Disney Imagineering

In February 2015, we were given the challenge of completely reimagining Tower of Terror and reopening it as Guardians of the Galaxy – Mission: Breakout! by the summer of 2017. We were able to work alongside Marvel Studios while they were shooting Guardians of the Galaxy, Vol. 2, which provided access to director James Gunn and his incredible team while also limiting our schedule and the flexibility of the shooting window. If we could do it, we would be able to open a completely new attraction in the dimensional universe just three weeks after the cinematic universe opened the second installment of the smash film. The most unusual challenge was taking this preexisting ride system and seeing what it was capable of beyond what had originally been imagined for the previous attraction. We thought it would be funny to disrupt the show and open the elevator doors while the ride was still moving. More than just being
funny, it would also support the visual illusion that the Guardians were actually there in the attraction with us. This meant that we would have to figure out a way to shoot our live-action plate so that it would lock the audience’s point of view, their eyepoint, to the elevator movement. Since we had to shoot so early in our schedule, we devised a method of shooting our plates with a vertical stack of seven 4K cameras running at 120fps, with the two cameras at either end of the stack tilted, so we could create a huge vertical plate on which we would later be able to virtually “reshoot” the Guardians – once we understood the ride profile – deriving a fully dimensional 3D camera that matched exactly with the ride’s movement.

There has always been a crossover between large-format film for special venues such as theme park attractions and the standard cinema VFX and animation world. ILM has collaborated with Walt Disney Imagineering for many years on projects, from the original Star Tours to the newest versions of Pirates of the Caribbean and Soarin’ Around the World (Shanghai Disneyland), to the Iron Man Experience attraction in Hong Kong Disneyland. Large-format displays utilizing ultra-high frame rates and ultra-high-resolution imagery bring with them much more data, many more assets, and much more scrutiny of the imagery. Most of our attractions include multiple minutes of these large-format films with no cuts at all. Add into this mix the multiple other data streams that come along with an attraction and must be synchronized together – audio tracks, control systems, ride systems, and in-theater effects – and the complexity increases exponentially. Great attractions are those in which our guests are moved and transported emotionally by the experience they have. All of the show elements in that design, including video and VFX which seamlessly support that experience and do not diminish it, help create a great and immersive guest experience.
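The multi-stream synchronization Jupiter describes usually comes down to referencing every stream to one master clock. As a simplified, hypothetical illustration (not WDI’s actual show-control code), converting a non-drop-frame SMPTE-style timecode to an absolute frame count gives media, audio, control and ride-motion streams a common index to be checked against:

```python
def timecode_to_frames(tc: str, fps: int) -> int:
    """Convert a non-drop-frame HH:MM:SS:FF timecode to an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

# Each stream reports its position against the same master index,
# so drift between systems can be detected before it is visible:
media = timecode_to_frames("00:01:30:12", fps=60)
audio = timecode_to_frames("00:01:30:12", fps=60)
in_sync = (media == audio)
```

Real show-control systems layer timecode distribution, genlock and drop-frame handling on top of this, but the underlying bookkeeping is this kind of frame arithmetic.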
CHRIS MORLEY, VISUAL EFFECTS SUPERVISOR, TIPPETT STUDIO

TOP: Dream of Anhui (Image courtesy of Tippett Studio) ABOVE: Chris Morley

“It was the 15 100% computer-generated environments we needed to build that was the biggest technical hurdle [in producing Dream of Anhui].” —Chris Morley, Visual Effects Supervisor, Tippett Studio
The Dream of Anhui ride was our first step into the world of theme park entertainment. It was a great project to establish our foundation as a media provider for physical rides. Dream of Anhui featured an 80-person flight deck with six degrees of freedom in front of a 30-meter-wide, 180-degree dome. Tippett Studio had over 30 years of experience dealing with screen media, so we were in pretty good shape on that front. We knew we would need to deal with the spherical projection’s cross-bounce of light, image contrast, saturation, and the motion of the image matching the motion of the ride. It was the 15 100% computer-generated environments we needed to build that was the biggest technical hurdle. We spent weeks testing various workflows and decided that the Tippett Studio character pipeline would not be efficient for this type of project. We found a fantastic piece of software by the name of Clarisse and built an asset pipeline that would feed it. Clarisse proved to be a wonderful tool, and the Isotropix team was very helpful and collaborative along the way.

The project lasted about two years in total. We spent about a year in pre-production, research, location scouting and previsualization before we started building the final environments. Once we had a solid previsualization foundation using photogrammetry of the real locations, we built the 15 computer-generated environments in about eight months. We had about 60 people on the project from multiple departments throughout the Tippett pipeline. For rides, we are hired to be production. We come up with the concept, write the script, design the characters, create the storyboards, direct the media, compose the music, and work closely with the ride movement and installation team. We researched a few flying theater rides before starting on Dream of Anhui and knew we wanted to make this one special. We created a character with a dream of flying, and we worked hard on creative and seamless transitions between environments, but most of all we created an objective for each scene, something the viewer could follow into the next. This led the viewer’s eye where we wanted and created a thoughtful journey through the beautiful landscape of Anhui province.

JON WALKENHORST, CHIEF TECHNOLOGY OFFICER, THE VOID
Star Wars: Secrets of the Empire was a true collaboration between The Void and ILMxLAB. Both companies shared the goal of telling location-based hyper-reality immersive stories. The Void uses custom VR technology along with physical stages to create immersive experiences that inspire exploration and engagement. Participants wear The Void’s proprietary VR equipment, including head-mounted displays (HMDs), backtop computers and haptic vests, to engage the senses and transport them to new virtual worlds. Star Wars: Secrets of the Empire is a fully immersive VR experience that, with special gear, transports you deep into the Star Wars universe, allowing you to roam freely, hear, touch, feel and see.

We utilize many of the same creation tools, technologies and guidelines [as in VFX for feature films]. For a location-based VR experience we might focus on a shorter experience, but offer more detail or opportunity to explore a given room or interact with the characters. The key difference would be leveraging the VR engines in rendering and layering in additional guest interaction and other senses. The Void’s hyper-reality experiences go beyond other location-based VR experiences. We not only blend the physical and virtual worlds, but our experiences incorporate audio, internal haptics and tactile effects – mist, wind and scents – to create a completely immersive perceived reality.
TOP: Star Wars: Secrets of the Empire (Image courtesy of ILMxLAB and The Void) ABOVE: Jon Walkenhorst
“We not only blend the physical and virtual worlds, but our experiences incorporate audio, internal haptics and tactile effects – mist, wind and scents – to create a completely immersive perceived reality.” —Jon Walkenhorst, Chief Technology Officer, The Void
FOCUS
VFX DOWN UNDER: THE NOW AND THE NEW IN AUSTRALIA AND NEW ZEALAND By IAN FAILES

TOP: Luma Pictures’ Melbourne office

Visual effects in Australia and New Zealand have a rich history. Leaps and bounds have been made in the technology of VFX in that part of the world, as well as in iconic film franchises. The visual effects industry ‘down under,’ particularly in Australia, may soon go through another transformation, ignited by the recent announcement that Mill Film will be opening a major studio presence in Adelaide. VFX Voice outlines the who’s who of the VFX industry in Australia and New Zealand, asking several of the major players their thoughts on where things are right now and where they might be in the future.

A QUICK STATE OF PLAY
Australia is dominated by a few major VFX players, including large and medium-sized studios that predominantly do feature film and episodic work. There are also plenty of smaller studios working in commercials and post-production. New Zealand has several similarly purposed studios, too, but the main player is, of course, Weta Digital, one of the major visual effects studios in the world. Both countries are home to key technological innovators in visual effects. Blackmagic Design hails from Australia, and the compositing system Flame was originally developed in Melbourne. VFX collaboration tool cineSync sprang from Adelaide. Over in New Zealand, Weta Digital was the originator of the 3D texture-painting tool MARI and numerous other technical developments in motion capture, compositing, lighting and rendering over a number of productions. Like many cities and countries the world over, Australia and New Zealand offer production incentives for visual effects and animation studios. In Australia, this can result in up to a 40% rebate on qualifying expenditures when state and federal incentives are combined (see www.ausfilm.com.au for more information). New Zealand offers a cash grant program that can reach up to 20% (more info: www.nzfilm.co.nz). There are also a number of film studio facilities available in each country, adding to the incentives on offer. “Australia has physical infrastructure in terms of state-of-the-art studio complexes and highly skilled film-service companies and crew that can handle small to big-budget productions,” outlines Rachelle Gibson from Ausfilm, which administers Australia’s federal incentives program. “Australian VFX studios have had a stellar year working on everything from the world’s biggest blockbusters to critically acclaimed indie films.
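As a back-of-the-envelope illustration of the incentive figures above, the headline value of a rebate or grant is simply a share of qualifying expenditure. The spend figures below are hypothetical, and real eligibility rules (what counts as qualifying expenditure, caps, stacking of state and federal schemes) are considerably more involved:

```python
def incentive_value(qualifying_spend: float, rate: float) -> float:
    """Headline incentive value as a share of qualifying expenditure."""
    return qualifying_spend * rate

# A hypothetical $10M qualifying spend at the headline rates quoted in
# the article: up to 40% combined in Australia, up to 20% in New Zealand.
au_rebate = incentive_value(10_000_000, 0.40)
nz_grant  = incentive_value(10_000_000, 0.20)
```

On those assumptions, the same spend attracts twice the headline return in Australia, which is part of why combined state and federal offsets feature so heavily in facility location decisions.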
There is fantastic talent in Australia’s VFX industry, which is on par with other great VFX hubs around the world.” Similarly, in relation to New Zealand, Weta Digital Executive Producer David Conley notes that that country’s government has been supportive of the local VFX industry. “We are fortunate in New Zealand that we have had a stable platform from which to grow our business,” he says. “The incentive strategy coming out of central government has been consistent over the last few years, even with changes in government and policy priorities. The numbers may go up and down a bit, but we’ve seen a commitment
to the importance of having this industry here in New Zealand, and the role it plays as a catalyst for the entertainment and technology business sectors.”

THE MAJOR PLAYERS AND THEIR MAJOR WORKS
Australian and New Zealand visual effects studios work on everything from major blockbusters to smaller independent films, immersive entertainment, and now heavily in streaming series. Animal Logic’s Head of Production Ingrid Johnston highlights Peter Rabbit as one of her studio’s recent stand-outs, not only for the photorealistic creature work, but also as part of Animal’s move into developing its own IP. “As our first film produced out of Animal Logic Entertainment, we have been part of the film from the very beginning of the process, and as a hybrid character film we were combining our experience in Animation and traditional VFX,” says Johnston. Animal Logic has also expanded to Vancouver, where production on animated features, including The Lego Movie 2: The Second Part, has been taking place. A few other Australian studios are also part of larger global entities. Method Studios in Australia (formerly Iloura) has made a concerted effort to maintain a style of effects work that has kept the company operating for more than 20 years out of Melbourne and now also in Sydney. “While we’re part of a global company with locations beyond Australia that gives us the opportunity to work on bigger projects, we work hard to maintain our boutique style of intimate service to our clients,” states Method Studios Visual Effects Supervisor Glenn Melenhorst, who has headed up the company’s work on projects such as Game of Thrones, Jumanji: Welcome to the Jungle
TOP: One of Weta Digital’s major films released in 2018 was Avengers: Infinity War. The studio produced a CG Thanos, among other effects. (Image copyright © 2018 Marvel) MIDDLE: Weta Digital Executive Producer David Conley BOTTOM: Artists at work at Weta Digital in Wellington.
FOCUS
VFX DOWN UNDER: THE NOW AND THE NEW IN AUSTRALIA AND NEW ZEALAND By IAN FAILES
Visual effects in Australia and New Zealand have a rich history. Leaps and bounds have been made in the technology of VFX in that part of the world, as well as in iconic film franchises. Now the visual effects industry ‘down under,’ particularly in Australia, may be set for another transformation, ignited by the recent announcement that Mill Film will be opening a major studio presence in Adelaide. VFX Voice outlines the who’s who of the VFX industry in Australia and New Zealand, asking several of the major players their thoughts on where things are right now and where they might be in the future.
A QUICK STATE OF PLAY
TOP: Luma Pictures’ Melbourne office
64 • VFXVOICE.COM FALL 2018
Australia is dominated by a few major VFX players, including large and medium-sized studios that predominantly do feature film and episodic work. There are also plenty of smaller studios working in commercials and post-production. New Zealand has several similarly purposed studios, too, but the main player is, of course, Weta Digital, one of the major visual effects studios in the world. Both countries are home to key technological innovators in visual effects. Blackmagic Design hails from Australia, and the compositing system Flame was originally developed in Melbourne. VFX collaboration tool cineSync sprang from Adelaide. Over in New Zealand, Weta Digital was the originator of the 3D texture-painting tool MARI and numerous other technical developments in motion capture, compositing, lighting and rendering over a number of productions. Like many cities and countries the world over, Australia and New Zealand offer production incentives for visual effects and animation studios. In Australia, this can result in up to a 40% rebate on qualifying expenditures when state and federal incentives are combined (see www.ausfilm.com.au for more information). New Zealand offers a cash grant program that can reach up to 20% (more info: www.nzfilm.co.nz). There are also a number of film studio complexes available in each country to complement those incentives. “Australia has physical infrastructure in terms of state-of-the-art studio complexes and highly skilled film-service companies and crew that can handle small to big-budget productions,” outlines Rachelle Gibson from Ausfilm, which administers Australia’s federal incentives program. “Australian VFX studios have had a stellar year working on everything from the world’s biggest blockbusters to critically acclaimed indie films.
There is fantastic talent in Australia’s VFX industry, which is on par with other great VFX hubs around the world.” Similarly, in relation to New Zealand, Weta Digital Executive Producer David Conley notes that the country’s government has been supportive of the local VFX industry. “We are fortunate in New Zealand that we have had a stable platform from which to grow our business,” he says. “The incentive strategy coming out of central government has been consistent over the last few years, even with changes in government and policy priorities. The numbers may go up and down a bit, but we’ve seen a commitment
to the importance of having this industry here in New Zealand, and the role it plays as a catalyst for the entertainment and technology business sectors.” THE MAJOR PLAYERS AND THEIR MAJOR WORKS
Australian and New Zealand visual effects studios work on everything from major blockbusters to smaller independent films, immersive entertainment, and now heavily in streaming series. Animal Logic’s Head of Production Ingrid Johnston highlights Peter Rabbit as one of her studio’s recent stand-outs, not only for the photorealistic creature work, but also as part of Animal’s move into developing its own IP. “As our first film produced out of Animal Logic Entertainment, we have been part of the film from the very beginning of the process, and as a hybrid character film we were combining our experience in Animation and traditional VFX,” says Johnston. Animal Logic has also expanded to Vancouver, where production on animated features, including The Lego Movie 2: The Second Part, has been taking place. A few other Australian studios are also part of larger global entities. Method Studios in Australia (formerly Iloura) has made a concerted effort to maintain a style of effects work that has kept the company operating for more than 20 years out of Melbourne and now also in Sydney. “While we’re part of a global company with locations beyond Australia that gives us the opportunity to work on bigger projects, we work hard to maintain our boutique style of intimate service to our clients,” states Method Studios Visual Effects Supervisor Glenn Melenhorst, who has headed up the company’s work on projects such as Game of Thrones, Jumanji: Welcome to the Jungle
TOP: One of Weta Digital’s major films released in 2018 was Avengers: Infinity War. The studio produced a CG Thanos, among other effects. (Image copyright © 2018 Marvel) MIDDLE: Weta Digital Executive Producer David Conley BOTTOM: Artists at work at Weta Digital in Wellington.
A-Z: Top Australia and New Zealand VFX Studios A quick guide to just some of the most active visual effects studios based in Australia and New Zealand.
Alt.vfx. A boutique commercials house located primarily in Brisbane that gained renown for a popular TVC for beer brand Tooheys that utilized CG deer. Other commercials it has worked on for overseas clients, including Japanese Pepsi spots, have generated a huge following. www.altvfx.com
Animal Logic. This Sydney-based studio, which now also has an office in Vancouver, has been operating for more than 25 years. It began in traditional VFX and continues that work with projects like Peter Rabbit and several Marvel films, as well as working in animated features such as the LEGO movies, while also developing its own IP. www.animallogic.com
Assembly. Auckland’s Assembly is strong on character animation for commercials and promos, but also regularly delivers traditional visual effects work. www.assemblyltd.com
TOP: Method Studios delivered CG rhinos for a dramatic scene in Jumanji: Welcome to the Jungle. (Image copyright © 2017 Sony Pictures) MIDDLE LEFT: Head of Mill Film Lauren McCallum MIDDLE: Method Studios Visual Effects Supervisor Glenn Melenhorst MIDDLE RIGHT: Luma Pictures VP and Head of Production Vince Cirelli BOTTOM: Inside a review suite at Method Studios in Melbourne. OPPOSITE BOTTOM: Luma Pictures worked on this chase sequence in Black Panther. (Image copyright © 2018 Marvel)
Blockhead. Blockhead has Auckland and Sydney offices, and works predominantly in commercials while also representing several colorists. www.blockheadvfx.com
Cause+FX. Auckland’s Cause+FX mixes film, television and commercials work. It had particular success realizing massive battles and fight scenes for the Spartacus TV series. www.causefx.nz
Cutting Edge. A long-time player in the Australian post-production industry, Cutting Edge is one of the few studios that works across editing, visual effects and grading. www.cuttingedge.com.au
Digipost. Digipost hails from Auckland and has been in operation since 1990. Some of its stand-out work in recent years includes the Spartacus TV series and a number of feature films and other television shows. www.digipost.co.nz
Fin. Sydney-based Fin is a small studio with a heavy bent on design in commercials and stand-alone film VFX sequences. www.findesign.com.au
Luma Pictures. After opening a Melbourne office to join its Santa Monica operation, Luma Pictures has specialized in providing high-end VFX work for stand-out sequences in major effects films, including Black Panther, Doctor Strange and Thor: Ragnarok. www.lumapictures.com
Method Studios. Formerly Iloura – a studio with a 20-year history in Melbourne – Method Studios now operates in Sydney and Melbourne (as well as several locations around the globe). It contributed to such projects as Game of Thrones, Jumanji: Welcome to the Jungle and Mad Max: Fury Road. www.methodstudios.com
Plastic Wax. A specialist cinematics studio, Plastic Wax in Sydney works not only in executing ‘cut scenes’ and promo trailers for games, but also in story development and motion capture for them. www.plasticwax.com
Resin. Resin is one of the few VFX studios in Adelaide, and has recently ramped up its television work on such shows as Tidelands, The Tick and Electric Dreams. www.resin.tv
Rising Sun Pictures. Adelaide-based Rising Sun Pictures is well known for one-off VFX magical moments in the films it works on, most spectacularly the slow-motion Quicksilver sequences in X-Men: Days of Future Past and X-Men: Apocalypse. www.rsp.com.au
Weta Digital. A powerhouse visual effects studio, Wellington’s Weta Digital has amassed a quarter century of iconic VFX work, ranging from Lord of the Rings to the Planet of the Apes reboot. Its sister company, Weta Workshop, is also iconic in the practical and miniature effects world. www.wetafx.co.nz
and Christopher Robin. “We’re open to all kinds of work, with a definite leaning toward character animation. We’re equally comfortable being the sole vendor on a smaller creature show or a small part of a large film.” Luma Pictures, in Melbourne, also has a second office in Santa Monica, where it began. The studio opened in Australia in an effort to make a controlled expansion, according to Luma VP and Head of Production Vince Cirelli. “The positive aspects of having an office in Australia are that the government is very supportive of the film industry, the market isn’t heavily saturated – there are only a few other facilities in the area, Melbourne is just an amazing city
TOP LEFT: Inside the Luma Pictures office in Melbourne. TOP RIGHT: A scene from Peter Rabbit. Animal Logic crafted the CG characters and also produced the film out of its Animal Logic Entertainment outfit. (Image copyright © 2018 Sony Pictures) MIDDLE: Animal Logic’s Head of Production Ingrid Johnston. BOTTOM: A Peter Rabbit shot is set up for shooting. Animal Logic would later add the CG rabbits and other animals. (Image copyright © 2018 Sony Pictures)
to live in, and the quality of life is wonderful. It would have been in some ways easier to open a facility in the same time zone as our Los Angeles-based office, but the advantage of having a facility in Melbourne is that we’re able to chase the sun, so to speak, and work on projects for our clients around the clock. “The most challenging aspect to having a facility in Australia is the difficulty that comes with recruiting,” adds Cirelli. “Because you’re on an island, a good portion of your labor is foreign to the country. What’s good about Luma’s structure is that we’re a staff-based model. We don’t just hire on a project basis, we keep our artists long term, which allows us to grow very organically.” Over the Tasman Sea, in New Zealand, Weta Digital is currently embarking on several Avatar sequels – and occasionally also faces staffing issues. “As a small country with a limited local market for film VFX artists, the growth of the education models and training infrastructure has been slow and steady,” notes David Conley. “We continue to increase our engagement at the secondary school, university and post-grad levels to make sure we are providing specific feedback that can help prepare students for entering the market. For more experienced artists, we recruit like most larger facilities do. The industry itself is pretty small, so getting the word out tends not to be the issue, it’s about finding the right fit.” As noted, there are myriad smaller visual effects studios based in Australia and New Zealand that concentrate on TVC (television commercials), episodic and promo deliveries for local and overseas clients. Brisbane-based Alt.vfx Co-founder and Visual Effects Supervisor Colin Renshaw is adamant that this kind of work in the region is highly regarded. “I’ve just been part of the Film Craft jury at Cannes,” he says, “and I can tell you that craft-wise, the work coming out of Australia can hold its head high alongside the work from any country in the world.
We have world-class artists in all disciplines, from advertising to film. There’s a good number of large VFX and animation studios working in film, drawing work and artists from all over the world, and in advertising, we are doing the same. Of course, budgets have always been and continue to be an issue, but as a company, we have always had an international outlook, and that has allowed us to grow our reputation in other markets outside of Australia and New Zealand.”
A TIME FOR CHANGE
VFX Training Puts You in the Studio As the VFX industry in Australia ramps up, so too does the availability of training courses in the country. One institution offering something a little different is the University of Technology Sydney, which formed a collaboration with local studio Animal Logic to create the UTS Animal Logic Academy (UTS ALA, www.animallogicacademy.uts.edu.au) and to offer a one-year Master of Animation and Visualization degree. The big difference from a normal higher education course is that UTS ALA runs much more like a real VFX or animation studio. “There are no lectures, there is no homework,” explains Creative Lead Chris Ebeling. “And there aren’t any individual assignments or projects. We all work together on one production, and like any studio, have our artists split across a range of disciplines, including R&D and studio production roles.” UTS ALA had its first intake in 2017, with students so far embarking on a short animated project, a VR experience and a mixed-reality platform. The idea has been to give students – who come from all kinds of disciplines, including fields other than animation or visual effects – a wide range of experience in what is a diverse industry. “We like to think that we are giving our crew a head start,” says Ebeling. “Not only will our students be able to hit the ground running in any major studio, they will also have the aptitude, knowledge and experience in working with emerging technologies and how to problem-solve and create innovative content here.”
TOP: A recent project produced at UTS ALA involved designing, modeling, animating and producing effects for some killer robots.
An increase in the incentives offered through South Australia’s Post Production, Digital and Visual Effects rebate impacted Technicolor’s decision to open Mill Film’s facility in Adelaide (a branch will also open in Montreal). “Technicolor saw an opportunity to access VFX talent that is already in Australia and provide another hub from which to service Technicolor’s global clients,” says Global Head of Mill Film Lauren McCallum. “The facility in Adelaide expands Technicolor’s footprint in the region, while enhancing the support it can provide clients who require world-class talent to work on productions around the globe – and around the clock.” McCallum says Technicolor, which will also operate a Technicolor Academy VFX training program out of the Adelaide facility, is on target to open the Australian studio in the fall of 2018. “Over the next five years it’s our intention to grow Mill Film Adelaide from a core team up to 500 production staff, technologists and creatives,” she adds. The impact this will have on the Australian and New Zealand visual effects industries remains to be seen. But with a clear increase right now in visual effects production coming from comic book films, animated productions and streaming episodic television, VFX down under looks set to continue to be busy.
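As a back-of-the-envelope illustration of the headline incentive rates quoted earlier in this article – up to a 40% combined rebate in Australia, up to a 20% cash grant in New Zealand – the sketch below applies those rates to a hypothetical qualifying spend. The spend figure is invented, and real incentive calculations depend on detailed qualifying-expenditure rules; this is arithmetic only.

```python
# Illustrative only: headline rebate rates applied to a hypothetical
# qualifying spend. Actual eligibility and qualifying-expenditure rules
# are far more involved (see ausfilm.com.au and nzfilm.co.nz).

def net_cost(qualifying_spend, rebate_rate):
    """Spend remaining after the incentive rebate is returned."""
    return qualifying_spend * (1.0 - rebate_rate)

SPEND = 10_000_000  # hypothetical qualifying VFX expenditure

australia_net = net_cost(SPEND, 0.40)    # up to 40% combined state + federal rebate
new_zealand_net = net_cost(SPEND, 0.20)  # up to 20% cash grant
```

On these assumptions, a 10-million spend nets out at 6 million in Australia and 8 million in New Zealand – the kind of difference that shapes decisions like Technicolor’s Adelaide opening.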
BOTTOM LEFT: Students, otherwise known as ‘crew,’ at the UTS Animal Logic Academy. BOTTOM RIGHT: Chris Ebeling
Hello and Review, from the Other Side of the World When Adelaide-based VFX studio Rising Sun Pictures was simultaneously working on Superman Returns and Harry Potter and the Goblet of Fire, it needed to send shots for remote review to clients in U.S. and U.K. time zones. But just sending the files required mountains of ‘explanation’. So, over a weekend, the studio mashed up a prototype combining a chat room with a QuickTime player. It would ultimately become cineSync, the popular remote review and collaboration tool, and an example of the many tech innovations coming out of Australia. CineSync, developed by Cospective, is used by every major U.S. film studio, the vast majority of major TV networks and every major VFX facility in the world. “It’s become part of the furniture in post-production, although what constitutes ‘post-production’ these days is up for debate,” says Cospective CEO Rory McGregor. “Now that VFX supervisors are involved from previs, cineSync’s role in the life of a project continues to expand.” McGregor adds that cineSync’s development in Australia is apt, given its long distance from some of the central filmmaking hubs around the world. “Being based in Australia, we have an innate understanding of working remotely,” he says. “We adopted a global mindset from the very beginning – and because we develop and sell our software from far-flung Australia, we meet remote communication challenges every day.”
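The core idea behind that ‘chat room plus player’ combination – keeping every reviewer locked to the same frame of the same shot, with notes attached to frames – can be sketched in a few lines. This is purely a conceptual illustration; cineSync’s actual protocol is proprietary, and all names below are hypothetical.

```python
# Conceptual sketch of synchronized remote review, in the spirit of the
# "chat room + QuickTime player" prototype described above. Not cineSync.

class ReviewSession:
    """Keeps every connected reviewer on the same frame of the same shot."""

    def __init__(self, shot_name, frame_count):
        self.shot_name = shot_name
        self.frame_count = frame_count
        self.current_frame = 1
        self.reviewers = []    # callables that receive state updates
        self.annotations = {}  # frame -> list of (author, note)

    def join(self, on_update):
        self.reviewers.append(on_update)
        on_update(self.shot_name, self.current_frame)  # sync the new arrival

    def goto(self, frame):
        # Clamp to the shot's range, then broadcast so every player
        # scrubs to the same frame simultaneously.
        self.current_frame = max(1, min(frame, self.frame_count))
        for notify in self.reviewers:
            notify(self.shot_name, self.current_frame)

    def annotate(self, author, note):
        self.annotations.setdefault(self.current_frame, []).append((author, note))


# Usage: a supervisor in Adelaide and a client in L.A. share one session.
seen = []
session = ReviewSession("sh010_comp_v004", frame_count=120)
session.join(lambda shot, frame: seen.append(("adelaide", frame)))
session.join(lambda shot, frame: seen.append(("la", frame)))
session.goto(57)
session.annotate("supervisor", "Lift the fire glow on the left wall")
```

In a real tool the broadcast would travel over the network, but the design point is the same: state lives in the session, and players are passive followers, so reviewers on opposite sides of the world always discuss the identical frame.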
TOP: Alt.vfx’s CG panda for a ‘Tile’ TVC. (Image copyright © 2017 Tile) MIDDLE LEFT: Alt.vfx co-founder and Visual Effects Supervisor Colin Renshaw.
RIGHT: Rory McGregor, CEO of Cospective, which develops cineSync. BOTTOM: A cineSync frame from the production of Pacific Rim Uprising. (Image courtesy of Cospective)
MIDDLE RIGHT: Artists at the Alt.vfx office BOTTOM: A still from the visual effects work by Rising Sun Pictures for X-Men: Apocalypse. (Image copyright © 2016 20th Century Fox)
FILM
MEETING THE CHALLENGE: HOW WOULD YOU SOLVE THREE BIG VFX PROBLEMS? By IAN FAILES
TOP: Barson won a VES Award for Outstanding Supporting Visual Effects in a Broadcast Program for Game of Thrones in 2012. (Image copyright © 2011 HBO)
It’s often said that visual effects is about problem solving – how can the best shot be achieved in the time and budget allowed, with all the many other variables that tend to get thrown in during production? What exactly does problem solving in visual effects involve? For insight into the process, VFX Voice asked three supervisors how they might tackle a particular visual effects scenario. They were asked what first steps they’d take, how they’d actually plan the effects work, and what to consider during any live-action shoot. While there were no scripts or story treatments to go from – usually the source of many answers (and perhaps further problems) for VFX supervisors – each scenario presented the common issues of VFX problem solving: Where do you look for reference? Should the approach be practical or CG? How would the shots be bid out? How do you collaborate with the other filmmakers? Read on for fun and frank discussion from seasoned Visual Effects Supervisors Angela Barson, Adam Howard and Brendan Taylor.
Scenario No. 1: A firefighter races through a house – ablaze and collapsing – in search of a child inside. Angela Barson is a co-founder of London VFX studio BlueBolt. She won a VES Award for Outstanding Supporting Visual Effects in a Broadcast Program for Game of Thrones in 2012.
First steps: This type of scenario is a great example of how the visual effects supervisor needs to work very closely with other departments. Communication and planning are key. It’s critical to have an early discussion with the special effects supervisor, stunt coordinator, production designer, director and producer. Any sequence involving fire has a level of complexity and safety requirements beyond a normal shoot. Putting a child into the mix increases that complexity. Initially, it’s important to know what the location is, how much fire, smoke and destruction is wanted, and how close all of this happens to the actors. Storyboards would be a huge help, even at this early stage. Anything which involves SFX, VFX and stunts always needs careful planning, and visuals will make sure everyone is imagining the same scenario.
Planning the VFX: The best results will always be achieved using practical fire, which means the amount of practical vs. digital will mostly come down to safety. Where practical fire can’t be used, interactive lighting would be needed so the VFX fire looks like it’s affecting the environment. The amount of smoke is also a consideration. Depending on the location, practical smoke may not be allowed. Ideally, the smoke would be mostly practical with more added in post where needed. The collapsing building also needs to be considered: which pieces collapse, what is revealed behind, how close it is to the actors, and whether additional fire and sparks are thrown up. Burning embers and ash help make the scene look hotter and more dangerous.
TOP: This scene from Deadpool could serve as inspiration for the fire scenario. Here, a scene of Wade Wilson (Ryan Reynolds) caught in a factory fire was filmed without much real fire. (Image copyright © 2016 Marvel)
BOTTOM: Rodeo FX would add fire to the shots from their own effects elements or CG simulations, ensuring that they managed to show individual parts of the building actually burning, rather than just placing the elements over the top. (Image copyright © 2016 Marvel)
BOTTOM: Angela Barson
projects including Hurricane Heist, Birdman or (The Unexpected Virtue of Ignorance) and Cosmos: A Spacetime Odyssey. He has four Emmy awards for VFX related to various Star Trek TV series, and was also nominated for a VES Award for Outstanding Visual Effects in a Special Venue Project for Harry Potter and the Forbidden Journey.
TOP: Framestore’s visual effects for Gravity dealt with many aspects of space scenarios. A liveaction shoot with the actors shot with partial spacesuits and props against pre-animated LED panels helped inform light interaction, with the studio producing digital suits, space structures and reflections. (Image copyright © 2013 Warner Bros.) MIDDLE: Adam Howard BOTTOM: For Interstellar, more in-camera effects for cockpit space scenes were relied upon, including rear-projection screens for outside the spacecraft, and wire work for weightlessness scenes. (Image copyright © 2014 Warner Bros.)
These are relatively straightforward to add in post, so it’s worth considering doing all or most of this in post rather than adding an additional layer of complexity to the shoot. Instead of creating CG fire and smoke, it would be worth doing an element shoot of practical fire which can be added to shots. Even if CG fire and smoke are needed, it’s always good to be able to mix this with practical elements. Depending on the location, rather than just shooting simple elements, it’s good to create some structures that can be burnt to give form to your elements – window frames, beams of wood, etc. Smoke elements can also be created in confined spaces so they react correctly with the shape of the rooms. Bidding: For bidding, the sequences would get split into easy, medium and hard shots, with an estimate of the number of each type of shot needed. The script pages are a good guide for this. If there are storyboards, then the breakdown can be done against the boards which would give a far more accurate estimate. For each shot it should be noted what additional VFX work is required – fire, smoke, heat haze, embers, falling debris. If this isn’t accounted for at the early bidding stage, costs will increase when in post. During the shoot: When fire is involved, it’s always worth shooting additional reference footage of any practical fire that’s used on set. If this is shot as if for an element (underexposed, locked off, full frame), then it can be useful as additional elements to add into the final shot. There is often a desire to add heat haze in front of the lens. This is fine to do in non-VFX shots, but where you know you have to do any VFX work, it’s best to add any heat haze effects in post. Otherwise you’re battling with moving distortion on the shot, which obviously makes things very difficult. Scenario No. 2: In space, several astronauts aboard the Space Shuttle conduct experiments, enter data into tablets, view the Earth below, and go about their daily routines. 
Adam Howard is a freelance visual effects supervisor with recent projects including Hurricane Heist, Birdman or (The Unexpected Virtue of Ignorance) and Cosmos: A Spacetime Odyssey. He has four Emmy Awards for VFX related to various Star Trek TV series, and was also nominated for a VES Award for Outstanding Visual Effects in a Special Venue Project for Harry Potter and the Forbidden Journey.

76 • VFXVOICE.COM FALL 2018
First steps: After reading the script and coming up with a few visual ideas, it’s important to get to know the director and the way he or she thinks. During the production the VFX supervisor can be working very much as the director’s right hand and extra set of eyes, and you need to be able to communicate very clearly from the beginning. After meeting with the director, I meet with the DP to get his or her ideas and concepts. The process of creating is very much one of teamwork, so early discussions set the groundwork for great communication throughout the production.

Planning the VFX: I would come up with lots of questions as script, casting and production design are all fleshed out:
- Does the production designer plan on building a full interior and exterior set of the Space Shuttle, or will he or she build partial sets and have the VFX department extend them digitally?
- Does the director plan on doing previs for this sequence? If so, does he or she want it run through VFX or handled as a standalone department with input from all department heads?
- What is the design of the spacesuits, and are there elements of them that will require VFX augmentation?
- How do we handle reflections of the surrounding space on spacesuit visors?
- Do we go with practical LED panel reflections, or do we add the visors in post to help speed up the production schedule and make the shoot more comfortable for the actors?
- What does the stunt department have in mind regarding flying actors in the interior and exterior sets for zero-gravity shots?
- What will we be seeing outside the Space Shuttle windows? In this case it is Earth, but might we be seeing other areas of space represented on screens or data tablets, and will that ‘look’ be coming from the production design department or from VFX (or both)?

Bidding: Bidding is a pretty standard process of breaking down exactly what will be required within each shot and then assigning line-item prices to that.
Once we get numbers back, I work with the VFX producer to decide which houses we will go with and which work will go where. In most cases the VFX budget tends to have been predetermined by production and the studio. This can be a challenge when the director has a specific vision that might not be able to be accommodated within the predetermined budget. But part of the job is helping the director get what he or she wants while staying within the budget and delivery schedule.

During the shoot: Having been a VFX compositor and animator for nearly 30 years, I am particularly sensitive to the way live action and elements are shot. When possible, my aim is always to make as little unnecessary cleanup work as possible for the VFX crews so that they can really concentrate on the job at hand of making a beautiful and believable shot. So: clean greenscreens, minimal roto (unless it is absolutely needed), accurate camera data and really good, clear descriptions of exactly what will be required in the final shot.

Scenario No. 3: Twin brothers fight atop a moving train as it leaves a major city, crosses the countryside and passes through various tunnels.

Phone a Friend
You never need to be alone in VFX problem solving, suggests Mavericks VFX Founder/CEO Brendan Taylor. When working through the ‘twins fighting atop a train’ scenario, he decided there was no need to re-invent the wheel in how the twins part of the sequence might be achieved. So he rang Visual Effects Supervisor Geoff Scott, who had regularly worked on cloning shots for Orphan Black, where the same actor played different characters in the same scene.

“Geoff said that a lot of the success of the clone sequences [twinning, he calls it] relies on the actor who is playing the double on set who will be replaced,” relates Taylor. “If the double is a good match and a good performer, the shot has a much better chance at succeeding. A quick breakdown on how to do twins shots: you shoot the actor as one character and the double as the other character, either in a lock-off or in motion control. Then you shoot the actor in the double’s spot and do the whole thing over again. Then you roto out the actor in one plate and comp them into the other.

“Geoff also said that the very first clone shot they did on Orphan Black was outside, and because you don’t have control over the way the sun moves during the day, by the time you get to swapping the actors, the light has completely changed. Then you have to composite an actor who is lit left to right into a plate that is lit right to left. It will never match. So they never did it outside again.”

TOP: A scene from Orphan Black shows the characters Sarah and Cosima, both played by Tatiana Maslany. (Image copyright © 2016 BBC America)
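The twinning recipe Scott describes amounts to a matte-driven split-screen composite: two aligned passes of the same lock-off, blended by a roto matte. A minimal sketch, assuming the two passes are already-aligned numpy image arrays (the toy frames and names here are illustrative, not any studio’s pipeline):

```python
import numpy as np

# Matte-driven split-screen "twinning" composite: two lock-off passes of
# the same plate, with the lead actor playing a different twin in each.
# plate_a / plate_b are aligned HxWx3 float images; matte is an HxW roto
# matte (1.0 where we keep pass A, 0.0 where we keep pass B).

def twin_comp(plate_a, plate_b, matte):
    m = matte[..., None]  # broadcast the matte over the color channels
    return m * plate_a + (1.0 - m) * plate_b

# Toy frames: pass A is solid red, pass B solid blue; the matte keeps
# the left half of A and the right half of B.
h, w = 4, 8
a = np.zeros((h, w, 3)); a[..., 0] = 1.0   # "actor as twin #1"
b = np.zeros((h, w, 3)); b[..., 2] = 1.0   # "actor as twin #2"
matte = np.zeros((h, w)); matte[:, : w // 2] = 1.0
out = twin_comp(a, b, matte)
```

Scott’s warning about shooting outside follows directly from this blend: if the lighting changes between passes, the seam along the matte edge can never match.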
ABOVE: Brendan Taylor TOP: A production still from season 2 of The Handmaid’s Tale. Taylor’s Mavericks VFX worked on shots located in Fenway Park, where they extended plate photography to include a digital stadium. (Image copyright © 2018 Hulu) OPPOSITE: Taylor considered the carefully choreographed train fight in Skyfall as inspiration for his scenario. (Image copyright © 2012 Sony Pictures)
Brendan Taylor is the Founder and CEO of Mavericks VFX in Toronto, where he recently supervised visual effects for The Handmaid’s Tale, Man Seeking Woman and The Light Between Oceans. Taylor was nominated for an Emmy for Outstanding Special Visual Effects in a Supporting Role for 11.22.63.

First steps: The first thing I do is research. Usually, it starts with my own experience, but quickly evolves into looking at other movies. I mean, there are a lot of really smart people who have come before me, so why not learn from them and apply those perspectives to my own ideas? You are really looking for examples where you think it worked, but also where it didn’t work. For example, for this particular scene, I would look at Indiana Jones and the Last Crusade, Skyfall, The Wolverine and The General for the train sequence, and at Moon and Orphan Black for twins-related VFX.

Planning the VFX: I’d try and do the whole thing on a real moving train. You could replace the background to be whatever you want (to a degree). You also would have to allow for some close-ups against greenscreen because some people are uncomfortable on the tops of trains. I’d probably recommend that these be really close (take another look at Skyfall). But here’s the problem: the chances of getting a major city to agree to allow you to shoot something like that are less than zero. No city employee wants to be on the receiving end of the following headline: ‘A-list movie star and all the crew, including craft services, fall to their deaths off a moving train.’ It’s a lot easier – I know this from experience – to rent a section of track in a rural or even private area and shoot there than it is to try and shoot in a city.

But that still doesn’t solve the ‘leave from the city’ problem. You can’t replace the background here (i.e., shoot in a rural area and replace the background with buildings), because the big buildings should cast shadows on the actors, and the roto and light bleed would be brutal and maybe even unworkable. So I would break it into two sections: 1. city, and 2. rural. In the city, it would be second-unit shots of the train (mostly wide) where the action would be a combination of comping wide greenscreen shots, full CG actors and close-up greenscreen shots. In the rural section, it would be mostly practical train with a few greenscreen close-ups. It is really up to the supervisor to convince everyone to get out of the city as quickly as possible to get to the practical train stuff. I’ve found using the words ‘expensive’ and ‘time-consuming’ works well with producers, while ‘laughable’ and ‘very CGI’ works well with directors. In all seriousness, they need your help to inform them of what will look best. It really can be an intimidating venture and you need to help them through this.

Also, talk to the director. Ask what is more important: the reality and danger of the scene, or showing that they are twins? If it’s the reality and danger, then outside it is. If it’s that they are twins, then it’s in a controlled environment. I still stand by the idea that a lot of this needs to be as real as possible. I’d say shoot it on a train with doubles and face replacements where necessary. For the hero moments you’d use greenscreen and shoot with a ‘twinning’ method. (See sidebar.)

Bidding: The process for bidding is usually two rounds; the first one is all ‘blue sky’ – as if we have all the money in the world. The second round usually starts with a phone call: ‘Hi, I don’t know how to say this, but we have one third of what you bid.’ And then you go through the ‘fun’ process of figuring out how to get great stuff for less money. It’s a tough slog. All of the computers, software, people, rent and insurance cost a ton of money. If you want good-looking effects, you have to pay for them. But not every production has a ton of money. As long as you are transparent, you can usually find smart ways of doing things well.

During the shoot: I carry my still camera around my neck at all times. It’s a bit of a security blanket. If I ever feel like something isn’t going to work, or I feel uncomfortable, I just start taking photos. It’s also grist for the mill. You can accumulate a lot of material that will be useful for matte paintings, textures and reference. I am also obsessive about HDRIs. If there is the smallest inkling that there will be CG, I’ll do an HDRI. The most important thing is probably to be in sync with the cinematographer and director and to establish a connection. You are going to be working together for the next few months.
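An on-set HDRI is typically built by merging bracketed exposures into one linear radiance map. As a rough numpy sketch of that merge (a hand-rolled weighted average in the spirit of standard HDR assembly, assuming linear frames; the synthetic gray-card frames are purely illustrative):

```python
import numpy as np

# Merging bracketed exposures into a linear HDR radiance map, in the
# spirit of on-set HDRI capture. Assumes linear (gamma-removed) frames;
# a hat weight downweights clipped and near-black pixels.

def merge_hdr(frames, exposure_times):
    """frames: list of HxW(x3) float arrays in [0, 1]; times in seconds."""
    num = np.zeros_like(frames[0], dtype=np.float64)
    den = np.zeros_like(frames[0], dtype=np.float64)
    for img, t in zip(frames, exposure_times):
        w = 1.0 - np.abs(2.0 * img - 1.0)  # hat weight: trust mid-tones
        num += w * (img / t)                # per-frame radiance estimate
        den += w
    return num / np.maximum(den, 1e-8)

# A gray card of constant radiance shot at three shutter speeds;
# the longest exposure clips, and the merge recovers the true value.
times = [1 / 250, 1 / 60, 1 / 15]
frames = [np.clip(0.5 * 60 * t, 0.0, 1.0) * np.ones((2, 2)) for t in times]
hdr = merge_hdr(frames, times)
```

The clipped frame gets zero weight, which is exactly why the bracketed set matters: any single exposure loses either the highlights or the shadows that lighting artists need.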
FILM
A WILD RIDE WITH FALLEN KINGDOM’S GYROSPHERE By IAN FAILES
Blockbuster films are often described as rollercoaster rides, but J.A. Bayona’s Jurassic World: Fallen Kingdom contains a literal rollercoaster scene. Claire and Franklin, played by Bryce Dallas Howard and Justice Smith, must escape the erupting Isla Nublar while inside a gyrosphere, narrowly avoiding consumption by a dinosaur, only to launch off a cliff into the sea below before being rescued by Chris Pratt’s Owen Grady. VFX Voice finds out from Industrial Light & Magic Visual Effects Supervisor David Vickery, who shared duties with fellow Supervisor Alex Wuttke, how this Fallen Kingdom sequence came together via a carefully constructed gyrosphere set piece, an actual rollercoaster-like track, an elaborate underwater set, CG dinos and an exploding volcano.

Plate photography for the on-land gyrosphere moments was filmed in Hawaii, following storyboards and previs devised by Proof. The filmmakers aimed to acquire as much practical imagery as possible, and that resulted in the construction of several gyrosphere rigs, including a motion-controlled version that existed on the back of a truck on a pole arm. “It had three different axes of freedom of movement,” explains Vickery. “We could program these moves so that, if it was being hit by a T. Rex tail or a Sinoceratops head, the arm could swing and the gyrosphere could rotate and roll. We’d strap in Bryce and Justice and roll them over and upside down.”

Shots of the gyrosphere traveling along were achieved with the set piece on a ‘biscuit rig’ towed by a vehicle that could drive at 15 mph. “One thing I really enjoyed about this was that we didn’t have [the gyrosphere] going super smooth – it was bouncing up and down all over the terrain,” says Vickery. “There was still a lot of work to paint out that rig, and then we needed to add the glass and use our Lidar scan and HDRIs to put in reflections. But what we got was our actors in absolutely correct lighting, a camera that was forced to work to get the shots, and for it to not feel like it was a completely detached, clean, smooth CG shot.”

The gyrosphere navigates past several dinosaurs and flaming lava rocks that have been shot out of the volcano. ILM manipulated Hawaii plate photography to add in an erupting mountain with clouds of smoke and explosions, care of the studio’s proprietary Plume toolset. Special Effects Supervisor (U.S. Photography) Mike Meinardus laid down 30 to 40 pyro pots that could be triggered to explode along the gyrosphere’s path, along with smoke tubes and some fire bars. ILM would then add in digital lava bombs and match the practical explosions with CG ones. After an encounter with a Carnotaurus (and a T. Rex savior), the gyrosphere continues on its path to the cliff face.
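A programmed multi-axis move like the one Vickery describes can be thought of as composed rotations: the arm’s swing and the sphere’s roll and pitch multiply together into one orientation per frame. A toy illustration only – the axis order and angles here are invented, not the actual rig’s configuration:

```python
import numpy as np

# Toy version of keyframing a multi-axis gimbal move: the pole arm's
# swing and the sphere's roll/pitch compose as rotation matrices, so a
# programmed "T. Rex tail hit" is a sequence of per-frame angle triples.

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def sphere_orientation(swing, roll, pitch):
    # arm swing about Z, then sphere roll about Y and pitch about X
    return rot_z(swing) @ rot_y(roll) @ rot_x(pitch)

R = sphere_orientation(np.pi / 6, np.pi / 4, 0.0)
```

Because each frame’s orientation is known in advance, the same curves that drive the rig can later help match-movers reconstruct the motion for paint-out and reflection work.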
To get the right kind of reaction from the actors as the object launches into the sea, Production Special Effects Supervisor Paul Corbould built a section of rollercoaster track for a partial gyrosphere to slide down. A digital half of the shot was also required to fill out the sphere and surrounding environment.
OPPOSITE TOP: Inside the gyrosphere, Claire and Franklin come face to face with a Carnotaurus. OPPOSITE BOTTOM: The T. Rex saves Claire and Franklin – for now – but their journey continues off the edge of the cliff into the sea. TOP: The gyrosphere was a real set piece that could be driven along during filming in Hawaii. BOTTOM: The final shot with visual effects from ILM included the glass dome for the gyrosphere, removal of the vehicle rig for the sphere, and added dinosaurs, lava rocks and augmentations to the environment.
“One thing I really enjoyed about this was that we didn’t have [the gyrosphere] going super smooth – it was bouncing up and down all over the terrain. There was still a lot of work to paint out that rig, and then we needed to add the glass and use our Lidar scan and HDRIs to put in reflections.” —David Vickery, Visual Effects Supervisor, Industrial Light & Magic
TOP: Prior to the large-scale eruption of the island, a team of dinosaur conservationists attempt to rescue some of the animals. This frame shows the plate photography for a scene on the main street. BOTTOM: The final shot with CG architecture and digital dinosaur. OPPOSITE TOP: Fallen Kingdom director J.A. Bayona with a practical Indoraptor dinosaur during filming. This Jurassic World film had a much larger practical component than the previous movie. OPPOSITE BOTTOM: Visual Effects Supervisor David Vickery from ILM
For shots of the characters now underwater, another elaborate practical build enabled the gyrosphere to be submerged – with the actors – in a tank. “We thought,” recalls Vickery, “‘How can we try and shoot this in as coherent an environment as possible, without ending up with an element of Chris in a tank and an element of Justice and Bryce underwater in a gyrosphere or on a bluescreen in a dry-for-wet environment?’ Paul Corbould built a fully watertight gyrosphere that we submersed under the water.”

The underwater rig had some unique features. The gyrosphere itself – which featured a specially blown glass dome – was 10% larger than it would otherwise have been, which gave the camera and the camera operators more space.
It could also rotate to deal with different camera moves and it could be flooded with water via vents. For the visual effects team, there were a number of major challenges for the underwater shots. These included adding the environment outside the gyrosphere, drowning dinosaurs and lava bombs, and cracks and fissures that appear in the glass (inside and directly outside the sphere, it was all practical water). Also, part of the sequence appears to happen as an uncut ‘oner’. However, it was actually filmed in five separate pieces and then stitched together. The director himself suggested one way to blend plates was to use a ‘water wipe’ device. “We often used the device where we might pan off into the water above, and that’s an opportunity to cut because it’s basically a digital frame at that point,” outlines Vickery. “But even then, we would try to bring in elements like Claire’s arm when it reaches up to the top of the gyrosphere. That beat is digital, but that was to try to maintain a visual connection with the actors. “One of the things that J.A. was incredibly keen on doing was making sure we could get as much physical stuff in there on that day [of shooting as we could],” reflects Vickery on the entire sequence. “It was all about as much practical gyrosphere, as much practical tree, landscape and environment, and underwater shots [as possible]. I feel like we really achieved that.”
VFX VAULT
PAINTING THE AFTERLIFE IN WHAT DREAMS MAY COME By IAN FAILES
Perhaps the ultimate goal of a visual effects artist is to be involved with a project that is art and spectacle at the same time. Vincent Ward’s 1998 film What Dreams May Come certainly represented both, particularly with its ‘painted world’ version of the afterlife, as visited by Robin Williams’ character, Chris, after the death of his wife, Annie (Annabella Sciorra). The visual effects for the painted world of What Dreams May Come would take the concept of optical flow – where every pixel is tracked in a moving image – to new levels in order to turn live-action scenes into paintings in motion. This work, and other painterly imagery in the film, would ultimately result in an Academy Award for Best Visual Effects (awarded to Nicholas Brooks, Joel Hynek, Kevin Mack and Stuart Robertson). On the 20th anniversary of What Dreams May Come, VFX Voice sat down with one of its visual effects supervisors, Nicholas Brooks, to find out more about the R&D behind the optical flow-based painterly world and how it was executed. VFX Voice: What were you doing right before you started on What Dreams May Come? Nicholas Brooks: Well, it certainly was an interesting time. I was at Mass.Illusions at the time. We got the scripts for The Matrix and What Dreams May Come at the same time. So those two films were linked, basically. The bullet time of The Matrix and the painterly world of What Dreams May Come were part of the same R&D effort. VFX Voice: Can you talk about some of the initial conversations with the director and this R&D effort that followed for the painterly world? Brooks: Vincent felt that heaven had to be something special. He didn’t want to rely on golden light and white sets. So he came up
with the concept of making the character Annie a painter – and art would be the way that we’d know we were in heaven. Then Chris, Robin Williams’ character, walks into heaven and sees his wife’s heaven and they’re connected through the painting. To do that, Vincent explored re-touching the live-action photography. He was looking at romanticist painter Caspar David Friedrich’s work for inspiration, particularly the painting called ‘Two Men Contemplating the Moon.’ Right at that time I’d been working with Kodak’s Cineon software and they’d just released Cinespeed, which was the first optical flow re-timer. From that, we came up with this wacky idea of doing ‘machine vision’ tracking of entire plates. We did some tests initially with painterly filters, which looked good on still frames, but they became a flickering mess when applied to a series of images. The idea became, why don’t we try to use optical flow to drive a paint system? VFX Voice: Can you explain what optical flow meant, in terms of the way you wanted to use it?
OPPOSITE TOP LEFT: Robin Williams plays Chris in What Dreams May Come. OPPOSITE TOP RIGHT: Robin Williams during filming of a scene where his character, Chris, encounters his old dog, Katie. Optical flow was used to make the flowery setting a painted world. OPPOSITE BOTTOM LEFT: Director Vincent Ward (left) and Robin Williams on set. TOP: Chris and Albert (Cuba Gooding Jr.) come across the Purple Tree, matching a newly painted tree crafted by Chris’s wife, Annie. Digital Domain ‘grew’ the tree with L-systems techniques. The studio was one of several vendors on What Dreams May Come, along with principal vendor Mass.Illusions, and POP Film, CIS Hollywood, Radium, Illusion Arts, Mobility, Giant Killer Robots, Shadowcaster and Cinema Production Services. Overall Visual Effects Supervisor Ellen Somers oversaw the production. BOTTOM: Chris and Annie inside the painted world afterlife.
Brooks: Optical flow comes from machine vision. If you’re putting an autonomous robot on the surface of the moon or Mars, you give it a vision system, a minimum of two, possibly three, possibly five cameras that view the world from slightly different angles. In a way, it’s similar to the way that humans with a stereo pair of optics, i.e., eyes, see the world and are able to understand depth through convergence. So if you’ve got a robot and you’ve got these cameras, then you need to write the perceptual brain or the perceptual program that works out the differences between these views and creates an idea of depth. Basically, it tries to match each frame to the other frame and give you [a motion vector] ‘per pixel’. That’s called a vector field or an optical-flow vector field. In our world, where we’ve got one camera, you create a vector field that allows you to track pixels from frame one to frame two.
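The per-pixel matching Brooks describes can be illustrated with a toy brute-force block matcher. This is a hypothetical sketch in Python with NumPy, not the actual Cineon/Cinespeed machinery, which used far more sophisticated estimation; it only shows the idea of producing one (dy, dx) vector per pixel, i.e., a flow field.

```python
import numpy as np

def block_match_flow(f0, f1, patch=3, search=4):
    """For each pixel of frame f0, find the (dy, dx) shift into frame f1
    whose surrounding patch matches best. The result is one vector per
    pixel: an optical-flow vector field. (Flat regions are ambiguous and
    default to zero motion; real estimators handle this far better.)"""
    h, w = f0.shape
    r = patch // 2
    flow = np.zeros((h, w, 2))
    for y in range(r, h - r):
        for x in range(r, w - r):
            ref = f0[y - r:y + r + 1, x - r:x + r + 1]
            best_err, best_v = np.inf, (0.0, 0.0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    yy, xx = y + dy, x + dx
                    if r <= yy < h - r and r <= xx < w - r:
                        cand = f1[yy - r:yy + r + 1, xx - r:xx + r + 1]
                        err = np.sum((ref - cand) ** 2)
                        if err < best_err:
                            best_err, best_v = err, (dy, dx)
            flow[y, x] = best_v
    return flow

# A frame with one bright feature, then the same frame panned 2 px right.
f0 = np.zeros((16, 16))
f0[6:10, 4:8] = 1.0
f1 = np.roll(f0, 2, axis=1)

flow = block_match_flow(f0, f1)
print(flow[7, 5])   # prints [0. 2.]: the feature's pixels track the pan
```

With one camera, as Brooks notes, the same matching runs between consecutive frames in time rather than between stereo views.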
If you’re panning to the right, for instance, you will get a set of vectors that show the features of that world as values for each of those pixels in terms of x and y shifts. What we realized was that this meant we could generate a paint stroke for every pixel of the image, and we could transform that paint stroke to the next image and to the next image after that, and so on, because the camera is moving and we are basically generating a set of pixels. Then what we needed was a particle system that would take optical flow as transformation information. None of the existing tools had just the right kind of image processing, so we knew we had to build it ourselves. VFX Voice: How was this developed further for What Dreams May Come?
TOP: Vincent Ward directs Robin Williams. Note the orange ball tracking markers dotted among the flowers. BOTTOM: The original plate for a scene of Chris and Katie exploring the afterlife.
Brooks: The first thing we did was get the film studio to give us some money to test this, and we went out and shot some footage of a guy walking through a forest in South Carolina. We selected two shots from that, and we had hired a programmer, Pierre Jasmin. Pierre was one of the original programmers of Discreet Logic, and he went on to co-found RE:Vision Effects. He had written a particle system that would take an image and analyze the color per pixel. Then what we did was generate paint strokes. We physically painted a bunch of paint strokes that had white, blue and red paint mixed into them. So you can imagine these slightly Monet-like strokes of different shapes and sizes that you saw the three pigments in. We scanned them all in and we used the color of the pixels – the white, blue and red channels – to drive
the movement. For example, on the photography, let’s say it was green grass; it would look at that green pixel and go, okay, if the base color is green, it would do some variations based on that. In Pierre’s system we generated layers of particles with optical flow with these paint strokes. We had rules for orientation, depth, and all sorts of different variations. In essence, we would apply a traditional painter’s algorithm, i.e., the way you might paint from the background to the foreground in terms of how we would paint the sky and how we paint the horizon. And we essentially segmented the image into different alpha channels. So when you look at the image, it would be maybe 10 different maps or different depths. What Dreams May Come was basically greenlit on the back of this test. Everybody realized it was possible that we could actually film in amazing locations, and be able to transform it into a moving painting without it looking kind of like kitsch or CG or over-processed. Interestingly, we did this What Dreams May Come test – which was really successful – and right after that we did the test for bullet time using a similar technique, but without the particles. Bullet time was more about frame interpolation where we set up all these multiple cameras and interpolated across with these different camera views. VFX Voice: Once you were in production, what things were being done on set to help with the optical flow process later on? Brooks: We’d also been developing the use of Lidar for visual effects production. We were Lidar scanning the landscapes and
getting tracking information and spatial information for the environment that helped us generate depth maps and 3D data. This seems trivial today, but at the time we were absolutely using Lidar in a way that it hadn’t been used before. At that point it was still used mostly for engineering. We were scanning trees – all sorts of stuff – and learning how to segment all that point-value information, surface it, and abbreviate it so that we could use it in our paintings. With all this information we had, we were able to just keep tweaking the imagery, and we learned so much along the way. One of those little tricks we learned was, as we were moving paint along we could kind of accumulate it and sort of smear it. As the camera moved it would be self-smearing.
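The transport Brooks describes, a stroke seeded at every pixel, pushed along by the flow vectors, with paint allowed to accumulate into that “self-smearing” look, can be toy-modeled in a few lines. This is a hypothetical NumPy sketch, not Jasmin’s actual particle system: the uniform rightward flow stands in for real tracked vectors, and a count per pixel stands in for real stroke rendering.

```python
import numpy as np

# Seed one paint-stroke anchor per pixel of a small frame.
h, w = 8, 12
ys, xs = np.mgrid[0:h, 0:w]
anchors = np.stack([ys, xs], axis=-1).reshape(-1, 2).astype(float)

# Stand-in flow field: the whole frame pans 1 px right per frame.
flow = np.zeros((h, w, 2))
flow[..., 1] = 1.0

canvas = np.zeros((h, w))   # accumulated paint

def advect(anchors, flow):
    """Move each stroke anchor by the flow vector at its nearest pixel."""
    moved = anchors.copy()
    for i, (y, x) in enumerate(anchors):
        yi = int(np.clip(round(y), 0, flow.shape[0] - 1))
        xi = int(np.clip(round(x), 0, flow.shape[1] - 1))
        moved[i] += flow[yi, xi]
    return moved

for _ in range(3):                       # three frames of footage
    anchors = advect(anchors, flow)
    for y, x in anchors:                 # stamp strokes without clearing,
        yi, xi = int(round(y)), int(round(x))
        if 0 <= yi < h and 0 <= xi < w:
            canvas[yi, xi] += 1.0        # so paint accumulates and smears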
BOTTOM: The final shot was enabled with extensive development of tracking techniques, optical flow and a specialized particles tool to produce the painterly effects.
GAMES
WHO ARE THE ROCK STARS OF VIDEO GAME VFX? By DAVID “DJ” JOHNSON, CEO/Creative Director, Undertone FX
TOP: David “DJ” Johnson BOTTOM LEFT: Robert Gaines BOTTOM RIGHT: Sascha Herfort
In the film industry, when it comes to visual effects, there are names that we all know. We know exactly what they worked on and where they made their names. They are legends: Dennis Muren VES, ASC; Ray Harryhausen; Richard Edlund VES, ASC; Stan Winston; Paul Debevec; Ed Catmull VES – the list goes on. While your list might differ a bit, this is part of mine. So who are those people in the games industry? In video games, we are more than anonymous. We are the few in the trenches working on our passions. We don’t often show up in behind-the-scenes videos. We are rarely named in any of the gaming awards. But there are incredible talents out there that are doing what I and others consider to be some of the most cutting-edge visual effects in the game industry. I would like to call out some of these people. They are the legends that I look up to and aspire to emulate. A few of them I’ve worked with, and a few I’ve competed against at the VES Awards. But for every one of them, when I hear their name, an image is conjured of a moment in gaming where something astounding was created – something better than I had ever seen before. These are some of my legends of gaming VFX, and what stood out in my head as making them each rock stars.

Robert Gaines, Call of Duty 2
I remember everyone in the office I worked at huddled around a TV showing off this game and all of us admiring Robert’s work. It was a WWII level set in Russia, and in it you planted explosives, then from afar detonated a building and saw it come down. We were in awe. It was beautiful. The smoke plume left behind was gorgeous. It eventually led to me applying at Infinity Ward and working under Robert for a few years. I’ve learned more from Robert than from any other person I’ve worked for or with in games. His critique was always on point and he came up with ways to work that I’d never seen before – time looping iteration, first-person in-context FX placement via the console.
I’ll always consider Robert a mentor for what he taught me about creating video game effects.

Sascha Herfort, RYSE: Son of Rome
There were a number of ways in which RYSE shined above anything I’d seen before. The siren character caked in dried mud with warpaint cracking off was stunning. The facial animation pipeline they developed was amazing, sharing the same setup across gameplay and cinematics. But the sequence where the warship crashed into the shore with sails tearing, ropes flying everywhere – it was destruction on an eye-popping scale. They implemented an Alembic GeomCaching pipeline. When this was shown at the Game Developers Conference the following year, I scrambled along with several other studios to play catch up. A new bar had been set. Sascha has now moved over to the film industry where he works as a Creature TD at ILM.

Marijn Giesbertz, Killzone Shadow Fall
Killzone Shadow Fall also had a number of advancements that truly amazed me, such as its approach to forces I hadn’t seen
in a real-time engine before. Some of the cityscape fly-ins were absolutely gorgeous. The producers at Guerrilla Games were early adopters of PBR lighting. They were using motion vectors (functionally optical flow) to get higher-resolution effects textures early on. But the scene that really floored me came late in the game where gravity was going bonkers. All of these building bits were flying around the sky. It was the answer to “What if there were a massive destruction sequence, but instead of falling, everything just swirled around and you could fly through and run around inside of it?” I was truly astounded.

Alessandro Nardini, Call of Duty: Ghosts
Alessandro comes from the film industry and is back in it now, but we were graced with his talents for Call of Duty: Ghosts. After seeing the work [leading technical animator and director] Chris Evans did in RYSE, we had our sights set on seeing how far we could push destruction in a game engine, and Alessandro was just the man to pull it off. The opening level shows you running through the streets and houses of a town outside San Diego, where a ‘Rod of God’ has just struck nearby (a non-nuclear kinetic rod dropped from space that does as much damage as a nuke). As the streets are cracking and dropping out beneath your feet, you run through houses that are being torn in half and witness buildings collapsing into the chasm. Alessandro won a VES Award for his work on this sequence, and it will always stand out as a high point in my career to have worked with him on it.
TOP LEFT: Marijn Giesbertz TOP RIGHT: Alessandro Nardini (Photo: Andrea Arghinenti) BOTTOM: Matt Vainio
Matt Vainio, Infamous: Second Son
Second Son is what you get when phenomenal artistry intersects with powerful tools. It says something when, of all the games in this list, Infamous is the only one whose showcase of outstanding work revolves around core systems effects (the abilities of the player) rather than one-off set-piece moments. The player smoke, neon dash, and other gameplay effects are all brilliantly executed. Particles are emitted off of the entire player body with proper coloring, and transition to beautiful ribbons and ash with an insane amount of density and curl noise. When the team at Sucker Punch Productions showed off their tools at GDC and a VES event in Seattle, the audience was awestruck.
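The “curl noise” behind swirling particle effects like Second Son’s smoke has a simple core idea: differentiate a smooth scalar potential and rotate the gradient 90 degrees, which by construction gives a divergence-free velocity field, so advected particles curl and swirl instead of bunching up or draining away. The sketch below is a generic, hypothetical illustration of that construction (a sine potential stands in for a real noise texture), not Sucker Punch’s actual tools.

```python
import numpy as np

def potential(x, y):
    """Smooth scalar potential; real systems use a noise function here."""
    return np.sin(x) * np.cos(y)

def curl_velocity(x, y, eps=1e-4):
    """2D curl of the potential: v = (d(psi)/dy, -d(psi)/dx),
    estimated with central differences. Divergence is zero because
    the mixed partials cancel: d/dx(psi_y) - d/dy(psi_x) contributes
    psi_yx - psi_xy = 0."""
    dpdy = (potential(x, y + eps) - potential(x, y - eps)) / (2 * eps)
    dpdx = (potential(x + eps, y) - potential(x - eps, y)) / (2 * eps)
    return np.array([dpdy, -dpdx])

# Advect one particle through the field with small Euler steps;
# it traces a swirling path along a level curve of the potential.
p = np.array([0.5, 0.5])
for _ in range(200):
    p = p + 0.01 * curl_velocity(p[0], p[1])
```

In a game engine the same field would be sampled per particle per frame, typically from precomputed noise textures, to drive the kind of dense, curling ribbons described above.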
FALL 2018 VFXVOICE.COM • 89
8/23/18 1:11 PM
GAMES
WHO ARE THE ROCK STARS OF VIDEO GAME VFX? By DAVID “DJ” JOHNSON CEO/Creative Director, Undertone FX
TOP : David “DJ” Johnson BOTTOM LEFT: Robert Gaines BOTTOM RIGHT: Sascha Herfort
In the film industry, when it comes to Visual Effects, there are names that we all know. We know exactly what they worked on and where they made their names. They are legends: Dennis Muren VES, ASC; Ray Harryhausen; Richard Edlund VES, ASC; Stan Winston; Paul Debevec; Ed Catmull VES – the list goes on. While your list might differ a bit, this is part of mine. So who are those people in the games industry? In video games, we are more than anonymous. We are the few in the trenches working on our passions. We don’t often show up in behind-thescenes videos. We are rarely named in any of the gaming awards. But there are incredible talents out there that are doing what I and others consider to be some of the most cutting-edge visual effects in the game industry. I would like to call out some of these people. They are the legends that I look up to and aspire to emulate. A few of them I’ve worked with, and a few I’ve competed against at the VES Awards. But for every one of them, when I hear their name, an image is conjured of a moment in gaming where something astounding was created – something better than I had ever seen before. These are some of my legends of gaming VFX, and what stood out in my head as making them each rock stars. Robert Gaines, Call of Duty 2 I remember everyone in the office I worked at huddled around a TV showing off this game and all of us admiring Robert’s work. It was a WWII level set in Russia, and in it you planted explosives, then from afar detonated a building and saw it come down. We were in awe. It was beautiful. The smoke plume left behind was gorgeous. It eventually led to me applying at Infinity Ward and working under Robert for a few years. I’ve learned more from Robert than from any other person I’ve worked for or with in games. His critique was always on point and he came up with ways to work that I’d never seen before – time looping iteration, first-person in-context FX placement via the console. 
I’ll always consider Robert a mentor for what he taught me about creating video game effects. Sascha Herfort, RYSE: Son of Rome There were a number of ways in which RYSE shined above anything I’d seen before. The siren character caked in dried mud with warpaint cracking off was stunning. The facial animation pipeline they developed was amazing, sharing the same setup across gameplay and cinematics. But the sequence where the warship crashed into the shore with sails tearing, ropes flying everywhere – it was destruction on an eye-popping scale. They implemented an Alembic GeomCaching pipeline. When this was shown at the Game Developers Conference the following year, I scrambled along with several other studios to play catch up. A new bar had been set. Sascha has now moved over to the film industry where he works as a Creature TD at ILM. Marijn Giesbertz, Killzone Shadow Fall Killzone Shadow Fall also had a number of advancements that truly amazed me, such as its approach to forces I hadn’t seen
88 • VFXVOICE.COM FALL 2018
“For everyone [on my list], when I hear their name, an image is conjured of a moment in gaming where something astounding was created – something better than I had ever seen before.” —David “DJ” Johnson
in a real-time engine before. Some of the cityscape fly-ins were absolutely gorgeous. The producers at Guerrilla Games were early adopters of PBR lighting. They were using motion vectors (functionally optical flow) to get higher-resolution effects textures early on. But the scene that really floored me came late in the game where gravity was going bonkers. All of these building bits were flying around the sky. It was the answer to “What if there were a massive destruction sequence, but instead of falling, everything just swirled around and you could fly through and run around inside of it?” I was truly astounded.

Alessandro Nardini, Call of Duty: Ghosts

Alessandro comes from the film industry and is back in it now, but we were graced with his talents for Call of Duty: Ghosts. After seeing the work [leading technical animator and director] Chris Evans did in RYSE, we had our sights set on seeing how far we could push destruction in a game engine, and Alessandro was just the man to pull it off. The opening level shows you running through the streets and houses of a town outside San Diego, where a ‘Rod of God’ has just struck nearby (a non-nuclear kinetic rod dropped from space that does as much damage as a nuke). As the streets are cracking and dropping out beneath your feet, you run through houses that are being torn in half and witness buildings collapsing into the chasm. Alessandro won a VES Award for his work on this sequence, and it will always stand out as a high point in my career to have worked with him on it.
TOP LEFT: Marijn Giesbertz TOP RIGHT: Alessandro Nardini (Photo: Andrea Arghinenti) BOTTOM: Matt Vainio
Matt Vainio, Infamous: Second Son

Second Son is what you get when phenomenal artistry intersects with powerful tools. It says something when, of all the games in this list, Infamous is the only one whose showcase of outstanding work revolves around core systems effects (the abilities of the player) rather than one-off set-piece moments. The player smoke, neon dash, and other gameplay effects are all brilliantly executed. Particles are emitted off of the entire player body with proper coloring, and transition to beautiful ribbons and ash with an insane amount of density and curl noise. When the team at Sucker Punch Productions showed off their tools at GDC and a VES event in Seattle, the audience was awestruck.
Tobias Stromvall, Call of Duty: Infinite Warfare

The level in Infinite Warfare called “Dark Quarry” has you running around a ruined mining colony on an asteroid as it plummets into the sun. This level takes on a horror tone as the asteroid spins (it was thrown into a death spiral by an enemy nuke), and the day and night cycles (usually continuously powered by solar panels) cut out and reactivate in 90-second cycles. When you look or step outside, you can admire Tobias’s work. It is a hellscape. Boulders the size of houses crash all around you, and molten lava is tossed around in the air and splashes across the path you’re running on. Debris, hellfire and chaos erupt everywhere. And the all-too-large sun on the horizon, when it zooms past overhead, is beautiful with solar flares and a motion-vectored boiling core. Remarkable work by a magnificent artist.
“It’s also worth mentioning that many people worked on all of these shots. Games are a group effort, and often it isn’t always clear who to honor, as more than one person worked on segments.” —David “DJ” Johnson
TOP LEFT: Tobias Stromvall TOP RIGHT: Janne Pulkkinen (Photo: Katri Naukkarinen) BOTTOM: Kevin Huynh
Janne Pulkkinen, Quantum Break

When a new game is announced and its trailer is as impressive as Quantum Break’s was, there is often a healthy dose of skepticism. We collectively assume that it was pre-rendered and “inspiration” for what they want to achieve in real time. It’s quite rare to have your expectations shattered in the way Quantum Break did. The time stutter and broken time effects were just insane – in real time, with actual gameplay. Environments bend and warp dangerously. Cars pause above your head mid-flight from a crash, threatening to move again and crush you. Beautiful particle treatments convey the disintegration of matter. The way audio inputs are distributed spatially creates a timing and jitter that is perfect – all with a cadence to it that left time feeling like it was trying to kill you everywhere.

Kevin Huynh, God of War

This latest entry into the God of War series took a much different tone. It’s already getting buzz as one of the best and most beautiful games of the year, and then their ArtStation art dump happened, and it became clear that many talented artists showcased their best work on this game. One piece stood out to me as probably the most beautiful thing I’ve ever seen in a real-time engine: an ice tree. For over two minutes, this sequence plays out with particulates swirling in the air, ribbons of leaves flying around, culminating in a breathtaking burst of ice energy above you that continues to swirl and channel to a crescendo where a ray of frost unleashes on the far wall’s door. The mesh trunk of the tree begins to grow toward the rift. Standing ovation for Kevin.

Every person listed above has pushed the envelope in terms of what’s possible in a video game engine. There are other names out there probably more well known in the games VFX community that I didn’t list. It’s not that they don’t also deserve recognition – this list simply comes from my experiences. Naturally, your list will differ from mine.
It’s also worth mentioning that many people worked on all of these shots. Games are a group effort, and often it isn’t always clear who to honor, as more than one person worked on segments. Kudos to the other artists out there who played a role in creating these moments. You know who you are. So to those of you in the gaming industry, I ask you this: “Who are your legends in games VFX?”
[ VES SECTION SPOTLIGHT: AUSTRALIA ]
Expansion-Minded Section Mirrors Growing VFX Industry By NAOMI GOLDMAN
TOP: VES Australia members and guests out on the town for a Pub Night in Sydney. BELOW: Owners and co-founders of Rising Sun Pictures at the Thor: Ragnarok VES screening in Adelaide. BOTTOM: VES Australia members and guests enjoying a global screening.
Founded in 2008, the Australia Section is thriving thanks to its strong leadership and collaborative community. Australian VFX companies, including Rising Sun Pictures, Animal Logic, Method Studios, Luma and Fin Design + Effects, have built a reputation for creativity and quality and are finding increasing success on some of the biggest films in recent years. These companies at the forefront are all represented on the Australia Section Board and bring broad expertise and great enthusiasm for nurturing the community of visual effects artists and innovators. The Section boasts 110 members who span the country, with most of its membership concentrated in the coastal cities of Sydney, Melbourne and Adelaide. The makeup closely mirrors the industry: the majority of Section members work in feature film, and representation is growing in the commercial and gaming sectors. “The visual effects industry is buoyant and healthy across the country, and Australian VFX companies are making an impact,” says Alastair Stephen, Section Co-Chair and Executive Producer of VFX at Fin Design + Effects. “Australia has a solid infrastructure that is essential for working on high-end VFX projects. The market is both competitive and collegial as we have
many of the same people moving from company to company as projects complete and ramp up. The VES is proud to be a part of this tight-knit community of artists.” The Section hosts a full roster of film screenings and pub nights in the three major cities, which the Board highlights as great opportunities to build camaraderie and introduce partners and prospective members to our organization. “Moving forward, we are focused on building and diversifying our membership to include more professionals from commercials and gaming, and we anticipate expansion into virtual and augmented reality,” Stephen remarks. “We are also exploring the creation of educational programs around the craft of VFX, to offer additional benefits to our members. “We are always cognizant that we are representing not only our individual companies, but representing the strength of Australia on the world stage,” Stephen concludes. “Working with my Co-Chair, Ian Cope, our aim is to build a group where our peers can come together to share experiences and tap into one another’s expertise. Being a part of the VES is a privilege. Through the Society we get to be a part of something bigger and have a voice in advancing our global industry.”
“Being a part of the VES is a privilege. Through the Society we get to be a part of something bigger and have a voice in advancing our global industry.” —Alastair Stephen, VES Australia Section Co-Chair
[ VES NEWS ] By NAOMI GOLDMAN

VES SECTIONS CELEBRATE SUMMERTIME

Section Summer Parties were in full swing. These social events are great opportunities for members to celebrate and to introduce partners and prospective members to this vibrant organization and VFX community. Festive events, including parties, pub nights, BBQs and karaoke nights, were held in London, Los Angeles, New York and the San Francisco Bay Area. And kudos to VES Germany on hosting its first annual Summer Drinks and to VES France for hosting its first Summer BBQ.

VES INDUCTS 2018 HONOREES

The VES Board of Directors is pleased to announce the 2018 inductees into the VES Hall of Fame, the newest Lifetime and Honorary members and this year’s recipient of the VES Founders Award. VES Hall of Fame Honorees include:

L.B. Abbott (1908-1985). Lenwood Ballard Abbott, ASC, was an award-winning special effects expert, cinematographer and cameraman. He won four Academy Special Achievement Awards for Visual Effects for Doctor Dolittle, Tora! Tora! Tora!, The Poseidon Adventure and Logan’s Run. Abbott was head of the Special Effects Department at 20th Century Fox from 1957-1970.

Richard “Doc” Baily (1953-2006). Baily was a visual effects pioneer, digital animator and creator of the abstract image construction software Spore. Doc is best known for the breathtaking visuals he created to represent the sentient planet in Steven Soderbergh’s film Solaris, which were generated at extremely high resolution. His filmography also includes Blade, Fight Club and The Cell.

Saul Bass (1920-1996). Bass was a renowned graphic designer, VFX consultant and Academy Award-winning filmmaker. During his 40-year career, Bass worked for some of Hollywood’s most prominent filmmakers including Hitchcock, Preminger, Kubrick and Scorsese. He is best known for designing some of the most iconic film posters and title sequences in film history, including Vertigo, North by Northwest, Psycho, Spartacus, West Side Story and Goodfellas.
TOP: A legion of Star Wars characters pose with members and guests at the VES Bay Area Section Summer BBQ. MIDDLE: Members and guests celebrate at the VES France first annual BBQ Party. BOTTOM: Members and guests celebrate at the VES New York Summer Party.
Ray Harryhausen (1920-2013). Harryhausen was a pioneering multiple-award-winning visual effects creator, writer and producer who created a form of stop-motion model animation known as Dynamation. His most memorable highlights include: working with his mentor Willis H. O’Brien on Academy Award winner Mighty Joe Young; his first color film, The 7th Voyage of Sinbad; and Jason and the Argonauts, which featured a legendary sword fight with skeleton warriors. He received the Academy’s Gordon E. Sawyer Award for technological contributions that brought credit to the industry.
Derek Meddings (1931-1995). Meddings was a special and visual effects supervisor who worked in television and film, most notably for the James Bond and Superman film series. Meddings was awarded a shared Special Achievement Academy Award for special effects on Superman and shared the BAFTA Michael Balcon Award. He was also Oscar-nominated for Moonraker and BAFTA-nominated for Batman and GoldenEye.

Eileen Moran (1952-2012). Moran was a multiple VES Award-winning visual effects producer known for her groundbreaking CG commercial work at Digital Domain and her feature work as an executive producer at Weta Digital. Moran won her first VES Award for Outstanding Visual Effects for her work on King Kong. She led the Weta Digital effects team on VES Award-winner Avatar. She also received a VES Award nomination for The Adventures of Tintin: The Secret of the Unicorn.

Gene Roddenberry (1921-1991). Roddenberry was an award-winning writer and producer, best known for creating the Star Trek franchise. He was the first TV writer with a star on the Hollywood Walk of Fame and was inducted into the Science Fiction Hall of Fame and the Academy of Television Arts & Sciences Hall of Fame. Roddenberry and Star Trek have been cited as inspiration for other science fiction franchises, with George Lucas crediting the series for enabling Star Wars to be produced.

Family members of the Hall of Fame inductees will be recognized on their behalf at a special VES program.

Lifetime Membership Honoree: Jonathan Erland, VES, for meritorious service to the Society and the global industry. As Chairman of the Academy of Motion Picture Arts and Sciences Visual Effects Award Steering Committee, Jonathan Erland, VES was instrumental in establishing Visual Effects as a Branch of the Academy. He served 11 years on the Academy’s Board of Governors and 25 years on the Executive Committee of the Visual Effects Branch and the Scientific and Engineering Awards Committee.
Erland was a founder of the VES and is the recipient of the inaugural VES Founders Award, and was among the first to receive the VES Fellows distinction. He also received the Academy’s Scientific and Engineering Award and the Gordon E. Sawyer Award in recognition of his career of technological contributions that have
brought credit to the industry.

Honorary Membership Honoree: Jules Roman for her exemplary contributions to the entertainment industry and for furthering the interests and values of visual effects practitioners. Jules Roman is co-founder and CEO of Tippett Studio. In this role, she has continually strived to push the creative and technical edge of the visual effects industry, from early, high-profile stop-motion design to lauded animation work on such films as Solo: A Star Wars Story, Jurassic World, Harry Potter and the Deathly Hallows: Part 2, The Force Awakens, The Twilight Saga and Ted. Roman is recognized as a pragmatic leader in the field of animation and visual effects. She has been nimble in the face of runaway production overseas, maintaining her Berkeley-based studio by diversifying offerings for Themed Entertainment, TV Commercials, Mobile and VR content, and International productions.

Founders Award Honoree: Gene Kozicki for his sustained contributions to the art, science or business of visual effects and meritorious service to the Society. Gene Kozicki has served as a member of the Board of Directors and LA Section Board of Managers and as longtime chair of the Archives Committee. As VFX Historian, Kozicki is active in the archiving of information, imagery and artifacts from the visual effects industry. He regularly consults with the Academy of Motion Picture Arts and Sciences and the American Cinematheque on retrospectives and conservation. Kozicki’s career has spanned almost three decades. His ‘VFX boot camp’ commenced working with Robert and Dennis Skotak, who were tasked with blowing up Los Angeles for Terminator 2. He joined VIFX in 1994 and then worked with Rhythm & Hues for more than a decade. He has worked on various Star Trek series, and on films including Titanic, The Chronicles of Narnia and Saban’s Power Rangers.

Names of this year’s VES Fellows distinction were not announced in time for publication in this issue.
TOP, LEFT TO RIGHT: Gene Kozicki (Photo: Gentle Giant) Jules Roman (Photo: Michael Clemens) Jonathan Erland, VES
[ VFX CAREERS ]
Recruitment Counsel for the Professional

This occasional column explores a unique set of considerations for the global VFX professional. This issue’s column features a VFX Voice Q&A with Ila Abramson, owner/recruiter of New York-based I Spy (ispyrecruiting.com), which specializes in 3D animation, 2D animation, motion design, visual effects and production management.

VFX Voice: What should the working professional do to keep pace with leapfrogging changes in the industry?
Ila Abramson
Abramson: It’s important to always stay engaged with the community and what is happening in the world beyond your studio walls. Sometimes artists will be so inwardly focused on their production projects that they forget to come up for air. When they do pull their head out of the sand to look for new work, it can be a bit bewildering. I encourage people to always be revising and reassessing their skills and materials, whether they’re looking for a job or not. Keep your skills sharp, and your portfolio of work up to date and ready to share. To keep evolving as an artist, keep an open mind. That could mean exploring opportunities in related areas or a bit “outside of the box” – such as gaming, VR/AR, scientific animation or architectural visualization. And don’t just look things up online – network! Meet an old co-worker for drinks, go to a screening, a conference, or a professional meet-up. And pay it forward by mentoring someone – get involved with students through portfolio reviews or screenings, for example. Share what you know.

VFX Voice: How does staying current differ depending on the expertise of the
individual? VFX pro? 2D/3D animator? Motion design? Post production? Are there general guidelines?

Abramson: I wouldn’t “chase the tool.” That can be a bit of a rabbit hole. You can teach a tool, but you can’t teach talent. However, if you do notice a tool becoming more widely used or discussed, it may be worth exploring and familiarizing yourself with it.

VFX Voice: What are your tips on continuing education?

Abramson: Taking classes can be a great opportunity to stay engaged and network. Picking up an additional skill could help float you through down times, and some houses prefer artists with a generalist skill set, so the more you know, the better. This shouldn’t be limited to just classes on specific software – also traditional skills, e.g., sculpture, painting or photography.

VFX Voice: Are you noticing much more movement in terms of VFX recruitment these days?

Abramson: Obviously, the days of staying with a studio indefinitely are over. Production schedules and budgets are changing, and studios are increasingly recruiting more on a project basis.

VFX Voice: There has been an explosion of VFX-infused programming by companies like Netflix, Amazon, Hulu and cable channels. Is this creating more work for more people, or are the usual companies able to fill that gap? Big VFX movies seem to be increasing also. Animation is healthy (think Coco and Incredibles 2).
“Some studios or talent may be wary of recruiters, or ‘headhunters,’ but many of us are small agencies run by real people who want to make a good match.” —Ila Abramson
Abramson: Yes, more channels and programming definitely means there is more work – and more room for smaller shops as well. I think the trick is balancing talent, budget and schedule. Yes, larger studios will pick up a lot of this new work, but if small studios can keep a low overhead, they can benefit from these trends too.

VFX Voice: What are the pros and cons of working with a recruiter?

Abramson: For the artist, I don’t think there is any harm in applying to all the studios that interest you and also reaching out to recruiters. Many times we know of opportunities that the artist may not be aware of. Or a studio can simply be too slammed to handle all their recruiting and will ask a recruiter to assist. So why not cover your bases? Some studios or talent may be wary of recruiters, or “headhunters,” but many of us are small agencies run by real people who want to make a good match.

VFX Voice: What are some of the things a pro should ask when a recruiter calls? Is there a fee? Job description?

Abramson: You want to understand their process and how they operate. First, understand if you’re talking to a recruiter or an agent. An agent or rep is someone who goes out and ‘sells’ you as an artist, for a fee. As a recruiter, I’m hired by the studio, and the studio – not the artist – pays my fee. Some recruiting companies will also act as a “third party,” meaning that they, rather than the studio, will actually make payments to the artist for a given job. In the case of my company, I Spy, payment is directly between the studio and artist. When I do have an opportunity that may be of interest to an artist who has submitted to I Spy, I will contact the artist about it. Sometimes an artist will apply with me and I’ll have a job the next day. Other times, an artist may not hear from me for six months or a year. But the artists are always under active consideration.

VFX Voice: What are the top advantages to using a recruiter?

Abramson: You have someone shepherding you through the process and dealing directly with the people who make the hiring decisions – your résumé isn’t just sitting in a pile. Many times, a recruiter steps in when a studio is simply too slammed on a project, so you are already halfway in the door by the time a recruiter presents your work to the studio. You have someone proactively keeping you posted on opportunities you may not be aware of.

VFX Voice: What are the essential items a pro needs in their search tool kit? Résumé? DVD? YouTube site? Social media presence? Etc.?

Abramson: Yes, yes and yes. Reel, résumé, website, credit list, LinkedIn profile, cover letter – think 360. If I’m on your website, can I connect to you on LinkedIn? If I’m on LinkedIn, can I get to your work and understand your contribution? Some folks dismiss LinkedIn, but it is a great tool for connecting with new people, reconnecting with others, and finding that inside connection that’s so important for getting in the door. Remember, many times we are already looking at your work before you are aware of it. Make yourself easy to find, and make sure that what we can find is what you want us to see. Keep your professional life separate from your private life. It’s a good idea to update materials at least every six months. That’s far easier than trying to hunt down shots later. And if you wait too long, you might even forget projects you’ve done.

VFX Voice: What areas seem to be expanding now in terms of new work opportunities?

Abramson: I’m seeing opportunities in gaming, VR/AR, experiential, scientific animation and architectural visualization.

VFX Voice: Are you seeing more women coming into the business?

Abramson: Yes – and the more diverse talent pool we have, the better!
VFX Voice: Any final words of encouragement for job seekers?

Abramson: Remember, the job search can be a positive process. It is an opportunity for the artist to get to know the studios that are out there, what they are doing, and the skill sets they’re looking for. It’s a chance to reconnect with your network. Be proactive. Don’t just respond to a job posting; rather, make yourself the person they think of before they have to post the job.
CORRECTION A photo caption on page 81 of the Spring 2018 issue (Vol. 2, No. 2) misidentified Marty Rosenberg and Patrick McArdle as the two individuals in the photo. Pictured are Lincoln Hu and Doug Kaye. A photo caption on page 12 of the Summer 2018 issue (Vol. 2, No. 3) identified the avatar in the photo as a digital animation in an amnio tank. The avatar is an animatronic in a tank filled with water.
[ VFX CAREERS ]
Recruitment Counsel for the Professional This occasional column explores a unique set of considerations for the global VFX professional. This issue’s column features a VFX Voice Q&A with Ila Abramson, owner/recruiter of New York-based I Spy (ispyrecruiting. com), which specializes in 3D animation, 2D animation, motion design, visual effects and production management. VFX Voice: What should the working professional do to keep pace with leapfrogging changes in the industry?
Ila Abramson
Abramson: It’s important to always stay engaged with the community and with what is happening in the world beyond your studio walls. Sometimes artists are so inwardly focused on their production projects that they forget to come up for air, and when they finally do look for new work, it can be a bit bewildering. I encourage people to always be revising and reassessing their skills and materials, whether they’re looking for a job or not. Keep your skills sharp and your portfolio of work up to date and ready to share. To keep evolving as an artist, keep an open mind. That could mean exploring opportunities in related areas or a bit “outside the box” – such as gaming, VR/AR, scientific animation or architectural visualization. And don’t just look things up online – network! Meet an old co-worker for drinks; go to a screening, a conference or a professional meet-up. And pay it forward by mentoring someone – get involved with students through portfolio reviews or screenings, for example. Share what you know.

VFX Voice: How does staying current differ depending on the expertise of the individual? VFX pro? 2D/3D animator? Motion designer? Post-production? Are there general guidelines?

Abramson: I wouldn’t “chase the tool.” That can be a bit of a rabbit hole. You can teach a tool, but you can’t teach talent. However, if you do notice a tool becoming more widely used or discussed, it may be worth exploring and familiarizing yourself with it.

VFX Voice: What are your tips on continuing education?

Abramson: Taking classes can be a great way to stay engaged and network. Picking up an additional skill could help float you through down times, and some houses prefer artists with a generalist skill set, so the more you know, the better. This shouldn’t be limited to classes on specific software – consider traditional skills, too, e.g., sculpture, painting or photography.

VFX Voice: Are you noticing much more movement in terms of VFX recruitment these days?

Abramson: Obviously, the days of staying with a studio indefinitely are over. Production schedules and budgets are changing, and studios are increasingly recruiting on a project basis.

VFX Voice: There has been an explosion of VFX-infused programming from companies like Netflix, Amazon, Hulu and cable channels. Is this creating more work for more people, or are the usual companies able to fill that gap? Big VFX movies seem to be increasing also. Animation is healthy (think Coco and Incredibles 2).
“Some studios or talent may be wary of recruiters, or ‘headhunters,’ but many of us are small agencies run by real people who want to make a good match.” —Ila Abramson
Abramson: Yes, more channels and more programming definitely mean more work – and more room for smaller shops as well. I think the trick is balancing talent, budget and schedule. Larger studios will pick up a lot of this new work, but if small studios can keep a low overhead, they can benefit from these trends too.

VFX Voice: What are the pros and cons of working with a recruiter?

Abramson: For the artist, I don’t think there is any harm in applying to all the studios that interest you and also reaching out to recruiters. Many times we know of opportunities that the artist may not be aware of. Or a studio can simply be too slammed to handle all its recruiting and will ask a recruiter to assist. So why not cover your bases? Some studios or talent may be wary of recruiters, or “headhunters,” but many of us are small agencies run by real people who want to make a good match.

VFX Voice: What are some of the things a pro should ask when a recruiter calls? Is there a fee? A job description?

Abramson: You want to understand their process and how they operate. First, find out whether you’re talking to a recruiter or an agent. An agent or rep is someone who goes out and ‘sells’ the artist, for a fee. As a recruiter, I’m hired by the studio, and the studio – not the artist – pays my fee. Some recruiting companies will also act as a “third party,” meaning that they, rather than the studio, will actually pay the artist for a given job. In the case of my company, I Spy, payment is directly between the studio and the artist. When I do have an opportunity that may be of interest to an artist who has submitted to I Spy, I will contact the artist about it. Sometimes an artist will apply with me and I’ll have a job the next day. Other times, an artist may not
hear from me for six months or a year. But the artists are always under active consideration.

VFX Voice: What are the top advantages to using a recruiter?

Abramson: You have someone shepherding you through the process and dealing directly with the people who make the hiring decisions – your résumé isn’t just sitting in a pile. Many times a recruiter steps in when a studio is simply too slammed on a project, so you are already halfway in the door by the time a recruiter presents your work to the studio. And you have someone proactively keeping you posted on opportunities you may not be aware of.

VFX Voice: What are the essential items a pro needs in their search tool kit? Résumé? DVD? YouTube site? Social media presence?

Abramson: Yes, yes and yes. Reel, résumé, website, credit list, LinkedIn profile, cover letter – think 360. If I’m on your website, can I connect with you on LinkedIn? If I’m on LinkedIn, can I get to your work and understand your contribution? Some folks dismiss LinkedIn, but it is a great tool for connecting with new people, reconnecting with others, and finding that inside connection that’s so important for getting in the door. Remember, many times we are already looking at your work before you are aware of it. Make yourself easy to find, and make sure that what we can find is what you want us to see. Keep your professional life separate from your private life. It’s a good idea to update materials at least every six months. That’s far easier than trying to hunt down shots later, and if you wait too long, you might even forget projects you’ve done.

VFX Voice: What areas seem to be expanding now in terms of new work opportunities?

Abramson: I’m seeing opportunities in gaming, VR/AR, experiential, scientific animation and architectural visualization.

VFX Voice: Are you seeing more women coming into the business?

Abramson: Yes – and the more diverse the talent pool, the better!

VFX Voice: Any final words of encouragement for job seekers?

Abramson: Remember, the job search can be a positive process. It’s an opportunity for the artist to get to know the studios that are out there, what they are doing, and the skill sets they’re looking for. It’s a chance to reconnect with your network. Be proactive: don’t just respond to a job posting – make yourself the person they think of before they have to post the job.
CORRECTION: A photo caption on page 81 of the Spring 2018 issue (Vol. 2, No. 2) misidentified Marty Rosenberg and Patrick McArdle as the two individuals in the photo. Pictured are Lincoln Hu and Doug Kaye. A photo caption on page 12 of the Summer 2018 issue (Vol. 2, No. 3) identified the avatar in the photo as a digital animation in an amnio tank. The avatar is an animatronic in a tank filled with water.
[ FINAL FRAME ]
Disneyland – When the Future was 1986
Even Walt Disney, a noted futurist inspired by the works of Jules Verne, might be surprised at how Disneyland and other theme parks have evolved since the original Disneyland opened in Southern California in July 1955. In fact, when Disneyland opened, Tomorrowland, the park’s portal to the future, looked ahead only as far as the year 1986. Tomorrowland’s main attraction in 1955 was the TWA Moonliner (based on Disney’s 1950s Man in Space TV episodes). The Moonliner was the tallest structure in the park and featured a simulated trip to the moon known as Rocket to the Moon. There was also an attraction called Circarama U.S.A., which showed movies on nine screens; Space Station X-1, with a satellite view of America; and the Autopia autoway, which reflected America’s rapidly expanding interstate road network. Today, one of the most popular attractions at the Disneyland Resort is the newer Guardians of the Galaxy – Mission: Breakout!, which opened in 2017. That ride, like others around the world, uses VFX, AR and VR toolboxes to take patrons to heretofore impossible places. VFX is the new Tomorrowland.
Photo courtesy of Walt Disney