VFXVOICE.COM
WINTER 2018
JOE LETTERI: CREATUREMASTER THE OSCAR CONTENDERS • 40 YEARS OF STAR WARS VFX • REPORT: VFX SCREEN CREDITS THE JOURNEY OF COCO • 2018 VES AWARDS HONOREES: JOE LETTERI & JON FAVREAU
[ EXECUTIVE NOTE ]
Welcome to the January issue of VFX Voice! Over the last two decades, the Visual Effects Society has used its position as a convener to take on critical issues affecting its membership and the greater VFX community. As the entertainment industry’s only organization representing the full breadth of visual effects practitioners, we are committed to advancing the art, science and application of visual effects, improving the welfare of our members and the global VFX community, and celebrating VFX excellence – which drives box office and filmed entertainment like never before.

In this issue we take a look at the complex subject of Screen Credits in a VFX Voice Special Report. Many in the industry believe that VFX doesn’t have a voice, nor does it get sufficient respect for the artistry of its contribution, which touches nearly every film produced and, in some, provides a majority of the on-screen imagery. Others simply want to know why their credits come so late in the end crawl, long after the other creative credits, or why their names don’t appear at all. As a whole, the visual effects community believes it’s time for a change, and the first step towards achieving that is knowledge and understanding. Our Special Report examines the history of this practice and gleans insights from industry leaders on the challenges and potential solutions to the issue of standardized credits in this dynamic marketplace – and the path towards greater recognition.

From our perspective, it’s time to bring renewed attention to this issue of concern to many and move the conversation from private meetings with studio executives into the spotlight, spreading the dilemmas facing VFX facilities and artists to the wider Hollywood community. Read on and let us know your thoughts.
Moving forward, we at VFX Voice hope to bring you more articles that dive deeper into issues of significant importance to our community, while continuing to deliver eye-popping visuals, in-depth profiles and insightful stories that highlight the exceptional artists and innovators all across our global community. Heading into awards season, this issue profiles our 2018 VES Awards honorees – Georges Méliès Award recipient Joe Letteri and VES Lifetime Achievement Award recipient Jon Favreau. We also take a look at likely Academy Award VFX contenders, as well as 40 years of Star Wars VFX, and more. Thank you again for being a part of the inaugural year of this exciting new addition to our organization’s legacy.
Mike Chambers, Chair, VES Board of Directors
Eric Roth, VES Executive Director
2 • VFXVOICE.COM WINTER 2018
[ CONTENTS ]

FEATURES
10 FILM: THE VFX OSCAR Previewing hot prospects for this year’s awards.
22 SPECIAL REPORT: VFX SCREEN CREDITS Do today’s credits reflect VFX’s leading role?
30 COVER: JOE LETTERI Recipient of the 2018 VES Georges Méliès Award.
38 FILM: OATS STUDIOS Director Neill Blomkamp revisits his indie-film roots.
44 VES AWARDS PROFILE: JON FAVREAU Recipient of the 2018 VES Lifetime Achievement Award.
50 FILM: THE SHAPE OF WATER Legacy Effects, Mr. X neatly blend practical and digital.
54 TV: PREACHER Legend 3D enhances comic and horror in graphic show.
58 FILM: STAR WARS: THE LAST JEDI ILM returns to deliver an array of dazzling effects.
62 FILM: STAR WARS INNOVATIONS 40 years of VFX highlights that have shaped the industry.
66 PROFILE: CHRIS CORBOULD Inside Corbould’s ‘explosive’ career from Bond to Nolan.
74 INDUSTRY ROUNDTABLE: ANIMATION 2018 Industry leaders discuss creative and technical trends.
82 ANIMATION: THE JOURNEY OF COCO Pixar’s cultural tale was inspired by a Mexican holiday.
88 VR/AR/MR TRENDS: AUGMENTED REALITY New tech builds bridge to AR for consumers, developers.
94 IMMERSIVE MEDIA: VR180 New format isn’t true VR – but has mass-market appeal.
100 VFX VAULT: 1995’s JUMANJI How the original film’s CG creatures leaped to life.

DEPARTMENTS
2 EXECUTIVE NOTE
104 VES SECTION SPOTLIGHT: NEW YORK
108 VES NEWS
112 FINAL FRAME: DENNIS MUREN, VES

ON THE COVER: War for the Planet of the Apes (Photo TM and copyright © 2017 Twentieth Century Fox Film Corporation. All Rights Reserved.)
VFXVOICE
WINTER 2018 • VOL. 2, NO. 1
Visit us online at vfxvoice.com

PUBLISHER
Jim McCullaugh
publisher@vfxvoice.com

EDITOR
Ed Ochs
editor@vfxvoice.com

CREATIVE
Alpanian Design Group
alan@alpanian.com

ADVERTISING
advertising@vfxvoice.com

MEDIA
media@vfxvoice.com

CIRCULATION
circulation@vfxvoice.com

CONTRIBUTING WRITERS
Willie Clark, Ian Failes, Michael Goldman, Naomi Goldman, Trevor Hogg, Debra Kaufman, Chris McGowan, Ed Ochs, Paula Parisi, Barbara Robertson, Helene Seifer

PUBLICATION ADVISORY COMMITTEE
Rob Bredow, Mike Chambers, Neil Corbould, Debbie Denise, Paul Franklin, David Johnson, Jim Morris, VES, Dennis Muren, ASC, VES, Sam Nicholson, ASC, Eric Roth

VISUAL EFFECTS SOCIETY
Eric Roth, Executive Director

VES BOARD OF DIRECTORS

OFFICERS
Mike Chambers, Chair
Jeffrey A. Okun, VES, 1st Vice Chair
Kim Lavery, VES, 2nd Vice Chair
Rita Cahill, Secretary
Dennis Hoffman, Treasurer

DIRECTORS
Jeff Barnes, Andrew Bly, Brooke Breton, Kathryn Brillhart, Emma Clifton Perry, Bob Coleman, Dayne Cowan, Kim Davidson, Debbie Denise, Richard Edlund, VES, Pam Hogarth, Joel Hynek, Jeff Kleiser, Neil Lim-Sang, Brooke Lyndon-Stanford, Tim McGovern, Kevin Rafferty, Scott Ross, Barry Sandrew, Tim Sassoon, Dan Schrecker, David Tanaka, Bill Taylor, VES, Richard Winn Taylor II, VES, Susan Zwerman

ALTERNATES
Fon Davis, Charlie Iturriaga, Christian Kubsch, Andres Martinez, Daniel Rosen, Katie Stetson, Bill Villarreal

Tom Atkin, Founder
Allen Battino, VES Logo Design

Visual Effects Society
5805 Sepulveda Blvd., Suite 620
Sherman Oaks, CA 91411
Phone: (818) 981-7861
visualeffectssociety.com

VES STAFF
Nancy Ward, Program & Development Director
Chris McKittrick, Director of Operations
Jeff Casper, Manager of Media & Graphics
Ben Schneider, Membership Coordinator
Colleen Kelly, Office Manager
Jennifer Cabrera, Administrative Assistant
Alex Romero, Global Administrative Coordinator
P.J. Schumacher, Controller
Naomi Goldman, Public Relations
Follow us on social media

VFX Voice is published quarterly by the Visual Effects Society.
Subscriptions: U.S. $50; Canada/Mexico $60; all other foreign countries $70 a year. See vfxvoice.com
Advertising: Rate card upon request from publisher@vfxvoice.com or advertising@vfxvoice.com
Comments: Write us at comments@vfxvoice.com
Postmaster: Send address change to VES, 5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411.
Copyright © 2017 The Visual Effects Society. Printed in the U.S.A.
FILM
THE VFX OSCAR: MEET THE CONTENDERS By IAN FAILES
It’s near impossible to predict the VFX Oscar winner. In some years a clear front-runner emerges, such as The Jungle Book (2016), Gravity (2013) or Life of Pi (2012). In other years the winner has surprised many, as Hugo did in 2011 and Ex Machina in 2015. But the reality is that the visual effects work in every film that makes it through the nomination process is stunning.

2017 was an enormous year for visual effects, which makes the art of Oscar prediction tougher than ever. The year featured massive films whose core sequences would not have been possible without VFX artistry. There were also smaller movies that relied on more ‘invisible effects,’ where the practical and digital work was just as critical to the storytelling. With that in mind, VFX Voice looks at 20 of the possible contenders for the visual effects Oscar, out of a list that could easily have included many more.

As most visual effects practitioners know, the Academy of Motion Picture Arts & Sciences Visual Effects Branch creates a ‘short list’ of films that will vie for nominations for the Best Visual Effects Oscar. Once the short list is announced, each visual effects team creates a reel and presents it at the Academy in front of Visual Effects Branch members. This ‘VFX Bakeoff’ takes place in early to mid-January, and the final nominations are announced soon after.
ALIEN: COVENANT
BEAUTY AND THE BEAST
The Alien films have a rich history of on-set creature effects; with Alien: Covenant, director Ridley Scott used that to his advantage by having many of the creatures and gags built practically, even if the final intention was to replace them with CG. It meant actors had real things to react to – a noticeable component in the film’s many action scenes. Plus, it meant the VFX houses had a grounded reality to match with absolute photorealism.
Bill Condon’s Beauty and the Beast was one of the most anticipated films of the year, successfully navigating the jump from the original 1991 2D animated feature to live action. Interestingly, the visual effects heavily reference the original cartoon, especially the household items joyously brought to life by Framestore, and then take the Beast into the photoreal realm, thanks to an on-set performance by Dan Stevens and computer graphics by Digital Domain.

TOP: MPC, under the guidance of Visual Effects Supervisor Charley Henley, delivered several fully-CG creatures for the film, including the Xenomorph. (Photo TM & Copyright © 2017 Twentieth Century Fox Film Corporation. All Rights Reserved.)

BOTTOM: Digital Domain utilized its new facial-capture and re-targeting system, called Direct Drive, to take Dan Stevens’s original performance and transplant it onto their CG Beast. (Photo copyright © 2016 Walt Disney Pictures. All Rights Reserved.)
BLADE RUNNER 2049
DOWNSIZING
DUNKIRK
GHOST IN THE SHELL
Few films are referenced as commonly as Ridley Scott’s Blade Runner as a visual effects masterpiece. But Denis Villeneuve’s Blade Runner 2049 has somehow taken the essence of the practical miniatures and matte paintings in the original film and preserved that feeling, albeit with the ability to rely on the latest digital techniques. The nostalgia – and the incredible VFX work – will be a major factor in making this an Oscar contender.
Downsizing, from director Alexander Payne, is perhaps a left-field choice as a VFX Oscar contender. It mainly involves miniaturization work by Industrial Light & Magic – a studio that has pioneered this technique before in films such as Innerspace and The Indian in the Cupboard. Here, miniaturization is so completely intertwined in the story that it almost becomes unnoticeable, a significant measure of success.
Christopher Nolan films have a habit of winning (Interstellar, Inception) and being nominated (The Dark Knight) for the VFX Oscar. This is despite Nolan regularly looking to film things as practically as possible. Ultimately, the director’s films combine live action, large-scale effects, miniatures, CG and seamless compositing. Voters will be astounded by how well Double Negative combined these elements in Dunkirk.
This Rupert Sanders take on the original Japanese manga and anime has many elaborately crafted sequences, with Visual Effects Supervisors Guillaume Rocheron and John Dykstra entrusting several studios to fill out the world. As it has done on a number of films released in 2017, lead house MPC took digital makeup to new heights by augmenting live-action performances with CG, as well as finding innovative ways to deliver futuristic ‘solograms’ – moving 3D advertisements – and cityscapes.
TOP: Along with a crack team of VFX studios – Double Negative, MPC and Framestore – and Visual Effects Supervisor John Nelson, Blade Runner 2049 was also shot by lauded cinematographer Roger Deakins, ASC, BSC, adding a deep level of visual style. (Photo copyright © 2017 Columbia Pictures/Alcon Entertainment LLC. All Rights Reserved.)

BOTTOM: Kristen Wiig and Matt Damon star as a couple who look to shrink themselves in an effort to downsize their lives. Visual effects, overseen by James E. Price, helped sell key scenes requiring scale differences. (Photo copyright © 2017 Paramount Pictures. All Rights Reserved.)
TOP: Visual Effects Supervisor Andrew Jackson led the Double Negative team in realizing Nolan’s World War II drama, much of it shot on IMAX cameras. (Photo copyright © 2017 Warner Bros. Entertainment Inc., RatpacDune Entertainment LLC and Ratpac Entertainment LLC. All Rights Reserved.)

BOTTOM: Iconic hovering moments through the futuristic city were actually dubbed ‘Ghost Cam’ and required collaboration between design, cinematography and VFX. (Photo copyright © 2016 Paramount Pictures. All Rights Reserved.)
GUARDIANS OF THE GALAXY VOL. 2
JUSTICE LEAGUE
KONG: SKULL ISLAND
LOGAN
The depth of visual effects work in Guardians of the Galaxy Vol. 2 is one of its crowning achievements. Several houses worked on bringing numerous digital creatures to life (Rocket Raccoon, for example, was shared amongst four facilities). Other notables include highly detailed digital environments, an exploding fractal-filled planet and de-aging visual effects for Kurt Russell.
Superman. Batman. Wonder Woman. Aquaman. The Flash. Cyborg. Each of these characters in Justice League required their own specialized visual effects enhancement, and that doesn’t touch on the countless battle scenes in the film. More than most, Justice League clearly shows the key role that VFX now plays in comic book films, both in realizing superheroes and in being responsible for some of the coolest ‘audience favorite’ shots.
Watching Kong take on a multitude of other creatures, and even helicopters in one horrific early sequence, in Jordan Vogt-Roberts’s Kong: Skull Island, it’s clear that the VFX team, led by Supervisors Stephen Rosenbaum and Jeff White, reached the necessary spectacle heights for this film. They also managed to imbue a strong sense of emotion and character into the giant ape – and that’s what the visual effects will be remembered for.
James Mangold’s Logan has a grittier and more grounded place in the X-Men series of films than others before it, as its bloody effects attest. But audiences were surprised to learn that some Wolverine (Hugh Jackman) scenes made use of a completely photoreal digital double, a pleasing sleight of hand that could edge the film closer to a VFX Oscar nomination.
TOP: Industrial Light & Magic amplified its muscle simulation and hair tools to help enable a number of mind-blowing sequences in Kong: Skull Island. (Photo copyright © 2017 Warner Bros Entertainment. All Rights Reserved.)

BOTTOM: Image Engine produced the digital Hugh Jackman double, as well as a double for the character of Laura (Dafne Keen) in the film, under guidance from overall Visual Effects Supervisor Chas Jarrett. (Photo copyright © 2017 Twentieth Century Fox Film Corporation. All Rights Reserved.)
TOP: Baby Groot, as made by Framestore, during the opening dance sequence to the film. James Gunn even pantomimed the dance himself for animators to refer to. (Photo credit: Film Frame. Copyright © 2017 Marvel Studios. All Rights Reserved.)
BOTTOM: Visual Effects Supervisor John ‘DJ’ Des Jardin returned to the DC Universe and oversaw work from VFX studios including MPC, Weta Digital, Double Negative and Scanline VFX. (Photo copyright © 2017 Paramount Pictures. All Rights Reserved.)
OKJA
PIRATES OF THE CARIBBEAN: DEAD MEN TELL NO TALES
SPIDER-MAN: HOMECOMING
STAR WARS: THE LAST JEDI
Bong Joon-ho’s Okja is another left-field possibility for VFX Oscar contention. It did receive a cinema release, but its predominant run was via Netflix. Nevertheless, Visual Effects Supervisor Erik-Jan De Boer’s handling of the CG super-pig integration into live-action frames is superb, and there’s something about this quirky tale that makes you forget Okja is not real at all.
Two huge accomplishments in terms of visual effects were made on Pirates of the Caribbean: Dead Men Tell No Tales, co-directed by Joachim Rønning and Espen Sandberg, with VFX supervision by Gary Brozenich. One was digital character makeup, particularly for Javier Bardem’s Salazar. The other was a very deliberate methodology for digital water, which ranged all the way from flat calms to raging seas.
The biggest set pieces in Jon Watts’s Spider-Man: Homecoming – such as the ferry sinking scene, the Washington Monument rescue and the Avengers airport tarmac battle – all relied heavily on visual effects. What might be considered an even bigger achievement, however, is the consistent appearance of a CG Spider-Man in both large stunt sequences and even smaller scenes.
The recent Star Wars films have made the most of the latest advancements in practical creature effects and digital visual effects. Sometimes the practical side is pushed harder in pre-release, but no one can question Industrial Light & Magic’s commitment to generating a dazzling array of digital space imagery, space creatures and space battles.
TOP: Okja acts occasionally like a playful puppy, and Method’s animators looked to canine, hippo and other reference to get that right feeling in the creature’s movements. (Photo copyright © 2017 Netflix. All Rights Reserved.)
BOTTOM: Filmed on set with practical makeup, Javier Bardem was augmented by MPC to feature further scarring and an underwater look for his hair. (Photo credit: Film Frame. Copyright © 2017 Disney Enterprises Inc. All Rights Reserved.)
TOP: Visual Effects Supervisor Janek Sirrs oversaw many studios in bringing Spidey’s swinging actions to life, including this scene completed by Digital Domain. (Photo courtesy of Digital Domain. Copyright © 2017 Columbia Pictures/CTMG Inc. All Rights Reserved.)
BOTTOM: A winning formula in ILM’s creation of spaceships and battle scenes in Rian Johnson’s Star Wars: The Last Jedi is a nod to its roots in miniatures. (Photo credit: Film Frames. Copyright © 2017 Industrial Light & Magic/Lucasfilm Ltd. All Rights Reserved.)
THE FATE OF THE FURIOUS
THE SHAPE OF WATER
You might not think of The Fate of the Furious as an invisible effects movie, but in many ways that is how this F. Gary Gray film was made. Practical stunts formed the base of many action scenes, with Visual Effects Supervisors Kelvin McIlwain and Michael Grobe enabling a wealth of wire and rig removals and seamless CG additions.
Could Guillermo del Toro’s The Shape of Water take a leaf out of Ex Machina’s book and scoop the VFX Oscar? It has a central character that is both a practical and CG creation, and generally uses visual effects in very subtle ways. Plus, it delighted audiences, a big factor in voters recognizing the efforts of the filmmakers and VFX team.
THOR: RAGNAROK
VALERIAN AND THE CITY OF A THOUSAND PLANETS
With so many Marvel films being released, Taika Waititi’s Thor: Ragnarok offered something different: characters and a world you already knew (and some you didn’t) but with more of an improvisational and independent vibe. That went for the visual effects, too, which were more deliberately stylized than previous Thor outings, a point of difference that might see the film elevated to a nomination.
Director Luc Besson wholeheartedly adopted the newest digital techniques in making Valerian, from previs to motion capture, facial animation, effects and rendering. While not warmly received by audiences, the film has a high proportion of astonishing scenes, most notably anything involving the Pearl creatures and the ‘you have to see it to believe it’ Big Market sequence.
TOP: Thor (Chris Hemsworth) and Hulk (Mark Ruffalo) face off in one of several battles – CG humans, animals and creatures were a major part of the film. (Photo credit: Film Frame. Copyright © 2017 Marvel Studios. All Rights Reserved.)
BOTTOM: Weta Digital worked with overall Visual Effects Supervisor Scott Stokdyk to generate the completely digital Pearl characters, based on real female models used in 3D scanning and facial capture. (Photo courtesy of Weta Digital. Copyright © 2016 EuropaCorp Pictures. Valerian SAS-TFI Films Production. All Rights Reserved.)
TOP: Some scenes were largely digital, such as this crazy ice-breaking submarine moment delivered by Digital Domain. (Photo copyright © 2017 Universal Pictures. All Rights Reserved.) BOTTOM: Visual Effects Supervisor Dennis Berardi from Mr X Inc. capitalized on an on-set performance from Doug Jones wearing creature prosthetics by Legacy Effects to further enhance aspects of the character, such as eyes and gills, in CG. (Photo credit: Kerry Hayes. Copyright © 2017 Twentieth Century Fox Film Corporation. All Rights Reserved.)
VFX Oscar Roll Call
WAR FOR THE PLANET OF THE APES
Weta Digital already wowed filmgoers in the two previous Apes films by taking live-action motion-captured performances from the likes of Andy Serkis and turning them into living, breathing characters. It seems impossible, but the studio, led by Visual Effects Supervisors Joe Letteri and Dan Lemmon, somehow managed to up the ante this time around. War is surely considered a VFX Oscar front-runner.
WONDER WOMAN
Wonder Woman, from director Patty Jenkins, is on this list not because of any major technical achievements, but because it is another example of seamless, ‘invisible’ effects. The VFX help sell the wartime scenes and enhance stunts, especially through digital doubles. The most front-and-center work appears at the end, in the final Diana-versus-Ares battle, but by then audiences have bought into the created world.
TOP: The ape Caesar (played by Andy Serkis). Weta Digital made further improvements in the way its fur system handled dirt and snow for the film, along with enhancements to its proprietary Manuka renderer. (Photo copyright © 2017 Twentieth Century Fox Film Corporation. All Rights Reserved.)
BOTTOM: MPC, under the supervision of overall VFX Supervisor Bill Westenhofer, took plate photography of actress Gal Gadot and enhanced it to show her crashing through a window. (Photo courtesy of MPC. Copyright © 2016 Warner Bros. Entertainment. All Rights Reserved.)
The following is a list of films that the Academy of Motion Picture Arts & Sciences has recognized for Best Visual Effects since 1990. The Academy was cognizant of technical contributions to films as far back as its first awards show in 1928, when it handed out an award for “Best Engineering Effects” to the movie Wings, which was also the Best Picture winner that year. In 1933, the Academy recognized Willis O’Brien for his work on King Kong. The very first award, entitled “Special Achievement Award for Special Effects,” was presented in 1938 for Paramount’s Spawn of the North. The Academy struggled with terminology and recognition for many years and did not settle on the official Best Visual Effects moniker until 1977. In some years there was no award for special effects or technical achievements. In some years only three films were nominated instead of five. Now, however, Best Visual Effects is a major category and one of the evening’s most prestigious awards. Of special note is the fact that Dennis Muren, VES, has received the most awards in this category, with eight. Muren is also the most nominated in this category, with 15 nominations.
The Jungle Book, 2016
Ex Machina, 2015
Interstellar, 2014
Gravity, 2013
Life of Pi, 2012
Hugo, 2011
Inception, 2010
Avatar, 2009
The Curious Case of Benjamin Button, 2008
The Golden Compass, 2007
Pirates of the Caribbean: Dead Man’s Chest, 2006
King Kong, 2005
Spider-Man 2, 2004
The Lord of the Rings: The Return of the King, 2003
The Lord of the Rings: The Two Towers, 2002
The Lord of the Rings: The Fellowship of the Ring, 2001
Gladiator, 2000
The Matrix, 1999
What Dreams May Come, 1998
Titanic, 1997
Independence Day, 1996
Babe, 1995
Forrest Gump, 1994
Jurassic Park, 1993
Death Becomes Her, 1992
Terminator 2: Judgment Day, 1991
Total Recall, 1990
SPECIAL REPORT
VFX SCREEN CREDITS: THE QUEST FOR RECOGNITION AND RESPECT By DEBRA KAUFMAN
FROM TOP: The Wizard of Oz (Image copyright © 1939 Metro-Goldwyn-Mayer. All Rights Reserved.) Jason and the Argonauts (Image copyright © 1963, renewed 1991 Columbia Pictures Industries, Inc. Courtesy of Columbia Pictures. All Rights Reserved.)
“Artists want validation from the production, but the studio executives are stymied by rules and regulations. We have a good collaborative response from people in legal and production, but I think the rules surrounding credits in VFX are fairly immovable. We have tried for years and years.” —Matt Fox, Global Joint Managing Director, Film, Framestore
The 85th Academy Awards was a watershed moment for the visual effects industry. While 500 VFX artists protested outside the Dolby Theatre, inside, Rhythm & Hues Visual Effects Supervisor Bill Westenhofer received the VFX Oscar® for Ang Lee’s Life of Pi. What audiences at home (and in the theater) didn’t know was that Rhythm & Hues, founded in 1987, was shutting its doors even as Westenhofer accepted the award. He didn’t have the chance to share that news with the global audience, as he had run out of his allotted time and the microphone went dead. The moment was emblematic of what many in the industry believe: that VFX doesn’t have a voice, nor does it get sufficient respect for the artistry of its contribution, which touches nearly every film and, in some, provides a majority of the on-screen imagery. Many others in the VFX industry simply want to know why their credits come so late in the end crawl, long after the last of the other creative credits. To understand why VFX credits are where they are, context is crucial. “Credits have always been contested,” says UCLA Film and Television Archive director Jan-Christopher Horak, who wrote Saul Bass: Anatomy of Film Design. “It’s always a political skirmish and a skirmish about power.” Horak reports that, in the early days of silent films, credits consisted of a single title card that would name the film and the production company. In particular, producers were careful not to identify the actors, since they might have to pay them more money if they became popular. But, very quickly, the public identified individuals anyway and, says Horak, “clamored to find out who they were.” As producers feared, the actors began to demand more money. Visual effects were credited early on, says Jonathan Erland, VES, who points to director F.W.
Murnau’s 1927 Sunrise, which won inaugural Academy Awards for best actress, cinematography and “best unique and artistic picture,” the latter recognizing the movie’s exquisite compositing. But credits were still a haphazard affair. That began to change as actors formed the Screen Actors Guild in 1933. The story that people like to tell is that studio boss Irving Thalberg swore he would die before accepting the Guild; when he died in 1936, the studios signed a contract with the fledgling Guild. In fact, after the 1935 passage of the National Labor Relations Act, producers agreed to negotiate in 1937. By then, the directors and writers had banded together into their own guilds, all of them with collective bargaining rights under the auspices of the Act. Not so visual effects, which was in-camera work that, when it was credited, was usually listed near the cinematographer or camera department. The studios still called the shots, standardizing many aspects of filmmaking, but, as the Hollywood guilds exercised their collective muscle, opening credits became more elaborate and recognized more people. To be sure, some of the credits were political considerations, and many names in a department didn’t make it into the screen credits. “Even in the 1930s and 1940s, there were always feuds about credits within the guilds,” says Horak. “The big studios had no qualms about having 20 different writers work on a film, and then when it came to credits, there would be arbitration.” Credits were less standardized, however, when it came to in-camera visual effects, which were often – but not always – listed as special photographic effects. In 1933, King Kong recognized Willis H. O’Brien in the opening credits as “Chief Technician,” although today he would be credited as the film’s visual effects supervisor; Linwood Dunn, ASC’s contribution as optical photographer, however, went uncredited. The 1939 Wizard of Oz gave an opening “special effects” credit to Arnold Gillespie, and the 1939 Gone With the Wind credited Jack Cosgrove for “special photographic effects” in bigger typeface and ahead of cinematographer Ernest Haller, ASC’s credit. Although cinematographers had their own honorary society, the ASC, the camera guilds were local until the national Local 600 formed in 1996. The Paramount Consent Decree of 1948, also known as the Hollywood Antitrust Case, changed the movie industry in a profound way that affected the importance of credits. The Consent Decree was aimed at breaking up the studios’ vertical monopoly over production and distribution of content and the theaters where it was exhibited. Through some twists and turns, the Supreme Court finally decided for the Consent Decree in 1948, essentially breaking the studio system that had existed for decades. The result was the birth of independent features – and independent actors who now craved prominent credits sometimes more than money. Credit on the screen kept them in the public eye and increased their value.
Among visual effects, however, credits for special effects photography continued to be unreliable, sometimes at the movie’s opening and other times completely missing. For example, Arnold Gillespie was again credited for special photographic effects for his work on the 1959 Ben Hur, and Ray Harryhausen got quite prominent credit on the 1963 Jason and the Argonauts, as Associate Producer and Creator of Special Effects. But not everyone got the credit they deserved. On the 1952 High Noon, for example, Willis Cook was not credited for special effects; he likewise did not get a credit as special effects supervisor on the 1957 A Farewell to Arms, the 1960 All the Young Men, or even the 1970 The Molly Maguires. There were still numerous movies that barely credited anyone who worked on them. Although a movie might only list 40 people or so, this was the era in which we first hear loud complaints about “excessive” credits and credits that were too complicated to properly enact. In 1947, the Los Angeles Times reported on all the lawsuits provoked by credits listed on marquees, posters and all printed advertisements. “Most of these ticklish provisions are set forth
exhaustively in studio contracts,” it said. “But the burden of carrying them out falls heavily on the sagging shoulders of the poor exhibitors.” A 1962 Variety article lamented “the downgrading of values of recent years” with regard to screen credits that “literally list the entire cast.” “Featured” billing, the publication argued, had lost its significance. Screen actors publicly warred in the 1940s and 1950s over top billing, sometimes resolved in Solomonic fashion. In 1947, Irene Dunne and William Powell tussled over top billing in Warner Bros.’ Life with Father. The solution? Two first reels were made, one giving Powell’s name top billing, the second with Dunne’s on top, and exhibitors alternated the two day by day. The solution was identical for Rita Hayworth and Deborah Kerr’s struggle over credits in the 1958 Separate Tables. Agents for Marilyn Monroe and Betty Grable warred over which star would get top billing in the 1953 How to Marry a Millionaire. The solution there was to give Monroe first billing in the ad art and Grable top screen billing on all the prints. Producers, meanwhile, saw credits as “an unbelievable nuisance,” says Horak. “They can also be an expense,” he says. As the number of credits grew, so did the gradual move to putting them all at the end. As early as 1941, Citizen Kane drew attention by putting all the credits at the end, but Horak says that the 1956 Around the World in 80 Days was a turning point for placing credits at the end. Perhaps the industry’s biggest feud over credits was that for “possessory credit,” or who had the right to an opening credit of “a film by” or “a screenplay by.” D.W. Griffith got the first such credit in 1915 on Birth of a Nation, and Alfred Hitchcock, David Lean and many others followed his path.
The debate began in 1966 when the WGA and AMPTP negotiated a deal that limited the ability of the director to get possessory credit unless he was a writer on the movie or the author of the original source material. When the DGA learned of the deal in 1968, things quickly devolved, with the directors threatening to “withhold services” after April 30, 1968. Dozens of articles followed the serpentine struggle between the two powerful guilds that wasn’t resolved, according to the DGA website, until 2004, nearly 40 years after it began. During this era, however, a perusal of the pages of major Los Angeles newspapers and trade publications turns up no stories on the struggles of visual effects artists for recognition of their contributions. In fact, visual effects-heavy movies were in the trades for other reasons, such as George Lucas famously leaving the DGA in 1981 when the guild attempted to fine Lucasfilm $250,000 for putting director Irvin Kershner’s card at the end of The Empire Strikes Back, in defiance of the DGA’s regulation that the director’s credit must be in the opening credits. Lucas and Kershner had applied for a waiver “explaining that creatively the credits only worked at the end,” but the DGA wouldn’t budge. Lucas paid Kershner’s fine. Other VFX-heavy films also struggled over credits that had nothing to do with the visual effects. The tussle over Mars Attacks! was about WGA sole screen credit to original writer
FROM TOP: King Kong (Image copyright © 1933 RKO. All Rights Reserved.) Ben Hur (Image copyright © 1959 Metro-Goldwyn-Mayer. All Rights Reserved.) Gone With the Wind (Image copyright © 1939 Metro-Goldwyn-Mayer. All Rights Reserved.)
“VFX now plays a role throughout the filmmaking process. The natural progression would see visual effects moving up in the order of credits, which is often not the case.” —Gill Howe, Executive Producer, Rising Sun Pictures
Jonathan Gems (which left out the script doctors); when New Line submitted four producer names for The Lord of the Rings: The Fellowship of the Ring, the ruling was that they were limited to three (Tim Sanders got cut). In 2003, writer Tad Friend wrote an in-depth piece in The New Yorker about Hollywood screen credits that focused exclusively on writers. While writers, directors and actors fought their wars over credits, the visual effects industry was undergoing its massive shift from analog to digital, with the founding of such companies as Industrial Light & Magic, Boss Films, Robert Abel & Associates and Digital Productions. Increasingly, the visual effects in movies evolved from special camera effects – something that people could understand – to CGI, a puzzling acronym for most of the public. The top-grossing films of the 1990s were all VFX-centric movies, from Titanic to Star Wars: Episode I – The Phantom Menace, Jurassic Park, Independence Day, Men in Black and Toy Story 2, but the conversation about VFX was relegated to “how they did it” stories, with no discussion of how to properly credit the work being done. In fact, many films continued to give VFX credit for “special photographic effects.” Douglas Trumbull, VES’s credit on the 1982 Blade Runner was Special Photographic Effects Supervisor, the same credit he received on the 1968 2001: A Space Odyssey. These were halcyon days for visual effects artists, who reveled in the work of pioneering the use of computers to accurately represent everything from fur to faces. The Visual Effects Society was formed in 1997 as a “non-profit, professional honorary society” to recognize, advance and honor “visual effects as an art form, promoting the interests of its Membership.” Concern over credits was not on the minds of most visual effects artists in those early days, buoyed as they were by the challenges of pushing technology forward and creating stunning images.
As the movies got bigger and took more advantage of digital visual effects, the list of VFX credits grew, and lots of people noticed. On Nov. 8, 1982, The New York Times’ Aljean Harmetz reported that “some Hollywood movies today have technical credits six or eight times as large as their casts.” She pointed out Star Trek II, with 20 actors who got their names on screen and 127 behind-the-scenes people; Raiders of the Lost Ark, which credited 232 “technicians … with computer engineering and electronic systems design”; and Tron, which had 44 actors and “nearly 300 people whose faces do not appear on the screen.” She did not mean this in a good way. Although everyone from assistant cooks to secretaries now gets credit, wrote Harmetz, “the chief reason for the length of credits are the requirements of the sophisticated special effects movies that have poured out since Star Wars … all recognition of the computer-generated imagery that will increasingly be a part of moviemaking.” The visual effects community didn’t answer the charge of excessive credits – and was not covered in the press as seeking more significant placement of its credits. A keyword search of “credits” at the Academy of Motion Picture Arts and Sciences’ Margaret Herrick Library produces not a single article on VFX credits during this era.
SPECIAL REPORT
FROM TOP: A small portion of visual effects screen credits in today’s VFX-heavy films. [The Hunger Games: Mockingjay - Part 2 (2015)] (Image copyright © 2015 Lionsgate. All Rights Reserved.) Visual effects artists hard at work behind the scenes. (Photo copyright © Bandito VFX)
“The horse is out of the barn. We have over-population of the business with super talent. I try in my way to right the ship, but we’re fighting with studios that have 100 years of experience in negotiating deals.” —Richard Edlund, VES and Founder/President of former Boss Films Studios
The visual effects community believes it’s time for a change. “As technology and the art of visual effects have evolved, VFX now plays a role throughout the filmmaking process,” says Rising Sun Pictures Executive Producer Gill Howe. “The natural progression would see visual effects moving up in the order of credits, which is often not the case.” Howe points out that some VFX-heavy films spend half their budgets on visual effects, which also isn’t reflected in the credits. The awarding of VFX credits is currently an opaque process. On a film-by-film basis, each VFX facility head negotiates for credits with the studio executive in charge of VFX, but regardless of the compromise they come to, each studio’s credits executive has to sign off – and at times simply scotches whatever deal was struck, sometimes with a Catch-22 explanation. For example, former VES Chair Jeffrey A. Okun, VES, recalls that one now-defunct VFX facility made a deal with the studio’s VFX executive to halve its rate for shots that had to be created and delivered over a weekend, in exchange for a single-card company credit in the end crawl. When the movie opened, however, the VFX company was shocked to find it didn’t get that promised credit; the studio’s vice president of credit and title administration explained that the dollar amount was less than the threshold required to get a credit. For Framestore Global Joint Managing Director, Film, Matt Fox, credits are crucial for the crew, which wants recognition as well as a salary. “Artists want validation from the production,” he says, “but the studio executives are stymied by rules and regulations.” In practice, he has to map out everyone working on the project and eventually determine who gets credit and who doesn’t. “We have a good collaborative response from people in legal and production, but I think the rules surrounding credits in VFX are fairly immovable,” says Fox.
“We have tried for years and years.” Some more highly placed VFX professionals have tried to effect change on their own. Richard Edlund, VES, ASC, reports that 20 years ago he went to the DGA to try to get creative credits for VFX supervisors, “thereby taking VFX supervisors into the DGA as such, maybe even as VFX directors.” Edlund says that initially the DGA seemed open to the idea, with the suggestion that VFX supervisors could be considered 2nd unit directors, but the pitch languished. “I talked to all the visual effects supervisors and they were all excited about the concept,” says Edlund. “But the studios wouldn’t like it because they don’t want to pay what a 2nd unit director makes, and there is no union or guild on the side of VFX supervisors.” Visual effects supervisors can occasionally get screen credit in the title block with the cinematographer and production designer: John Knoll; John Dykstra, ASC; and Dennis Muren, VES, ASC, among a handful of others, have done so. But each and every picture requires the DGA to issue a waiver, and waivers are not given out freely. Edlund recalls litigating against the DGA to get that credit on Fright Night, after he had successfully negotiated waivers for Ghostbusters and 2010. The sticking point was that the DGA considered VFX to be a technical, not creative, credit, says Edlund, who also had constant credit issues as head of Boss Films. “I spent tens of thousands of dollars negotiating credit issues,” he
LEFT TO RIGHT: Gill Howe (Photo: Jennie Bloom) Jeffrey A. Okun, VES Matt Fox Richard Edlund, VES, ASC John Knoll Jan-Christopher Horak (Photo: Todd Cheney)
says. “Every movie, I had to go and fight for 120 credits. Every contract would end up costing me $50,000 to negotiate.” Because visual effects are a complex process that often goes on for many months at multiple facilities, the total number of VFX credits is a moving target. “VFX teams tend to grow as the movie goes through the post process,” says Fox. At the same time, says ILM Chief Creative Officer/Senior Visual Effects Supervisor John Knoll, the studios are pushing for fewer names, “at least relative to the sizes of the crews. The restrictions get tighter as the crews get bigger,” he says. “I oppose that. I don’t think it’s fair or equitable to do to our department, especially considering the contribution we make.” Listing VFX credits at the end, he adds, is “an anachronism” from the earlier days when visual effects consisted of dissolves and wipes, “with the occasional use of miniatures and matte paintings.” Knoll also notes that the cycle time for a feature film at ILM is a year and a half, with 300 to 400 people working on a movie for that time. “We outnumber and outlast the live-action crews by a pretty substantial margin,” he says. Fox notes that, even when VFX does get the go-ahead to list all or most of the artists, the names are listed in small type in several columns that roll so quickly that it’s impossible to read individual names. Fox believes this frustration might be alleviated if the studios were to support an IMDb-type official list. “Artists want to know that, for all their efforts, they can point to the fact they got a de facto credit,” he says. What can be done? Most people in the VFX industry would like to have collective bargaining power, with or without the structure of a union, but forming one now would face several challenges: because the VFX industry is now
international, most think it is impossible to standardize rates and working conditions globally. Others say that a union would also drive up the costs of creating VFX, anathema to the studios and producers. Some experts, like Edlund, think “the horse is out of the barn. We have over-population of the business with super talent,” he says. “I try in my way to right the ship, but we’re fighting with studios that have 100 years of experience in negotiating deals.” Jan-Christopher Horak, further from the battle, draws a softer conclusion: that we are still in the transition from analog to digital. “Hollywood and its modus operandi is extremely conservative and only changes very slowly, as we know,” he says. “The actual fact of moviemaking is far ahead of what’s in the heads of those who have the say. It may just take a little bit longer.” If there is a real desire to influence the practices that dictate credits for those in the craft, there is ample Hollywood history to draw from, as virtually every job category has had to fight for recognition and rights. In addition to lawyers, funding and persistence, successful efforts made their case as visible as possible to help build industry support. So far, the VFX industry has fought its battles with studio executives as companies and individuals, in private. In bringing renewed attention to this complex and critical issue, a next step towards workable solutions might be to continue the conversation while spreading the dilemmas facing VFX facilities to the wider Hollywood community. Problems can’t be resolved if they’re only talked about in hushed tones, out of the spotlight.
COVER
MASTER CREATURE CREATOR JOE LETTERI: 2018 VES GEORGES MÉLIÈS AWARD WINNER By NAOMI GOLDMAN
Gollum. King Kong. T-Rex. The Na’vi. Caesar. Iconic characters brought to life thanks to a profound imagination and a passion for math – both possessed by a visionary artist adept at using technology to create unforgettable worlds and CG characters that speak volumes about our humanity. That pioneering visual effects master is Joe Letteri, VES – a boy from Pennsylvania entranced with natural phenomena and a curiosity for the universe big enough to share. Coming off of last summer’s stunning War for the Planet of the Apes, Letteri’s prolific skill and pioneering techniques have been on display for more than 25 years. His groundbreaking work has garnered Letteri four Academy Awards (The Lord of the Rings: The Two Towers, The Lord of the Rings: The Return of the King, King Kong and Avatar), four BAFTA Awards and six Visual Effects Society Awards. His filmography also includes The Abyss, Jurassic Park, Star Wars: Special Edition, I, Robot, X-Men: The Last Stand and The Hobbit and Planet of the Apes trilogies – and he’s now focused on the highly anticipated Avatar sequels. He is currently Director of Weta Digital, having joined the company in 2001 after a 10-year tenure at ILM. Letteri is a recipient of the New Zealand Order of Merit, the Queen’s honor, bestowed for rendering meritorious service to the Crown and nation. In October, Letteri was given the distinction of VES Fellow, and in February he will receive the Georges Méliès Award at the 16th Annual VES Awards, in recognition of his significant and lasting contributions to the art and science of the visual effects industry.
ORIGIN STORY TOP LEFT: Joe Letteri TOP RIGHT: Visual Effects Supervisor Dan Lemmon, Andy Serkis and Joe Letteri at the New York premiere of War for the Planet of the Apes.
Growing up in Aliquippa, Penn. in the 1960s, Letteri was enthralled early on with the vein of spectacle filmmaking popular in the era. “Dr. No, Goldfinger, midnight scary movies, black and white alien invasion films … they were always fun and got me thinking, how did they do that? I always enjoyed them with a keen
sense of curiosity,” says Letteri. His trajectory towards visual effects was fueled by a love of math and science amidst the burgeoning computer age. “I was always interested in optics, physics and astronomy. I started thinking about how you see things. The flip side is optical illusions – how do those work and how does your mind believe those things? I was trying to understand how tricking the mind speaks to how you understand the world around you. “I was in love with graphing things. Randall Brodsky’s work on fractals, using math to describe complicated visual images that almost looked organic, really got me thinking about the possibilities of visualizing things with equations and making pictures out of data. “At the same time, astonishing films like 2001: A Space Odyssey and Star Wars were taking you to outer space, which is where I wanted to be anyway. The advent of computer graphics created this tool where you could know something about optics, photography, motion and how to construct things in 3D, but unlike the traditional arts where you built a model and moved it and photographed it, here you were manipulating the pixels directly. It opened the door to a whole different way of thinking about the art in that anything you could describe you could capture pixel by pixel. I don’t draw, I don’t paint, I don’t animate in the traditional sense, but using a computer I can figure out how to create any image in the world – or a world yet unknown.”
THE WONDER YEARS Letteri’s entrée into the field was spurred by an encounter at SIGGRAPH in 1988 with the owner of Orange County, California-based Yale Video, who offered him the chance to work on the company’s computers after business hours. This began his midnight-to-daybreak nightly cycle of teaching himself how to use and program the machines on the Cubicom System. Letteri got his professional start in Los Angeles doing commercials and news graphics at Metrolight Studios. His first film job, which was also his first film at ILM, was the opening shot in Star Trek VI: The Undiscovered Country, where the Klingon moon Praxis explodes into a big ring of fire. “They had tried to figure out how to do it as a practical effect, but couldn’t get the energy to make it big enough. Because I had been working on the side with fractals, creating clouds and other natural phenomena, I went to work and created that first shot of the film.” Next up was Jurassic Park, which Letteri describes as the big defining project. “I became really interested in lighting and shaders and how we could create truly realistic-looking dinosaurs. Discovering Pixar’s The RenderMan Companion and the idea that you could actually code shaders and get them to do what you want them to do became fascinating to me. So I wrote a library of shaders. How the skin was going to look, the reflectance, the lights. Everything had to be specialized. “Getting to work on Jurassic Park, on big scary creatures – I mean as a kid, who didn’t draw dinosaurs? And the ability to actually do that and put it on the screen photographically was fantastic. Seeing the audience reaction – because the dinosaurs were the
“I was trying to understand how tricking the mind speaks to how you understand the world around you.” —Joe Letteri
TOP: War for the Planet of the Apes. (Photo TM and © 2017 Twentieth Century Fox Film Corporation. All rights reserved. ) ABOVE: James Cameron, Peter Jackson, Steven Spielberg and Joe Letteri on The Adventures of Tintin mo-cap stage in Playa Vista.
stars of the film they were coming to see – opened up lots of potential in figuring out how to create complex characters.” Letteri brings that attention to lifelike detail to every project he helms, from creating a new technique for realistic fur on Kitty the saber-toothed cat in The Flintstones to conducting lighting studies based on old Hollywood glamour shots to get the right lighting for Casper the Friendly Ghost’s eyes.
TOP: Avatar (Photo copyright © 2009 Twentieth Century Fox Film Corporation. All rights reserved.) BOTTOM: The Hobbit (Photo TM & copyright © 2012 Warner Bros. Entertainment Inc. All rights reserved.)
CHARACTER COUNTS
TOP: Visual Effects Supervisor Erik Winquist, Joe Letteri, Jason Clarke, Visual Effects Supervisors Dan Lemmon and Daniel Barrett at the 13th Annual VES Awards. BOTTOM: War for the Planet of the Apes Production Designer James Chinlund, Joe Letteri and director Matt Reeves.
As to his most beloved character, Letteri speaks like a parent not wanting to play favorites, but cites his seminal work on The Lord of the Rings. “Gollum was a breakthrough. He was the first character that had to play alongside live-action characters in a way that you couldn’t tell he wasn’t a real person. He was a CG character, but he was treated with every consideration that any real character would be. There was an inner conflict and there was an arc. Nothing about him was thrown away. And we set about to create this villain that we wanted you to like. Because if you have a villain that you don’t like it has to be epic, like Darth Vader, but that’s not a character that lives in your midst with emotional complexity.
“Not only did we come up with the idea of using performance capture and working with Andy Serkis, but we also created a technique called subsurface scattering that allowed us to create realistic skin for the first time. The combination of those elements gave us Gollum, who was believably alive and held up to real actors. Creating digital characters with that emotional range and subtlety has been the basis for all of the characters we’ve created since then.
“Getting to work on Jurassic Park, on big scary creatures – I mean as a kid, who didn’t draw dinosaurs? Seeing the audience reaction – because the dinosaurs were the stars of the film they were coming to see – opened up lots of potential in figuring out how to create complex characters.” —Joe Letteri
TOP: Dawn of the Planet of the Apes (Photo TM and copyright © 2014 Twentieth Century Fox Film Corporation. All Rights Reserved.) BOTTOM: Joe Letteri and James Cameron on the set of Avatar.
“I remember the first time we saw Gollum in shots and started to get a sense that this is going to work. Or Caesar in Rise of the Planet of the Apes, the first shot we did of him in his jail cell where he’s watching with his eyes and we knew we had captured that humanity. In every film those character moments are the standout.” On the endings of these characters, Letteri acknowledges: “Saying goodbye to Caesar is emotional because of the investment in him. But it’s a satisfying end to Caesar and his journey as a leader. And Gollum, he was consumed by a ring of molten lava, so he certainly got a dramatic cinematic ending.”
ON INSPIRATION “I’m inspired by the opportunity to create characters that allow you to tell stories in new ways. In the Apes story, telling it through the eyes of the apes gave us all the opportunity to walk back from our biases and see it told from their perspective. Taking the audience inside how they perceive love and loyalty, war and fear lets us look at the problems of society from a different vantage
“I don’t draw, I don’t paint, I don’t animate in the traditional sense, but using a computer I can figure out how to create any image in the world – or a world yet unknown.” —Joe Letteri
point. To bring that to the screen, you have to put a lot of effort into mastering the cinematic techniques of storytelling, while also needing to understand what makes a character come alive.”
ROOTS AND WINGS
Letteri points to mentors who influenced him at critical junctures. “At the foundation of my career, Tim McGovern and Richard ‘Dr.’ Baily, both at Metrolight, were great not only in teaching me practical skills, but also in letting me run with things. And when I got to ILM, Dennis Muren, VES was fantastic, because he had such a sense of history and a keen eye and was always referring back to ‘how does this thing happen in the real world’ and keeping our work grounded in a physical basis. I’ve also been fortunate to work with visionary directors including Peter Jackson, James Cameron and Steven Spielberg, who were open and collaborative and willing to take on new ideas – no matter how crazy they sounded.” To aspiring VFX artists, Letteri offers two pieces of advice. First, “You have to find something in the field you like well enough to become an expert in. Really know something and do something as deeply as possible. And you must appreciate the broad scope of filmmaking. Know how what you’re doing fits into the film and how everyone else is working towards the same common goal because it’s very collaborative. You have to think about your work far beyond the effects to make a valuable contribution to the story. If you do both of these things, you have a good chance at honoring the art of storytelling through VFX.”
THE LAST WORD
Letteri was in the unique position at the 13th Annual VES Awards of competing against himself for Outstanding Visual Effects on The Hobbit: The Battle of the Five Armies vs. Dawn of the Planet of the Apes (he and the team won two VES Awards for Apes that year). And where does he keep his gold Man in the Moon statues? “Proudly displayed in a cabinet in my office.” What profession would he likely have pursued if not VFX? “I probably would have followed my love of astronomy or astrophysics. But I’m also fascinated with biology. Because if you’re thinking about astrophysics, you’re thinking about life on other planets, so it all comes down to the big picture that ties it all together.”
TOP: Director Matt Reeves, Joe Letteri and actor Toby Kebbell (Koba) on the set of Dawn of the Planet of the Apes.
FILM
NEILL BLOMKAMP GOES EXPERIMENTAL WITH OATS STUDIOS By IAN FAILES
TOP: An alien creature in the Oats Studios short Rakka, which tells the story of a human fighting against an invasion of Earth. (Photo copyright © 2017 Oats Studios)
Before director Neill Blomkamp’s hugely successful breakout hit District 9 (2009), the filmmaker had already received plenty of attention via a series of shorts – most notably, Alive in Joburg (2005). These shorts contained a gritty realism and took advantage of photoreal computer graphics – with a hand-held feel – to help tell their often dystopian stories of the future. Blomkamp would go on to make Elysium (2013) and Chappie (2015), but then midway through 2017 he surprised many by revealing he had returned to the realm of independent shorts with the launch of his Vancouver-based film production company, Oats Studios. Oats soon released three experimental short films online – all with significant visual effects – as well as other content. Blomkamp’s idea was to see what might strike a chord with viewers, or which short could be widened into a possible feature. Importantly, the director was doing this outside the studio system, so Oats Studios owned the IP (the studio has even released CG models, scripts and other assets for people to use as they wish). Among the several hand-picked Blomkamp collaborators at Oats Studios is Visual Effects Supervisor Chris Harvey, who had worked with the director on Chappie. He tells VFX Voice about working at Oats on those effects-heavy shorts, about a unique real-time rendering collaboration with Unity, and about the lessons he’s taken away from the independent studio experience.
AN ENTIRE FILM STUDIO IN ONE PLACE
Deliberately small and nimble, Oats Studios was designed to
house just about every part of the filmmaking process under one roof. “That means there’s a lot of synergies we have that may have gotten lost in the general machine of filmmaking,” suggests Harvey. “Things have got so hyper-specialized, and people just focus on this one tiny aspect, and someone else focuses on that other tiny aspect, and maybe sometimes they talk. “But one thing we’ve learned in having everything under one roof,” adds Harvey, “is there’s just this really cool synergy that we’re getting back, whether it be art department and visual effects, or with the practical guys and even visual effects guys getting to help out on set, shooting things or doing behind the scenes.” Harvey says this was a big part of how the three main shorts – Rakka, Firebase and Zygote – were achieved. Each could be art directed at the studio and then filmed on location elsewhere with everyone chipping in, with post handled back at Oats. “Maybe that’s why you see so many people now doing their own little short films,” offers Harvey. “Where they assemble a tiny team, and it’s four people working together for four years to do their short, and they have to do every aspect of the film, from sound to set design, to costumes, to everything. There’s that love of it because they get to do everything, and it’s like we’ve recovered that on a bigger scale, which has been really, really exciting.”
“Maybe that’s why you see so many people now doing their own little short films, where they assemble a tiny team, and it’s four people working together for four years to do their short, and they have to do every aspect of the film, from sound to set design, to costumes, to everything. There’s that love of it because they get to do everything, and it’s like we’ve recovered that on a bigger scale, which has been really, really exciting.” —Chris Harvey, Visual Effects Supervisor, Oats Studios
CROWD SOURCING
Any film project – short or long – is of course subject to review by critics and a general audience. But Oats’ aim has been to candidly ask viewers what they thought of each work, what could be improved,
TOP: Rakka features an alien civilization known as the Kluum – realized in CG by Oats – which dominates the Earth with its specialized liquid nanotechnology. (Photo copyright © 2017 Oats Studios)
WINTER 2018 VFXVOICE.COM • 39
FILM
NEILL BLOMKAMP GOES EXPERIMENTAL WITH OATS STUDIOS By IAN FAILES
TOP: An alien creature in the Oats Studios short Rakka, which tells the story of a human fighting against an invasion of Earth. (Photo copyright © 2017 Oats Studios)
The Making of Zygote
TOP: A scene from Firebase, Blomkamp’s alternate history Vietnam War short. (Photo copyright © 2017 Oats Studios) BOTTOM: Firebase, and Oats’ other shorts, have taken advantage of motion capture, photogrammetry, experimental CG software and different approaches to rendering to make their diverse creatures and characters possible. (Photo copyright © 2017 Oats Studios) BOTTOM RIGHT: A scene from ADAM: The Mirror, the first of Oats Studios’ collaborations with Unity. (Photo copyright © 2017 Oats Studios and Unity Technologies)
and where else it could be taken. “I wouldn’t say we’ve figured out yet how you gauge user response and how you even gauge success,” notes Harvey. “Rakka’s at about 4 million views on YouTube. Is that successful? There’s no metric to gauge it against.” Oats’ key members have reviewed comments on YouTube, Reddit and other online forums. But, as Harvey describes, “it’s a really interesting mess of spaghetti to unwind about what people really think about it.” “One thing that’s been really interesting to see is our forum,” says Harvey. “It’s a select smaller group of people. Some people definitely seemed to catch what we were trying to do in terms of community involvement. I would say we’re still learning about how
One of the most terrifying creatures appearing in Oats Studios’ shorts – so far – is the eponymous Zygote. Made up of multiple human bodies, the entity wreaks havoc on a mining facility in the Arctic Circle. And it was as complicated to produce as it is unpleasant to look at.

Oats Studios Modeler and Senior Character Artist on Zygote, Ian Spriggs, took some original concept art for the creature and began designing the Zygote directly in 3D. “Neill said, ‘I want it to be very human, like human body parts stuck together,’” says Spriggs.

A photogrammetry rig made up of 32 cameras was used to scan different hands. From five hands, Spriggs made many more variations. Senior Rigging TD Eric D. Legare rigged each hand individually, meaning they could then be posed. From there, digi-doubles were added to the Zygote model. “There were about seven full bodies in there,” notes Spriggs. “I posed them and started adding limbs and then slowly added more and more detail, making sure each hand was holding something and that each hand had a purpose. I had to ‘grotesque’ them up as well.”

Another edict from Blomkamp was that the Zygote appear as if someone had sewn all the body parts together. For reference,
the director had seen a YouTube video of a worm that squirts a white glue that appears as an unusually colored vein. “It’s very sci-fi,” says Spriggs. “It was as if someone had built this creature, stitched it together, melted his brain onto this thing, and that white goo is the external nervous system!”

The Zygote’s human-like eyes that are attached to its torso were equally horrific, and came from a suggestion from Visual Effects Supervisor Chris Harvey. “Chris said, ‘Imagine if you slit open the skin with a knife, and you pushed an eyeball into that slit,’” posits Spriggs. “And that’s basically what happened to the Zygote eyes.”

For Spriggs, modeling and texturing the Zygote was an eight-month process, done in tandem with rigger Legare. Spriggs is particularly complimentary of the resulting rig, and the ‘gurgling’ type animation done for the Zygote. Even the sound design, he says, added to the uniqueness of the character. “They recorded a whole bunch of different people, and they said to each of them, ‘you make some choking sounds, you make some gargling sounds, and you make some screams,’ and they just mixed them all together. It just made it feel so horrific.”
Ian Spriggs’ CG model of the Zygote, a terrifying creature made up of human bodies, hands and eyes. (Photo copyright © 2017 Oats Studios)
The Zygote rig by Eric D. Legare proved to be extremely complicated given the number of body parts and hands. (Photo copyright © 2017 Oats Studios)
to better engage the audience at a base level.”

IGNORING THE LAST 5%
In an effort to get the shorts out there, one philosophy at Oats Studios has been to take the visual effects to about 90% or 95%, and not worry about the last percentage that would normally go into final polishing. It’s a philosophy that, argues Harvey, hasn’t altered the way people view the work. “Even if you try to take it to that 100% mark, the truth is, no shot’s ever really done. So in some ways, it almost seems more honest, and maybe it’s caused people to pick on it less. They’re going, ‘Well, they’re saying there’s stuff to pick on. They’re acknowledging it. They’re not pretending or hiding behind the fact
“One thing we’ve learned in having everything under one roof is there’s just this really cool synergy that we’re getting back, whether it be art department and visual effects, or with the practical guys and even visual effects guys getting to help out on set, shooting things or doing behind the scenes.” —Chris Harvey, Visual Effects Supervisor, Oats Studios
TOP: In Zygote, the creature is made up of body parts consisting of people who once were part of an Arctic mining facility. (Photo copyright © 2017 Oats Studios) BOTTOM: Behind the scenes on the production of Firebase. Blomkamp’s stated goal with Oats Studios is to use the talents of accomplished filmmakers in all areas, but in an independent setting. (Photo copyright © 2017 Oats Studios)
that it’s like it’s perfect, because it isn’t, and they know that.’

“It has also definitely allowed us to produce more content and tell more stories,” continues Harvey. “I think if you were to ask the average person, ‘Would you rather have seen one of those with slightly better VFX, or are you happy to have seen two?’, I think almost everyone would say, ‘I’d rather have two.’ I think that’s been very successful, and I think we’ve definitely learned where we can trim it down or even become more efficient with that.”

NEW TOOLS
In terms of visual effects technologies, Oats Studios does revolve around a typical pipeline, but it also has had the benefit of trying out new tools to see what works and what doesn’t. “There’s tons of stuff we’ve adopted into the pipeline,” says Harvey, “from small things to bigger things, and we’ve even written a bunch of our own proprietary pipeline tools. For example, there’s something called Instant Meshes that was released. It’s open source, and it’s just a way to retopologize a mesh, and we adopted that into our photogrammetry pipeline.

“We even found this tool that helps us with object tracking in NUKE,” adds Harvey. “Our friends at other facilities will say, ‘Oh yeah, I talked to IT about it, and they had to talk to legal, and there’s all this red tape, so we can’t try it.’ This one small thing has saved us days and days and hours of labor, and it’s free.”

More recently, Oats jumped head-first into a real-time rendering
project with Unity based on its ADAM franchise. ADAM: The Mirror and ADAM: The Prophet both utilized Unity’s real-time rendering and production pipeline, and offered Blomkamp a new opportunity to explore filmmaking in this medium. “Really, the idea fits so well into our own mandate,” comments Harvey. “We can shoot on a mo-cap stage in real-time, and output final frames, and edit and release films a week later.” Hopefully, that means audiences will get to see more films from Blomkamp in his trademark style.
TOP: Oats Studios worked with Unity to continue the game engine company’s ADAM series of films and experiment with virtual production and real-time rendering. (Photo copyright © 2017 Oats Studios and Unity Technologies) BOTTOM LEFT: Artists at Oats Studios review a Rakka design. (Photo copyright © 2017 Oats Studios) BOTTOM RIGHT: Oats Studios Visual Effects Supervisor Chris Harvey (left) and director Neill Blomkamp (right). (Photo copyright © 2017 Oats Studios)
VES AWARDS PROFILE
2018 VES LIFETIME ACHIEVEMENT AWARD WINNER JON FAVREAU: THE CLASSICS RE-IMAGINEER By NAOMI GOLDMAN
TOP: Jon Favreau (Photo credit: Dan Doperalski)
From usher to extra to actor to auteur, Jon Favreau is a true journeyman. He has created exceptionally humanistic stories through his vision, unique approach to storytelling and full embrace of emerging technology to enhance the movie-going experience. He has championed boundary-breaking use of visual effects to create a seamless and invisible background for remarkable stories and vibrant characters – from superheroes to beloved mythical creatures – across film and television. For his innovative work in digital filmmaking and significant contributions to the visual effects industry, Favreau will receive the VES Lifetime Achievement Award from the Society’s Board of Directors at the 16th Annual VES Awards next month.

Hot on the heels of directing The Jungle Book, the wildly successful and critically acclaimed live-action adaptation of the animated film, Favreau is currently directing the Disney live-action feature The Lion King. He is slated to direct The Second Jungle Book and will produce Avengers: Infinity War and its untitled sequel. Favreau’s filmography as a director includes Cowboys & Aliens, blockbuster hits Iron Man and Iron Man 2, Elf, Zathura and the Gnomes & Goblins VR experience. He originally established himself as a writer of considerable talent – and a breakout actor – with the acclaimed comedy Swingers. Favreau has served as the creator, producer and host of the critically acclaimed and Emmy-nominated IFC series Dinner for Five, and has directed the pilots for Young Sheldon and The Orville.

Favreau’s creative vision has garnered him the Harold Lloyd Award for filmmaking from the Advanced Imaging Society and Virtual Reality Society and the Filmmaker Award from the Cinema Audio Society, and his engaging instant classic The Jungle Book earned five awards at last year’s Visual Effects Society Awards, including the top honor for Best Photoreal Feature.

FREAKS AND GEEKS
Favreau talks about a love affair with the cinema from an early age. “I was really inspired by Ray Harryhausen, Jason and the Argonauts and early monster movies. My grandmother had a black & white TV, and when his films popped up, I was glued to the screen. I was compelled by seeing the snakes in Medusa’s hair come to life or the skeletons in the ground rise… a spark of life in the technology. And I have cherished memories of going to see the revival of King Kong at the theater with my dad in New York – it was captivating and magical. Not just the imagery, but I was fascinated with how they did those things. Now it seems so easy to figure out, but back then I pored over the pages of Starlog magazine with a huge sense of curiosity.

“I’ve always loved the cool, hip films with a New York vibe done by Woody Allen and Martin Scorsese. And then the geeky stuff, sci-fi like Blade Runner and Mad Max. Back then most stuff was either geeky or cool – but those two were both. The dormant nerd was awakened within me.

“As I evolved as a filmmaker, I was always looking to understand how the directors achieved their artistic vision. As I moved towards the development of Gnomes & Goblins, I explored projects to help answer the question of how to create emotionally engaging experiences in VR – ones where you didn’t want to take the headset off.”

THE POWER OF MYTH
Remakes or adaptations of original titles with strong connections are often judged by audiences against their memories – in addition to the work itself – and it takes a special kind of filmmaker to shepherd such emotionally rich projects. Favreau is a student of Joseph Campbell and often refers to his work on the power and responsibility of myth as he has taken on back-to-back projects re-imagining classics. He views his role as storyteller as akin to how elders of the tribe preserve and pass down information, often imbuing their personal references in the process to keep the stories alive.

“I believe that the role of storyteller is to present a set of myths and legends and lessons to a new generation while offering a fresh and engaging vision. In the case of Star Wars, all of the great visual effects took old myths and the hero’s journey and presented it in a way that made it exciting for me to see as a kid. With these two films [The Jungle Book and The Lion King], we are trying to deliver experiences that are richer than anything we could do before… something where you could invite people to see something magical for the first time or revisit something that they’ve grown up loving, treated with a tremendous amount of affection and care.

“The approach is not binary anymore: old-fashioned movies where people are shooting film and using traditional techniques vs. new work using digital technology and CGI – and they feel like two different media. We are marrying traditional storytelling with new technology. When we harness technology well, you can feel the humanity and inspiration come through, and I believe it can bring about greater connectivity and empathy through our shared experience. I’m committed to preserving the culture and the history of cinema, and it’s important to me to keep traditions alive even while things are changing so rapidly and we aspire to take on new worlds.”
TOP: Jon Favreau and Robert Downey Jr. planning shots for Iron Man at Randy’s Donuts. (Photo credit: Zade Rosenthal. Copyright © 2008 Marvel Studios/Paramount) BOTTOM: Favreau with Jeff Bridges and Faran Tahir on the set of Iron Man. (Photo credit: Zade Rosenthal. Copyright © 2008 Marvel Studios/Paramount)
“I believe that the role of storyteller is to present a set of myths and legends and lessons to a new generation while offering a fresh and engaging vision.” —Jon Favreau
EMBRACING VFX
When it comes to technology and visual effects, Favreau speaks to a clear progression from his reticence in utilizing them at the outset to his advocacy as a seasoned filmmaker who has pushed the boundaries of cinematic storytelling through his inventive use of new tools.

“Early in my career I was a bit of a Luddite regarding CGI and I stayed away from it. I thought the technology was drawing too much attention to itself and that it was often misapplied as a magical fix to problems.

“On Elf, at the beginning of the film, there were no digital effects at all. I used forced perspective and motion control and other techniques to achieve the shots I wanted. With Iron Man, I started to trust visual effects more, both the technology and the artists. Stan Winston built practical suits of the Iron Man armor, but we integrated CG into the process as it
VES AWARDS PROFILE
2018 VES LIFETIME ACHIEVEMENT AWARD WINNER JON FAVREAU: THE CLASSICS RE-IMAGINEER By NAOMI GOLDMAN
TOP: Jon Favreau (Photo credit: Dan Doperalski)
44 • VFXVOICE.COM WINTER 2018
PG 44-49 JON FAVREAU.indd 44-45
From usher to extra to actor to auteur, Jon Favreau is a true journeyman. He has created exceptionally humanistic stories through his vision, unique approach to storytelling and full embrace of emerging technology to enhance the movie-going experience. He has championed boundary-breaking use of visual effects to create a seamless and invisible background for remarkable stories and vibrant characters – from superheroes to beloved mythical creatures – to play across film and television. For his innovative work in digital filmmaking and significant contributions to the visual effects industry, Favreau will receive the VES Lifetime Achievement Award from its Board of Directors at the 16th Annual VES Awards next month. Hot off the heels of directing The Jungle Book, the wildly successful and critically acclaimed live-action adaptation of the animated film, Favreau is currently directing the Disney live-action feature The Lion King. He is slated to direct The Second Jungle Book and will produce Avengers: Infinity War and its untitled sequel. Favreau’s filmography as a director includes Cowboys & Aliens, blockbuster hits Iron Man and Iron Man 2, Elf, Zathura and the Gnomes & Goblins VR experience. He originally established himself as a writer of considerable talent – and a breakout actor – with the acclaimed comedy Swingers. Favreau has served as the creator, producer and host of the critically acclaimed and Emmy-nominated IFC series Dinner for Five, and has directed the pilots for Young Sheldon and The Orville. Favreau’s creative vision has garnered him the Harold Lloyd Award for filmmaking from the Advanced Imaging Society and Virtual Reality Society, the Filmmaker Award from the Cinema Audio Society, and his engaging instant classic The Jungle Book earned five awards at last year’s Visual Effects Society Awards including the top honor for Best Photoreal Feature. FREAKS AND GEEKS
Favreau talks about a love affair with the cinema from an early age. “I was really inspired by Ray Harryhausen, Jason and the Argonauts and early monster movies. My grandmother had a black & white TV, and when his films popped up, I was glued to the screen. I was compelled at seeing the snakes in Medusa’s hair come to life or the skeletons in the ground rise… a spark of life in the technology. And I have cherished memories of going to see the revival of King Kong at the theater with my dad in New York – it was captivating and magical. Not just the imagery, but I was fascinated with how they did those things. Now it seems so easy to figure out, but back then I pored over the pages of Starlog magazine with a huge sense of curiosity. “I’ve always loved the cool, hip films with a New York vibe done by Woody Allen and Martin Scorsese. And then the geeky stuff, sci-fi like Blade Runner and Mad Max. Back then most stuff was either geeky or cool – but those two were both. The dormant nerd was awakened within me. “As I evolved as a filmmaker, I was always looking to understand how the directors achieved their artistic vision. As I moved towards the development of Gnomes & Goblins, I
explored projects to help answer the question of how to create emotionally engaging experiences in VR – ones where you didn’t want to take the headset off.” THE POWER OF MYTH
Remakes or adaptations of original titles with strong connections are often judged by audiences against their memories – in addition to the work itself – and it takes a special kind of filmmaker to shepherd such emotionally rich projects. Favreau is a student of Joseph Campbell and often refers to his work on the power and responsibility of myth as he has taken on back-to-back projects re-imagining classics. He views his role as storyteller akin to how elders of the tribe preserve and pass down information, often imbuing their personal references in the process to keep the stories alive. “I believe that the role of storyteller is to present a set of myths and legends and lessons to a new generation while offering a fresh and engaging vision. In the case of Star Wars, all of the great visual effects took old myths and the hero’s journey and presented it in a way that made it exciting for me to see as kid. With these two films [The Jungle Book and The Lion King], we are trying to deliver experiences that are richer than anything we could do before… something where you could invite people to see something magical for the first time or revisit something that they’ve grown up loving, treated with a tremendous amount of affection and care. “The approach is not binary anymore: old-fashioned movies where people are shooting film and using traditional techniques vs. new work using digital technology and CGI – and they feel like two different media. We are marrying traditional storytelling with new technology. When we harness technology well, you can feel the humanity and inspiration come through, and I believe it can bring about greater connectivity and empathy through our shared experience. I’m committed to preserving the culture and the history of cinema, and it’s important to me to keep traditions alive even while things are changing so rapidly and we aspire to take on new worlds.”
TOP: Jon Favreau and Robert Downey Jr. planning shots for Iron Man at Randy’s Donuts. (Photo credit: Zade Rosenthal. Copyright © 2008 Marvel Studios/Paramount) BOTTOM: Favreau with Jeff Bridges and Faran Tahir on the set of Iron Man. (Photo credit: Zade Rosenthal. Copyright © 2008 Marvel Studios/Paramount)
“I believe that the role of storyteller is to present a set of myths and legends and lessons to a new generation while offering a fresh and engaging vision.” —Jon Favreau
EMBRACING VFX
When it comes to technology and visual effects, Favreau speaks to a clear progression from his reticence in utilizing them at the outset, to his advocacy as a seasoned filmmaker who has pushed the boundaries of cinematic storytelling through his inventive use of new tools. “Early in my career I was a bit of a Luddite regarding CGI and I stayed away from it. I thought the technology was drawing too much attention to itself and that it was often misapplied as a magical fix to problems.

“On Elf, at the beginning of the film, there were no digital effects at all. I used forced perspective and motion control and other techniques to achieve the shots I wanted. With Iron Man, I started to trust visual effects more, both the technology and the artists. Stan Winston built practical suits of the Iron Man armor, but we integrated CG into the process as it
WINTER 2018 VFXVOICE.COM • 45
VES AWARDS PROFILE
TOP: Favreau and Robert Downey Jr. on the set of Iron Man. (Photo credit: Zade Rosenthal. Copyright © 2008 Marvel Studios/Paramount) BOTTOM: Favreau with the Iron Man armored suit created by Stan Winston and worn by the character Tony Stark. (Photo credit: Zade Rosenthal. Copyright © 2008 Marvel Studios/Paramount)
evolved. We always aimed to design shots where the effects lent themselves to the original story we were telling without taking you out of that experience. By the time I was working on The Jungle Book, I was immersed in using the tools at our disposal to tell a story in a way we just couldn’t have before.

“I also felt that people made the mistake of attributing what they were seeing to the tools and not the artistry behind them. As I evolved and embraced the potential of the technology, I wanted to see the artists and animators get recognized for their skill, discipline and enormous talent that breathes life into those tools with the eye of a great painter. There are a lot of people working very hard to make sure you don’t see their work, which is counterintuitive, but it’s the nature of what we do. So I will continue to call attention to those visual effects
“The approach is not binary anymore: old-fashioned movies where people are shooting film and using traditional techniques vs. new work using digital technology and CGI – and they feel like two different media. We are marrying traditional storytelling with new technology.” —Jon Favreau
TOP LEFT: Favreau with a nod to Iron Man Tony Stark. (Photo credit: Zade Rosenthal. Copyright © 2008 Marvel Studios/Paramount) TOP RIGHT: Favreau and Robert Downey Jr. on the set of Iron Man 2. (Photo credit: Zade Rosenthal. Copyright © 2010 Marvel Studios/Paramount) BOTTOM LEFT AND RIGHT: Favreau with Neel Sethi, who portrayed Mowgli, on the set of The Jungle Book. (Photo copyright © 2014 Disney Enterprises, Inc.)
artists, because they work really hard in service to telling the story.”

THE ACTOR’S EYE
Favreau is in a unique position behind the camera and brings his considerable experience as an actor to his flourishing career as a writer, producer and director. “I learned how to work by watching other filmmakers and being an actor over the years. So by the time I finally directed,
I already had a lot of experience through the apprenticeship of being on other directors’ sets. It was a dream come true to be on Martin Scorsese’s set for The Wolf of Wall Street, just to watch him work after hearing so many people talk about what that was like. Most directors don’t have that luxury or the opportunities I had. And that camaraderie with other directors was instrumental to my learning curve. I was very supported by the generosity of other filmmakers like James Cameron.

“I love acting in large part for the chance to be on other directors’ sets because there is always something to learn, some nuance that might influence my approach to a project, or the dynamic interaction on set with the actors and crew. I get offered parts by filmmakers I admire, but directing makes it difficult at times. Last year, I was in Spider-Man: Homecoming, and in addition to being an executive producer on the next Avengers, I’m an actor in that. I’m ready for that kind of fun!”

“When we harness technology well, you can feel the humanity and inspiration come through, and I believe it can bring about greater connectivity and empathy through our shared experience. I’m committed to preserving the culture and the history of cinema, and it’s important to me to keep traditions alive even while things are changing so rapidly and we aspire to take on new worlds.” —Jon Favreau

TOP: The Jungle Book. (Photo copyright © 2014 Disney Enterprises, Inc.) BOTTOM: Favreau with Neel Sethi, who portrayed Mowgli, on the set of The Jungle Book. (Photo copyright © 2014 Disney Enterprises, Inc.)
THE LAST WORD
On his first job in the business, Favreau glowingly recalls his service as an usher at RKO Keith’s Theatre in Flushing, Queens, a landmark theater converted from an old vaudeville house built in the 1920s. “I was there when Return of the Jedi and Indiana Jones and the Temple of Doom were showing and got the chance to watch movies over and over. A dream job.”

When asked what career path he would have chosen if not the one he enjoys today, he confirms it probably wouldn’t be making cubano sandwiches from an El Jefe food truck like he did in Chef (though he does admit that visual effects ‘enhanced’ his wicked professional knife skills on display in the indie hit). “I would probably find a way to sneak back into the movie business in some capacity... I don’t think I could really stay away. Animation is really compelling and I would love to see if I have what it takes to be an animator. It’s such a magical discipline and something I wanted to do when I was very young, but there wasn’t a lot of opportunity at the time. I’m excited to see how audiences of all ages have embraced the resurgence of animation and the painstaking artistry that produces such enduring works of cinema.”

“By the time I finally directed, I already had a lot of experience through the apprenticeship of being on other directors’ sets. It was a dream come true to be on Martin Scorsese’s set for The Wolf of Wall Street, just to watch him work after hearing so many people talk about what that was like. Most directors don’t have that luxury or the opportunities I had. And that camaraderie with other directors was instrumental to my learning curve. I was very supported by the generosity of other filmmakers like James Cameron.” —Jon Favreau
TOP: The Jungle Book. (Photo copyright © 2014 Disney Enterprises, Inc.)
FILM
THE CREATURE FITS THE SUIT IN THE SHAPE OF WATER
By IAN FAILES

Guillermo del Toro is no stranger to the world of creatures. His films have featured insects, monsters, ghosts, human-hybrids and everything in between. The director continues to demonstrate his fondness for slightly off-kilter living things in The Shape of Water. In the film, an amphibious entity – known as ‘The Asset’ (played by Doug Jones) – is held in a 1960s government laboratory until a mute janitor (Sally Hawkins) befriends and ultimately falls in love with it.

As he has done in previous films, del Toro relied heavily on a man-in-suit approach for Jones to act as The Asset on set. The suit and prosthetics were fabricated by Legacy Effects. Then, to enhance Jones’s performance or to enable scenes that could not be carried out in the suit, a digital version of The Asset and augmentations to the creature were carried out by Mr. X Inc. That practical and digital effects collaboration was key to having The Asset, which was affectionately called ‘Charlie’ during production, realistically interact with its often-watery environment and with the other characters, particularly for touching moments with the janitor, Elisa. VFX Voice finds out from Legacy and Mr. X how they made it possible.

COMBINING REAL AND DIGITAL
TOP: Legacy Effects artists sculpt the form of The Asset suit, which will later be used to produce molds for suit parts. (Image copyright © 2017 Fox Searchlight Pictures.)
Early on, del Toro indicated to his effects teams that he wanted to use both a real creature suit and CG for The Asset. “Guillermo said to us,” relates Mr. X Visual Effects Supervisor Dennis Berardi, “‘This is a collaboration, and I don’t want to have a fuss about what’s digital, and what’s practical. We’re going to tell this story in the best possible way.’ We just worked together, and it went the way it needed to go, based on logistics, or a particularly difficult stunt or performance moment.”

While the suit was being crafted at Legacy, the visual effects artists at Mr. X had input on the design side, an aspect that would help later on when they had to replicate the character digitally for swimming and some other water-related shots. Early screen tests with the suit also revealed that an extra level of facial performance would be required. “The Asset emotes,” says Berardi. “He is a leading man. There are tender moments, there are moments of anguish, there’s a torture scene. It’s also a love story, and it was going to be tough for Doug to hit the subtleties of that while wearing a mask.” The solution was for Jones to wear a mostly finished Legacy head-piece that Mr. X would go in and replace parts of, including the eyes and brow. Some larger moments, such as when he growls at a cat and swimming scenes, involved a full head replacement. “We kept Doug’s primary performance in the face and used it to the extent that we could, and then went bigger from there,” explains Berardi. “It was still a very important thing to have. If we had only had him in a motion-capture suit on set, I don’t think we would have achieved the same thing. Doug really delivered a performance in that suit that was inspiring, and we were there to enhance it, and to take it the last 20%.”

BEHIND THE SUIT
Based on initial designs and concept work, Legacy Effects
worked to manufacture the creature suit to fit Jones, an actor who has had considerable experience as a creature performer. During development, the company had to adapt to a few modifications in the design. “It became less monstrous and tapped into more of Charlie’s handsome attributes,” outlines Legacy Co-founder Shane Mahan. “It slowly evolved to be a bit more ‘man-like.’ We had to work on it right until the last minute, and then ship it to Toronto for filming, practically sticky-wet, to make it on time.”

The suit was made of multiple materials, including a base of foam rubber, and silicone for the back of the head, hands, gills and some parts of the body. Other body parts were crafted from urethane to provide for a translucent look, especially for the fins. The face was more of a makeup solution. Says Mahan: “We put translucent material wherever we could to make it feel aquatic, so you feel light reading through it, like the membrane of a skin.”

On set, a group of Legacy Effects personnel were responsible for the application of the suit, prosthetics and makeup for The Asset. It took about three hours each day. “With this film,” notes Mahan, “it was all about the details – the translucency of the claws, the inner claw with the veins on it, the little fin bones that are in the castings of the gills. I wanted the light to react a certain way. There are certain things that I wanted to create to help tell the story.”

Mahan says that the suit was designed knowing that there would be CG augmentations and take-overs, and that the final result had the best of both techniques. “Audiences are very savvy these days,” suggests Mahan. “No matter how good a digital character is in the film, you still can mentally detect that it’s not really there. But if it’s really glistening, and there’s, say, real water dripping off of it, it does something a little bit different to your perception.
And then when you enhance it with CGI, it’s really a great marriage of the two approaches.”

ENHANCING THE PERFORMANCE
For shots where The Asset was required to be completely CG or go through CG enhancements, Mr. X set to work on replicating the character, and Doug Jones, digitally. This involved a scanning process using the studio’s proprietary system called Xscan, a rig of 56 cameras that could be brought to the set for photogrammetry capture (see sidebar). Both The Asset and the janitor Elisa do not speak. That meant that Mr. X’s animators, led by Animation Supervisor Kevin Scott, focused heavily on the body language from the original plates when adding to The Asset with digital effects. “Everything was very mime and facial expression driven,” says Scott. “So we had to bring that up to match what Sally’s performance was doing. We knew Doug was doing that inside the suit – we just had to make sure that we represented him as well as we could.” For swimming shots, in which The Asset would be almost completely digital, Mr. X looked at swimming reference, especially competitive swimmers who used a dolphin kick (“It gave this unique look in the way it undulates their bodies,” notes Scott). More subtle animation included adding in gill movement and nictitating membranes for The Asset’s eyes, which involved looking at crocodile and other amphibious animals.

“We kept Doug Jones’s primary performance in the face and used it to the extent that we could, and then went bigger from there. It was still a very important thing to have. If we had only had him in a motion-capture suit on set, I don’t think we would have achieved the same thing. Doug really delivered a performance in that suit that was inspiring, and we were there to enhance it, and take it the last 20%.” —Dennis Berardi, Visual Effects Supervisor, Mr. X

TOP TO BOTTOM: Legacy Effects Co-founder Shane Mahan (left) and his crew fit Doug Jones (seated) into The Asset suit. The suit included a number of animatronics for gill and other movements. (Photo credit: Kerry Hayes. Copyright © 2017 Fox Searchlight Pictures) Shane Mahan readies Doug Jones in the suit for a take on set. (Photo credit: Kerry Hayes. Copyright © 2017 Fox Searchlight Pictures) Doug Jones inside The Asset suit is readied for a photogrammetry scanning session; this aided Mr. X in producing a matching digital model. (Photo credit: Kerry Hayes. Copyright © 2017 Fox Searchlight Pictures)

Face-time

To aid in the subtle facial animation for The Asset, Mr. X relied on its Xscan portable photogrammetry rig. “We could bring Doug Jones in, put him in position and in basically one second we got 56 overlapping photos that captured every angle,” explains Visual Effects Supervisor Dennis Berardi. “Doug was so patient with us,” recalls Berardi. “We kept going back for more and more reference. I’d want to scan him without makeup, and run him through a range of facial expressions, and then put him in the makeup and do the same thing.” A resulting CG model computed from the images also provided high-resolution textures of Jones. The actor provided multiple face shapes – 24 in total – to mimic the approach usually taken to acquire a Facial Action Coding System, or FACS, capture.

“We thought it was important to gather both in makeup and out because ultimately the suit and makeup was built off of Doug’s facial structure,” says Animation Supervisor Kevin Scott. “It meant we could then interpolate between the two to derive our digital Charlie face.” Still, tracking the digital facial features or CG head into the live-action plates of Jones was a major obstacle for Mr. X, especially since the suit materials would often stretch in different ways between different shots. “They had a number of different suits, so we couldn’t track it with just one model,” states Scott. “We had to build certain controls into our rig to kind of imply the rubber stretch into our CG model. That was probably the hardest technical challenge we had on this show.”

TOP: A typical shot in which Mr. X would need to add facial animation onto the live-action suit for The Asset began with original photography of Doug Jones in the Legacy Effects prosthetics, such as this scene of the creature growling. (Image copyright © 2017 Fox Searchlight Pictures)

The final shot replaced mostly the eyes and brow area, and in this case the gills so that they could be made to flare with greater effect. (Image copyright © 2017 Fox Searchlight Pictures)

Mr. X then tracked the required plate and body and face motion and added its CG elements. Facial features and animation were sourced via scans of Jones in and out of makeup using its Xscan photogrammetry rig. (Image copyright © 2017 Fox Searchlight Pictures)

TOP: Live-action photography of The Asset was enhanced in many scenes by Mr. X’s addition of facial animation. (Image copyright © 2017 Fox Searchlight Pictures) MIDDLE: This bathtub sequence made use of a completely digital takeover by Mr. X’s CG model for The Asset. (Image copyright © 2017 Fox Searchlight Pictures)

BOTTOM: In this underwater scene featuring The Asset with Elisa, a dry-for-wet live action shoot was enhanced by Mr. X to add a watery feeling, floating particles and even CG hair for the actress. (Image copyright © 2017 Fox Searchlight Pictures)
“Audiences are very savvy these days. No matter how good a digital character is in the film, you still can mentally detect that it’s not really there. But if it’s really glistening, and there’s, say, real water dripping off of it, it does something a little bit different to your perception. And then when you enhance it with CGI, it’s really a great marriage of the two approaches.” —Shane Mahan, Co-founder, Legacy Effects

Scott also had the chance to speak directly with Jones about how the actor would approach the character in terms of movement and reactions. “He said, ‘It’s a cross between Silver Surfer and a graceful matador,’” says Scott. “I actually came back to the office and I wrote that on a big post-it note and stuck it on my monitor so I could be reminded of that every day. What that really meant to me was when he moves, he’s like a dancer – very fluid, flowing lines all the way through his body. As I was reviewing work with the animators, we kept that on the forefront.”
CRAFTING A CHARACTER
In the end, the combination of a man-in-suit and a number of digital enhancements makes for a very compelling character, one that the audience is immediately drawn to. Of course, that’s the intention of the filmmakers, as Legacy’s Shane Mahan describes. “Everybody’s hope is that when you watch the film you’re not thinking that there’s a man in a costume, that instead you should be thinking, ‘Where did they find this exotic creature, and how did they get it to be in a movie?’ That’s really the magic of it.”
WINTER 2018 VFXVOICE.COM • 53
FILM
Face-time

To aid in the subtle facial animation for The Asset, Mr. X relied on its Xscan portable photogrammetry rig. “We could bring Doug Jones in, put him in position and in basically one second we got 56 overlapping photos that captured every angle,” explains Visual Effects Supervisor Dennis Berardi. “Doug was so patient with us,” he recalls. “We kept going back for more and more reference. I’d want to scan him without makeup, and run him through a range of facial expressions, and then put him in the makeup and do the same thing.” A CG model computed from the images also provided high-resolution textures of Jones. The actor provided multiple face shapes – 24 in total – mimicking the approach usually taken for a Facial Action Coding System, or FACS, capture.
“We thought it was important to gather both in makeup and out because ultimately the suit and makeup was built off of Doug’s facial structure,” says Animation Supervisor Kevin Scott. “It meant we could then interpolate between the two to derive our digital Charlie face.” Still, tracking the digital facial features or CG head into the live-action plates of Jones was a major obstacle for Mr. X, especially since the suit materials would often stretch in different ways between different shots. “They had a number of different suits, so we couldn’t track it with just one model,” states Scott. “We had to build certain controls into our rig to kind of imply the rubber stretch into our CG model. That was probably the hardest technical challenge we had on this show.”
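Deriving a face that sits between two scanned states, as Scott describes, is conceptually a blendshape mix: each scanned expression is stored as per-vertex deltas from a neutral scan and blended with weights, the same principle behind FACS-style rigs. The toy sketch below shows that principle only — the tiny vertex lists stand in for the real high-resolution scan meshes, and the names are illustrative.

```python
def blend_faces(neutral, shapes, weights):
    """Blend scanned face shapes as weighted per-vertex deltas from neutral.

    neutral: list of (x, y, z) vertices for the neutral scan.
    shapes:  dict of name -> vertex list in the same topology as neutral.
    weights: dict of name -> blend weight (0 = neutral, 1 = full shape).
    """
    result = [list(v) for v in neutral]
    for name, weight in weights.items():
        target = shapes[name]
        for i, (nv, tv) in enumerate(zip(neutral, target)):
            for axis in range(3):
                result[i][axis] += weight * (tv[axis] - nv[axis])
    return [tuple(v) for v in result]

# Halfway between a bare-face scan and an in-makeup scan (toy data):
neutral = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
shapes = {"in_makeup": [(0.0, 0.2, 0.0), (1.0, 0.4, 0.0)]}
half = blend_faces(neutral, shapes, {"in_makeup": 0.5})
```

The same weighted-delta machinery extends naturally to the 24 FACS-style shapes mentioned above, with one weight per shape.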
TOP: A typical shot in which Mr. X would need to add facial animation onto the live- action suit for The Asset began with original photography of Doug Jones in the Legacy Effects prosthetics, such as this scene of the creature growling. (Image copyright © 2017 Fox Searchlight Pictures)
In the final shot, mostly the eyes and brow area were replaced, and in this case the gills as well, so that they could be made to flare with greater effect. (Image copyright © 2017 Fox Searchlight Pictures)
Mr. X then tracked the required plate and body and face motion and added its CG elements. Facial features and animation were sourced via scans of Jones in and out of makeup using its Xscan photogrammetry rig. (Image copyright © 2017 Fox Searchlight Pictures)
TOP: Live-action photography of The Asset was enhanced in many scenes by Mr. X’s addition of facial animation. (Image copyright © 2017 Fox Searchlight Pictures) MIDDLE: This bathtub sequence made use of a completely digital takeover by Mr. X’s CG model for The Asset. (Image copyright © 2017 Fox Searchlight Pictures)
56 cameras that could be brought to the set for photogrammetry capture (see sidebar). Neither The Asset nor the janitor Elisa speaks. That meant Mr. X’s animators, led by Animation Supervisor Kevin Scott, focused heavily on body language from the original plates when augmenting The Asset with digital effects. “Everything was very mime and facial expression driven,” says Scott. “So we had to bring that up to match what Sally’s performance was doing. We knew Doug was doing that inside the suit
BOTTOM: In this underwater scene featuring The Asset with Elisa, a dry-for-wet live action shoot was enhanced by Mr. X to add a watery feeling, floating particles and even CG hair for the actress. (Image copyright © 2017 Fox Searchlight Pictures)
TV
Although the AMC and Sony Television series Preacher is based on a comic book series, it is not a show where superheroes take giant leaps or weave powerful magic. But visual effects do form a crucial part of the show’s sometimes outrageous, regularly subversive and often gory scenes. One of the significant visual effects contributors to Preacher, which was developed by Sam Catlin, Evan Goldberg and Seth Rogen, is Legend 3D in Toronto (FuseFX is also a major vendor on the show). Legend 3D came on board for the show’s second season, which wrapped up in September, helping to tell the story of Jesse Custer (Dominic Cooper), a preacher who discovers he has the power to command others to do as he wishes. Along the way, and in a search for God, Custer and a group of friends encounter all manner of enemies – supernatural and otherwise. This is where the visual effects, including those from Legend 3D, came in, delivering enhancements for various violent moments, plus a range of digital environments, set extensions and composites for several episodes.
By IAN FAILES
TOP: Dominic Cooper as Jesse Custer in Preacher. Much of season two was set in New Orleans. (Photo credit: Michele K. Short/AMC/Sony Pictures Television. Copyright © 2017 AMC Network Entertainment LLC and Sony Pictures Television Inc. All Rights Reserved.)
THE EFFECTS CHALLENGES
GOING STEALTHY
What are some examples of these invisible effects? Sometimes they were simply enhancements to what had been filmed in live action, and sometimes they were about taking a particular practical effect further. For example, in one episode, the character Tracy (Gianna LePera) commits suicide by shooting herself in the head. “The on-set supervisor, Dottie Starling, and the folks at KNB EFX Group did some really great work to get us assets to complement,” outlines Ghering. “We were working with several performance and stunt takes, a brain cannon on greenscreen, a life cast of the actor’s head, as well as complementing it all with some CG blood that would fall and soak into the blanket.” To pull off that breathtaking shot, a double of Tracy’s head was matchmoved and animated to the plate, which then emitted the fluid to create blood with the proper spray and trajectory. “We used a couple of passes with cached geometry which was then finessed in the final compositing stage,” he says.

Other challenges included a number of CG environments and set extensions, all intended to be seamlessly integrated with what had been filmed for the show. “We had a great opportunity to bring to life the Hell environment, for example,” notes Ghering. “The initial production design was to achieve somewhere in between Cloud City in Star Wars and Kowloon in China. We followed closely to the reference from production designer Dave Blass: massive, industrial and foreboding with pin pricks of light going off into infinity.” The Hell environment was generated as a procedural asset using MASH in Maya to build the geometry. Legend 3D artists then textured the geometry with MARI and Substance, while completing lookdev and lighting in Maya and V-Ray. “We also populated the environment,” continues Ghering, “with dozens of passes of volumetric lighting, dust, steam and particulates to give it a belching, choking feel. The fluid and particulate passes were done in Houdini and Maya.
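The “pin pricks of light going off into infinity” hint at why a procedural scatter (MASH, in Legend 3D’s case) suits this job: thousands of lights need repeatable placement plus an independent flicker phase per light. The sketch below is a plain-Python stand-in for that idea, not actual MASH or Maya code; the counts and ranges are illustrative.

```python
import random

def scatter_hell_lights(count=500, extent=1000.0, seed=7):
    """Scatter pin-prick lights procedurally, MASH-style.

    A seeded random generator makes the layout repeatable shot to shot,
    and each light carries its own flicker phase so the lights can later
    flicker independently ('each one was a personal hell').
    """
    rng = random.Random(seed)
    lights = []
    for _ in range(count):
        lights.append({
            "position": (rng.uniform(-extent, extent),
                         rng.uniform(-extent, extent),
                         rng.uniform(0.0, extent * 10)),  # deep Z: "into infinity"
            "intensity": rng.uniform(0.1, 1.0),
            "flicker_phase": rng.uniform(0.0, 1.0),
        })
    return lights

lights = scatter_hell_lights()
```

Because the seed pins the layout, an artist can re-run the scatter with more lights or a deeper extent without the existing lights jumping around.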
Legend 3D’s intention was to have its work remain invisible. “If someone knows you have done the work then you haven’t done your job,” says Ghering. “Being able to train your eye to know where the sweet spot is can be difficult,” he adds. “You can easily over or underwork something that is supposed to be subtle. The artists want people to know they’ve been there and the client often wants to see where the value in the shot is. If it’s invisible then how do you quantify either?” To that end, Ghering says one of the best notes they received on Preacher was, ‘I don’t know what is CG here, so that is probably a good thing’. Head of VFX Production at Legend 3D in Toronto,
“If someone knows you have done the work then you haven’t done your job. … The artists want people to know they’ve been there and the client often wants to see where the value in the shot is. If it’s invisible then how do you quantify either?” —Adam Ghering, Visual Effects Supervisor
LEGEND 3D: BRINGING HELL TO LIFE IN PREACHER

Lisa Sepp-Wilson suggests, too, that their goal was always to keep the effects grounded in reality. “We felt that we had a responsibility to the original vision of season one of the series to maintain as much photoreal imagery as possible, even though this is technically a graphic novel come to life,” Sepp-Wilson says.

HOW LEGEND JUMPED FROM 3D TO VFX

Until recently, Legend 3D was mostly known as a stereo conversion house. But in that capacity, the studio had ultimately been completing waves of visual effects work such as adding in particles, rain and debris, and extensive object and rig removal. “After a time,” recounts Legend 3D Visual Effects Supervisor Adam Ghering, “we were doing more and more effects which led to the need for a purpose-built VFX department.

“Our relationships that we have built along the way, as well as great personnel, particularly Sarah Stiteler in our business development group, made it possible to connect with Sony Television and the executive producers on Preacher,” he adds.

As a conversion house, Legend 3D had developed a pipeline centered on compositing. This suited what was required for Preacher, since much of the work involved split takes and greenscreen set extensions. Legend 3D was also able to take shots further into CG where necessary. “The pipeline, software and talent complemented the needs of the show,” states Ghering. “Additionally, there were some opportunities to build out a specialized CG team to handle specific shots and environments.”
TOP: Joseph Gilgun as Cassidy and Graham McTavish as The Saint of Killers. (Photo credit: Michele K. Short/AMC/Sony Pictures Television. Copyright © 2017 AMC Network Entertainment LLC and Sony Pictures Television Inc. All Rights Reserved.) BOTTOM: Actor Joseph Gilgun as Cassidy prepares for a scene in which his fingers will be cut off by The Saint of Killers. (Photo credit: Michele K. Short/AMC/Sony Pictures Television. Copyright © 2017 AMC Network Entertainment LLC and Sony Pictures Television Inc. All Rights Reserved.)
LEFT TO RIGHT: Executive Producer/writer/director Sam Catlin and director Wayne Yip on the set of Preacher. (Photo credit: Michele K. Short/AMC/Sony Pictures Television. Copyright © 2017 AMC Network Entertainment LLC and Sony Pictures Television Inc. All Rights Reserved.) The aftermath of Tracy’s (Gianna LePera) self-inflicted headshot, achieved as an on-set makeup effect. (Photo credit: Michele K. Short/AMC/Sony Pictures Television. Copyright © 2017 AMC Network Entertainment LLC and Sony Pictures Television Inc. All Rights Reserved.) Legend 3D had to match the production design for Hell while carrying out any environment augmentations. (Photo credit: Michele K. Short/AMC/Sony Pictures Television. Copyright © 2017 AMC Network Entertainment LLC and Sony Pictures Television Inc. All Rights Reserved.) Director Michael Slovis captures a scene with Tom Thon as The Pope. (Photo credit: Michele K. Short/AMC/Sony Pictures Television. Copyright © 2017 AMC Network Entertainment LLC and Sony Pictures Television Inc. All Rights Reserved.)
“We had to always be one step ahead of the next episode, with assets ready to go, artists at the ready and a rigid-as-possible delivery schedule, while still remaining flexible and able to execute last-minute changes when requested.” —Lisa Sepp-Wilson, Head of VFX Production, Legend 3D
The lights also needed to flicker to give the feel of projected light, as each one was a personal hell.”

Hallways within Hell were commonly extended by Legend 3D. A typical shot would usually start with a foreground greenscreen plate that was matchmoved if necessary and then projected onto cards or geometry – this depended on the individual plate and the amount of parallax. Elements such as shafts of light, flickering lights, dust and ambient particles were commonly added to the environments. Asked if there was one shot that proved to be most challenging in the series, Sepp-Wilson comments that no single scene was harder than the others. “But,” she says, “there were design challenges, and we definitely felt it was a collaborative effort between us and the exec team. We really enjoyed executing some of the full-CG work, as well as some of the smoke and steam shots.”
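The card-versus-geometry call comes down to how much parallax the camera move will reveal: a flat card projection only holds up if points at different depths barely shift relative to each other on screen. A rough screen-space estimate captures the idea — the camera values and the pixel threshold below are illustrative, not production numbers.

```python
def parallax_pixels(baseline, focal_mm, sensor_mm, image_width_px, near_m, far_m):
    """Estimate on-screen parallax (in pixels) between near and far scenery
    for a camera that translates sideways by `baseline` metres.

    A point at depth z shifts by focal * baseline / z on the sensor; the
    difference between the near and far shifts is the parallax that a flat
    card projection cannot reproduce.
    """
    focal_px = focal_mm * image_width_px / sensor_mm
    return focal_px * baseline * (1.0 / near_m - 1.0 / far_m)

# A small dolly past a hallway: if the parallax is only a few pixels,
# projecting the plate onto a flat card is usually good enough.
shift = parallax_pixels(baseline=0.1, focal_mm=35, sensor_mm=24.9,
                        image_width_px=2048, near_m=5.0, far_m=50.0)
use_card = shift < 4.0  # threshold is a per-shot judgment call
```

Here the estimated parallax is tens of pixels, so this hypothetical shot would want real geometry rather than a card.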
ASSEMBLING A TEAM

Since Legend 3D was pivoting to more visual effects work, it needed a tight and talented team. “We had a great workflow here within the Toronto facility, and were able to iterate and turn shots around extremely efficiently as a result,” says Sepp-Wilson. “Each episode took about two weeks to complete, once final shots were turned over and put on the production floor.” The studio worked on seven episodes from season two of Preacher, each time maintaining communication with the
production’s post team made up of the post supervisor, VFX editor and episode editors. “We also kept in touch with the on-set supervisor, Dottie, while they were still shooting the particular episodes we worked on,” she says. “Over the course of the post period I traveled to L.A. often in order to be present for spotting sessions and meeting with editorial.”

ON TIME AND ON BUDGET
Anyone who works in TV visual effects knows that creating imagery on an episodic schedule and budget can be intense. Legend 3D relied on that tight-knit team and scheduling software to manage the process. “We had to always be one step ahead of the next episode,” remarks Sepp-Wilson, “with assets ready to go, artists at the ready and a rigid-as-possible delivery schedule, while still remaining flexible and able to execute last-minute changes when requested.

“We utilized Shotgun to communicate with the EPs,” she adds. “We would post shots for their approval, and they would respond with notes and/or approvals. It was a very efficient process, as they were able to view our shots whenever they chose and wherever they happened to be.”

The result is that Preacher delivers on its promise of a faithful comic book adaptation with plenty of comic and horror elements thrown in, often thanks to the hard work from the effects teams on the show.
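Shotgun exposes this kind of loop through a Python API, but the round trip Sepp-Wilson describes — post a version, collect notes or an approval — can be modeled as a tiny state machine. The sketch below is plain Python, not Shotgun’s actual API, and the shot code is invented for illustration.

```python
class ShotReview:
    """Minimal model of the post-for-approval loop described above."""

    STATUSES = ("pending", "notes", "approved")

    def __init__(self, shot_code):
        self.shot_code = shot_code
        self.status = "pending"   # posted, awaiting EP response
        self.notes = []

    def add_note(self, note):
        # An EP responds with notes: the shot goes back to the artists.
        self.notes.append(note)
        self.status = "notes"

    def approve(self):
        # An EP approves: the shot can be finaled and delivered.
        self.status = "approved"

review = ShotReview("hell_hallway_0040")  # hypothetical shot code
review.add_note("Flicker feels too fast on the far lights.")
review.approve()
```

A production-tracking tool adds the crucial extras on top of this core loop: remote viewing of the posted media, per-note attribution and a full status history per shot.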
LEFT TO RIGHT: Ian Colletti as Arseface. Legend 3D extended hallways for scenes taking place in Hell. (Photo copyright © 2017 AMC Network Entertainment LLC and Sony Pictures Television Inc. All Rights Reserved.) Blood hits were one of Legend 3D’s staples for the series. (Photo copyright © 2017 AMC Network Entertainment LLC and Sony Pictures Television Inc. All Rights Reserved.) Legend 3D worked on this underwater scene in which an armored truck is seen in the bottom of a swamp. (Photo copyright © 2017 AMC Network Entertainment LLC and Sony Pictures Television Inc. All Rights Reserved.) A view into Hell crafted by Legend 3D – the shots needed to convey an endless chasm with ‘pin pricks’ of light. (Photo copyright © 2017 AMC Network Entertainment LLC and Sony Pictures Television Inc. All Rights Reserved.)
FILM
CRAFTING NEW EFFECTS FRONTIERS IN THE LAST JEDI By IAN FAILES
All images copyright © 2017 Lucasfilm Ltd. TOP: Director Rian Johnson sits with the character Chewbacca (played by Joonas Suotamo) on board the set of the Millennium Falcon. BOTTOM: Chewbacca and a porg sit at the controls of the Millennium Falcon. This porg is a digital ILM creation, while others in the film were handled as practical creatures by Neal Scanlan.
Few film releases in 2017 were as anticipated as Rian Johnson’s Star Wars: The Last Jedi, the follow-up to The Force Awakens and the continuation of the stories surrounding the Resistance’s movement against the evil First Order. The movie continues to follow the plights of the scavenger Rey (Daisy Ridley), self-exiled Jedi Master Luke Skywalker (Mark Hamill) and Kylo Ren (Adam Driver), the servant to Supreme Leader Snoke (Andy Serkis) and son of Han Solo (Harrison Ford) and Leia Organa (Carrie Fisher). The Last Jedi also sees, of course, the return of Industrial Light & Magic to deliver the film’s significant visual effects, a diverse mix of CG creatures and characters, droids, space battles, land assaults and epic environments. That work was headed by Visual Effects Supervisor Ben Morris, who oversaw effects carried out in four of ILM’s studios across the world (London, San Francisco, Vancouver and Singapore) and at other studios contracted to ILM. Speaking to VFX Voice prior to the film’s release, Morris ran down the mammoth effort of shepherding a complex array of visual effects for The Last Jedi, including, yes, those adorable porgs.

PLANNING A JEDI JOURNEY
Morris’s first interaction with the director on The Last Jedi occurred while the visual effects supervisor was still in post-production on The Force Awakens. “For me,” says Morris, “it was the opportunity to take the responsibility for the whole film, and it was fantastic to meet with Rian so early on.” Johnson wrote the script and also produced storyboards, both on his own and with a storyboard artist. From those, ILM and The Third Floor split up previsualization duties on the film, which continued throughout production. Morris says Johnson had never done previs before and totally embraced the approach. “It was a new world for him but by the end he was thrilled,” notes Morris. “I think it was one of those things; if as a filmmaker you’ve never been there, you might question why you need [previs], but when you work on a film with so many visual effects and unknowns, it not only helps the director, it actually helps all the other departments as well.

“On most of these big films now,” adds Morris, “we not only seem to be delivering previs for the director and editor but you get every other head of department coming in and saying, ‘Can I see the latest previs? Where are we going with this scene?’ It’s a tool for everyone now.”

The Last Jedi was filmed predominantly at Pinewood Studios, where massive set-pieces were built, and at several locations, including Ireland and Croatia. In the age of bluescreen filmmaking (which was certainly still part of the film’s production), location shoots and filming with practical sets were crucial, Morris believes, as was the use of traditional practical effects. Special Effects Supervisor Chris Corbould handled these, while on-set creature effects from Creature Shop Concept Designer/Creature Shop Head Neal Scanlan were also featured heavily during filming. “I pushed very strongly to try and ground our work in physical locations, going to places, so that you can be on location,” notes Morris.
“You just get a better response and performance, and it offers realistic lighting, which is the biggest challenge nowadays.”
The porgs are a classic example of the mix between practical and digital effects in The Last Jedi. Native to the island of Ahch-To where Rey visits Luke, the porgs are flying birds with high levels of curiosity. They were designed by Scanlan and his team and realized for on-set puppeteering in several practical incarnations. ILM then replicated the porgs digitally. “Rian held us to this very closely,” states Morris. “We were absolutely matching the physical puppets that Neal created. We would sit our CG porg side by side with the practical one and look into the intricate details. Rian was very keen that the audience wouldn’t know which was which. “Rian also believes in the school of animation that less is always more,” continues Morris. “He would frequently say, ‘No, no, guys, just because you can doesn’t mean you should.’ He was very sensitive to cartoony animation – well, ‘cartoony’ was his phrase for it – but it just meant animation that went beyond the bounds of what he thought was necessary. That was a great guidance and he never changed his opinion on that.”
“[Director] Rian [Johnson] very much wanted to shoot Andy [Serkis] as Andy. Once he understood that we could take whatever Andy did and make it successfully translate into Snoke, he really didn’t need to see Snoke on the day. That’s something that quite a lot of directors do — once they have that confidence, they can make that abstraction.” —Ben Morris, Visual Effects Supervisor, ILM TOP: Daisy Ridley as Rey on the Skellig Islands, standing in for Ahch-To, with director Rian Johnson and crew. BOTTOM: The dreaded AT-M6 walkers, along with Kylo Ren’s shuttle. ILM referenced the original stop-motion AT-AT walkers from The Empire Strikes Back but infused these new vehicles with a more menacing feel.
WINTER 2018 VFXVOICE.COM • 59
11/22/17 4:57 PM
FILM
CRAFTING NEW EFFECTS FRONTIERS IN THE LAST JEDI By IAN FAILES
All images copyright © 2017 Lucasfilm Ltd. TOP: Director Rian Johnson sits with the character Chewbacca (played by Joonas Suotamo) on board the set of the Millennium Falcon. BOTTOM: Chewbacca and a porg sit at the controls of the Millennium Falcon. This porg is a digital ILM creation, while others in the film were handled as practical creatures by Neal Scanlan.
Few film releases in 2017 were as anticipated as Rian Johnson’s Star Wars: The Last Jedi, the follow-up to The Force Awakens and the continuation of the stories surrounding the Resistance’s movement against the evil First Order. The movie continues to follow the plights of the scavenger Rey (Daisy Ridley), self-exiled Jedi Master Luke Skywalker (Mark Hamill) and Kylo Ren (Adam Driver), the servant to Supreme Leader Snoke (Andy Serkis) and son of Han Solo (Harrison Ford) and Leia Organa (Carrie Fisher). The Last Jedi also sees, of course, the return of Industrial Light & Magic to deliver the film’s significant visual effects: a diverse mix of CG creatures and characters, droids, space battles, land assaults and epic environments. That work was headed by Visual Effects Supervisor Ben Morris, who oversaw effects carried out in four of ILM’s studios across the world (London, San Francisco, Vancouver and Singapore) and at other studios contracted to ILM. Speaking to VFX Voice prior to the film’s release, Morris ran down the mammoth effort of shepherding a complex array of visual effects for The Last Jedi, including, yes, those adorable porgs.
PLANNING A JEDI JOURNEY
Morris’s first interaction with the director on The Last Jedi occurred while the visual effects supervisor was still in post-production on The Force Awakens. “For me,” says Morris, “it was the opportunity to take the responsibility for the whole film, and it was fantastic to meet with Rian so early on.” Johnson wrote the script and also produced storyboards, both on his own and with a storyboard artist. From those, ILM and The Third Floor split up previsualization duties on the film, which continued throughout production. Morris says Johnson had never done previs before and totally embraced the approach. “It was a new world for him, but by the end he was thrilled,” notes Morris. “I think it was one of those things; if as a filmmaker you’ve never been there, you might question why you need [previs], but when you work on a film with so many visual effects and unknowns, it not only helps the director, it actually helps all the other departments as well.
“On most of these big films now,” adds Morris, “we not only seem to be delivering previs for the director and editor, but you get every other head of department coming in and saying, ‘Can I see the latest previs? Where are we going with this scene?’ It’s a tool for everyone now.”
The Last Jedi was filmed predominantly at Pinewood Studios, where massive set-pieces were built, and at several locations, including Ireland and Croatia. In the age of bluescreen filmmaking (which was certainly still part of the film’s production), location shoots and filming with practical sets were crucial, Morris believes, as was the use of traditional practical effects. Special Effects Supervisor Chris Corbould handled these, while on-set creature effects from Creature Shop Concept Designer and Head Neal Scanlan were also featured heavily during filming. “I pushed very strongly to try and ground our work in physical locations, going to places, so that you can be on location,” notes Morris. “You just get a better response and performance, and it offers realistic lighting, which is the biggest challenge nowadays.”
A PORG’S STORY
The porgs are a classic example of the mix of practical and digital effects in The Last Jedi. Native to the island of Ahch-To, where Rey visits Luke, the porgs are highly curious flying birds. They were designed by Scanlan and his team and realized for on-set puppeteering in several practical incarnations. ILM then replicated the porgs digitally. “Rian held us to this very closely,” states Morris. “We were absolutely matching the physical puppets that Neal created. We would sit our CG porg side by side with the practical one and look into the intricate details. Rian was very keen that the audience wouldn’t know which was which.
“Rian also believes in the school of animation that less is always more,” continues Morris. “He would frequently say, ‘No, no, guys, just because you can doesn’t mean you should.’ He was very sensitive to cartoony animation – well, ‘cartoony’ was his phrase for it – but it just meant animation that went beyond the bounds of what he thought was necessary. That was great guidance, and he never changed his opinion on that.”
“[Director] Rian [Johnson] very much wanted to shoot Andy [Serkis] as Andy. Once he understood that we could take whatever Andy did and make it successfully translate into Snoke, he really didn’t need to see Snoke on the day. That’s something that quite a lot of directors do — once they have that confidence, they can make that abstraction.” —Ben Morris, Visual Effects Supervisor, ILM
TOP: Daisy Ridley as Rey on the Skellig Islands, standing in for Ahch-To, with director Rian Johnson and crew.
BOTTOM: The dreaded AT-M6 walkers, along with Kylo Ren’s shuttle. ILM referenced the original stop-motion AT-AT walkers from The Empire Strikes Back but infused these new vehicles with a more menacing feel.
THE RETURN OF SNOKE
Another digital character ILM contributed was Supreme Leader Snoke. In The Force Awakens, Snoke appeared only in hologram form, as a 25-foot-high projection. This time the audience gets right up close to Snoke – an early plan from Johnson, who turned to Morris to confirm it would be possible. “My answer was, ‘Absolutely yes’,” recalls Morris. “I thought it would be thrilling to bring him into the physical world. Snoke has been back in the ‘body shop’ and had a complete re-build for this film.” Andy Serkis performed the character on set in an active LED motion-capture suit with a head-mounted camera. Interestingly, although ILM’s performance-capture system enables real-time facial and body-animation playback on set, it was not used for Snoke. “One reason is that Rian very much wanted to shoot Andy as Andy,” says Morris. “Once he understood that we could take whatever Andy did and make it successfully translate into Snoke, he really didn’t need to see Snoke on the day. That’s something that quite a lot of directors do – once they have that confidence, they can make that abstraction.”
TOP TO BOTTOM: Resistance ski speeders race across the salt plain surface of the planet Crait. Resistance bombers engage in battle with the First Order fleet, one of a number of space battles appearing in The Last Jedi. BB-8 makes a re-appearance during a spaceflight. The popular droid was achieved as both a practical and digital effect.
“Steve Yedlin, ASC, the Director of Photography, used a dome shooting environment and an LED light from ARRI called a SkyPanel, which was very controllable. We had layers of silks that gave us a soft ambient look. We could also control laser flashes on iPads. It was like, ‘Green laser! Green laser! Explosion!’ And you could have the light traveling past the cockpit. It really gave us the benefit of having that interactive lighting.” —Ben Morris, Visual Effects Supervisor, ILM
ALIEN WORLDS
Star Wars films allow audiences to visit strange alien landscapes, and The Last Jedi contains several incredible places. One is a salt plain on the planet Crait where Resistance ski speeders square off against AT-M6 walkers. As the speeders scoot over the salt surface, they reveal the red crystal material beneath. “It’s a wonderful and exciting scene,” says Morris. “It’s an example of Rian having a very clear idea of what he wanted before even a picture was painted. Red is a visual theme in the film. Rian was very interested in the idea of a pure white salt flat and the concept of a red crystal material being revealed through the action of skating over the top or crashing into it.” The production traveled to a real-world salt flat and filmed plates, using them as the basis for a digital environment in which to place the action. Part of that scene also involved animating the AT-M6 walkers, reminiscent of the much-loved stop-motion AT-ATs first seen in The Empire Strikes Back. The updated walkers are ‘chunkier’ and ‘meaner’, suggests Morris, who found approaching the movement of the First Order vehicles surprisingly challenging. “We did use the original stop-motion as the start and foundation of the animation,” Morris says, “but what you actually find is that your nostalgia gives you a different vision in your head of how incredibly smooth those movements were. While they blew me away as a kid, when you go back and watch them you think, ‘OK, we can use that as the basis and enhance it and give it a more massive scale.’”
CLASSIC STAR WARS
The Last Jedi would not be a Star Wars film without a good space battle or two. Johnson tackled these early on in thumbnail sketches that were fleshed out with storyboards and previs. Says Morris: “We went that route simply because there was such a huge body of work to complete for this film. [Battle scenes] are the kinds of shots that are entirely virtual – you can actually get ahead of the game. You can start earlier, rather than waiting until live-action elements are available.” ILM crafted digital ships and many, many digital explosions, sometimes mixed with practical pyro elements. Cockpit shots were still necessary; they were filmed practically on partial sets with a focus on interactive lighting. “Steve Yedlin, ASC, the Director of Photography, used a dome shooting environment and an LED light from ARRI called a SkyPanel, which was very controllable,” explains Morris. “We had layers of silks that gave us a soft ambient look. We could also control laser flashes on iPads. It was like, ‘Green laser! Green laser! Explosion!’ You could have the light traveling past the cockpit. It gave us the benefit of having that interactive lighting.” This approach, says Morris, was just another example of the useful combination of practical and digital in delivering the most exciting imagery possible for The Last Jedi. “Rian Johnson is a filmmaker who wanted to shoot as practically as he could,” Morris concludes. “The real magic can occur and does occur when you gather 200 people on a main unit at a filming location. You have all the actors there, and you give them as much physical set as you can, as much visual stimulus as you can – pyro, atmospherics, creatures walking around them. And we just know that we can always enhance and augment that digitally if we need to.”
TOP: Luke Skywalker receives his lightsaber from Rey. Mark Hamill performed the role with strategically positioned green tracking markers around his fingers, with ILM delivering the mechanical hand in CG.
BOTTOM: Finn (John Boyega) battles Captain Phasma (Gwendoline Christie). Phasma’s metallic suit was occasionally augmented by ILM to add distinctive reflections.
“The real magic can occur and does occur when you gather 200 people on a main unit to a filming location. You have all the actors there, and you give them as much physical set as you can, as much visual stimulus as you can – pyro, atmospherics, creatures walking around them. And we just know that we can always enhance and augment that digitally if we need to.” —Ben Morris, Visual Effects Supervisor, ILM
FILM
STAR WARS: A FORCE FOR INNOVATION By IAN FAILES
When George Lucas formed Industrial Light & Magic (ILM) in 1975, the director’s aim was for the company to deliver the ambitious visual effects for his space opera, Star Wars. Of course, ILM went on to revolutionize the way effects and story meet in that film, across nine other Star Wars saga movies, and in scores of other releases. Many of the innovations in visual effects that ILM has developed for the various Star Wars films shaped the industry, and they continue to do so. VFX Voice takes a look back at just some of these leaps and bounds over the past 40 years.
A NEW KIND OF MOTION CONTROL
Lucas’s vision for Star Wars (later retitled Star Wars: Episode IV – A New Hope) included a number of elaborate spaceship shots and even frenetic ‘dogfights’ in space. That led Visual Effects Supervisor John Dykstra, whom Lucas brought on board early, to develop a computer-controlled motion-control camera system that could film complex miniature spaceship battle scenes against bluescreen, to be then optically composited into starfields and other backgrounds. The camera system was dubbed the ‘Dykstraflex.’ Previous space scenes with miniature photography had largely been achieved with a locked-off camera, but the Dykstraflex – a system of stepper motors and a track-and-boom set-up – allowed for a new fluidity of movement, as if a real camera operator had been capturing the action. Combined with a proprietary optical printer, the Dykstraflex and subsequent motion-control systems built by ILM quickly made it a powerhouse visual effects studio.
All images copyright © 2017 Lucasfilm Ltd. TOP: The camera system dubbed the ‘Dykstraflex,’ seen here suspended over a miniature Death Star trench for one of the bombing run shots in Star Wars: Episode IV – A New Hope (1977), the first Star Wars movie. BOTTOM: In VistaVision, a traditional 35mm negative is turned on its side, allowing a larger negative area to be used for higher resolution. The format had become almost dormant until ILM resurrected it.
HIGH-QUALITY EFFECTS
The visual effects photography for A New Hope was captured on film, and it would go through several layers of duplication, compositing and finishing before being output back onto a film print. To capture its effects plates in the highest quality possible, ILM chose the VistaVision format, in which a traditional 35mm negative is turned on its side, allowing a larger negative area to be used for higher resolution. VistaVision had largely become a dormant format until ILM resurrected it. In addition to allowing for higher quality, VistaVision had the benefit of using standard 35mm film stock, which meant the negative could be processed at the same cost as regular film. ILM continued using VistaVision, and adapting it to fit its various custom motion-control cameras, into the late 1980s.
EMOTION WITH STOP-MOTION
From the moving holographic chess game on the Millennium Falcon in A New Hope, to the snow-walking AT-ATs and Tauntauns of The Empire Strikes Back, and the horrific Rancor of Return of the Jedi, many of Star Wars’ most memorable characters were achieved with stop-motion miniature puppets. Visual Effects Supervisor Phil Tippett, VES, led the way with creature performances, and ILM eventually combined its approach to motion-control photography with stop-motion to allow for more natural camera movement and more fluid animation. Tippett’s stop-motion mastery continued to influence ILM even into the visual effects for Jurassic Park: for that film, the full-motion dinosaurs would ultimately be achieved in CG, but the animators also made use of a stop-motion armature device to input animation.
TOP LEFT: Filming of a miniature pod race arena for The Phantom Menace. TOP RIGHT: Visual Effects Supervisor Phil Tippett, VES, led the way with these creature performances. Tippett’s stop-motion mastery continued to influence ILM even into the visual effects for Jurassic Park. BOTTOM LEFT: This matte painting on glass under construction by Frank Ordaz was for a hangar establishing shot in Return of the Jedi. Live-action photography would be projected into a specific area of the glass left blank to produce the final shot. BOTTOM RIGHT: Lucasfilm’s Computer Division delivered a wire-frame holographic model of the Death Star under construction in Return of the Jedi.
THE ILM MODEL SHOP
Although it no longer exists today, the famed Model Shop was for decades a critical component of ILM, responsible for the design and construction of models and miniatures, creature and puppet effects, camera systems and rigs, special effects elements and other practical sides of visual effects. The Model Shop was spun off from ILM in 2006, but will always be remembered as the leader in the field.
THE ART OF MATTE PAINTING
Matte painting has been an effects art form dating back to the
earliest days of cinema, and it was used on many films prior to A New Hope. But ILM significantly revitalized the art of matte painting, using it in copious numbers of shots to render strange alien worlds, space environments and many other, often invisible, settings. Until digital techniques took over, matte paintings were usually painted on glass or masonite with an array of paints. Matte painting done digitally is still called matte painting, but it now invariably includes 3D geometry, projection mapping and often more complex camera movements. ILM continued to innovate in these areas, too, as artists began to replace traditional methods.
THE MAGIC OF CG
Having already advanced the art in motion control, miniatures, matte paintings and optical compositing, ILM would also ultimately become a leader in using computer graphics for visual effects. It began, though, with Lucasfilm’s Computer Division, which, among other projects, delivered a wire-frame holographic model of the Death Star under construction in Return of the Jedi. (The graphics part of the division ultimately became Pixar.) By the time The Phantom Menace came around, ILM was fully engaged in CG, orchestrating shows with nearly 2,000 digital visual effects shots – including fully CG characters – although that film and the other prequels still made significant use of practical effects and miniatures. The Star Wars films have often showcased the collaborative nature of ILM’s work in combining real and digital.
VIRTUAL CHARACTERS
Jar Jar Binks, performed by Ahmed Best, was one of the first fully CG main characters to appear in a feature film when he made his debut in The Phantom Menace in 1999. Binks came to life through Best’s on-set performance, additional motion capture and animation by ILM. Many more CG characters would be rendered by ILM for the remaining prequels and into the latest Star Wars releases, where the company’s facial-capture and digital-double technologies continued to advance. For Rogue One, ILM resurrected the late actor Peter Cushing to play the role of Governor Tarkin by having a look-alike actor perform on set and then generating a completely photoreal version of Cushing as he appeared in A New Hope. (A younger version of Carrie Fisher as Princess Leia was also crafted.)
TOP TO BOTTOM: Jar Jar Binks, performed by Ahmed Best, was one of the first fully CG main characters to appear in a feature film when he made his debut in The Phantom Menace in 1999. Binks came to life through Best’s on-set performance, additional motion capture and animation by ILM. ILM was also able to capitalize on advancements in projection mapping for the Star Wars prequels, which proved incredibly helpful for the massive desert landscapes of the pod race in The Phantom Menace. Here, artists took photographs of miniature rock formations from multiple angles. Attack of the Clones became the first major motion picture to be captured completely on digital video using a Sony and Panavision 24-frame camera (the HDW-F900). The technology would enable faster set-ups and a completely digital workflow through to visual effects.
64 • VFXVOICE.COM WINTER 2018
PROJECTION MAPPING
With new visual effects techniques tried and tested on several other projects, ILM was also able to capitalize on advancements in projection mapping for the Star Wars prequels. The technique proved incredibly helpful for the massive desert landscapes of the pod race in The Phantom Menace. Artists took photographs of miniature rock formations from multiple angles. The textures from the photographs were then ‘re-projected’ onto proxy geometry of the same formations. It meant that the camera could move around and zoom past these environments while ensuring they remained photoreal.
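The ‘re-projection’ described here is, at its core, a camera projection: each vertex of the proxy geometry is pushed through a model of the camera that took the photograph to find which pixel of the photo it should sample. A minimal sketch of that idea, using a simple pinhole camera (illustrative only – production tools also handle lens distortion, occlusion and per-pixel lookups):

```python
def project_point(point_3d, focal_length, image_size):
    """Project a 3D point (camera space, z > 0) to pixel coordinates
    using a simple pinhole camera model.

    The returned (u, v) is where this vertex samples color in the
    photograph; storing it as a texture coordinate 're-projects' the
    photo onto the proxy geometry.
    """
    x, y, z = point_3d
    # Perspective divide, then shift so the optical axis hits image center.
    u = focal_length * x / z + image_size[0] / 2.0
    v = focal_length * y / z + image_size[1] / 2.0
    return u, v

# A vertex on the proxy geometry, seen from the photo camera's viewpoint
# (all numbers here are made up for illustration):
u, v = project_point((1.0, 0.5, 10.0), focal_length=1000.0, image_size=(2048, 1556))
```

Once every vertex carries photo-derived texture coordinates like this, a new virtual camera can move around or zoom past the environment while the surfaces keep their photographic detail.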
During production on The Phantom Menace, Lucas experimented with digital video by filming a very small portion of the film this way. The next Star Wars film, Attack of the Clones, became the first major motion picture to be captured completely on digital video using a Sony and Panavision 24-frame camera (the HDW-F900). The technology would enable faster set-ups and a completely digital workflow through to visual effects.

INTO THE VIRTUAL WORLD
The Star Wars films have been catalysts for change in both analog and digital technologies, and now, most recently, in virtual immersive techniques. A few years ago, Lucasfilm launched ILMxLAB, which stands alongside ILM in creating VR, AR and mixed-reality experiences, many of which have been Star Wars-related. On Rogue One, too, ILM developed a virtual camera system that combined motion capture and tracking with an existing virtual production workflow, allowing director Gareth Edwards to plan and stage shots on the fly with just a hand-held controller. The result was real-time feedback for constructing much more dynamic shots in the final film.
LEFT TO RIGHT: ILM also developed a virtual camera system on Rogue One that combined motion capture and tracking with an existing virtual production workflow, allowing director Gareth Edwards to plan and stage shots on the fly with just a hand-held controller. The result was real-time feedback for constructing much more dynamic shots in the final film. Phil Tippett, VES, manipulates a miniature AT-AT for a Hoth battle stop-motion scene in The Empire Strikes Back. Jar Jar Binks from The Phantom Menace. Soon many CG characters would permeate the Star Wars saga of films – this had even begun with the ‘Special Edition’ releases, in which Lucas added and adjusted several scenes in the original Star Wars trilogy. On set, Ahmed Best played Jar Jar Binks in partial prosthetics and a head-piece that was elongated to allow for proper eyelines. By the time The Force Awakens was released, ILM had established such a strong photoreal pipeline that scenes like this could be completely CG. However, they were often still informed by the original miniatures and location shoots.
PROFILE
ENGINEERING MOVIE MAGIC WITH CHRIS CORBOULD By KEVIN H. MARTIN
For over 40 years, Chris Corbould has been involved in delivering a very hands-on version of effects for some of the most spectacular action movies of all time. His entry into the field came while he was still in his teens during a summer stint on the rock musical Tommy, for which his uncle Colin Chilvers (later an Oscar- and BAFTA-winner for Superman) was doing floor effects. “After that, I realized special effects was something I wanted to do all the time,” Corbould explains, noting he soon landed a gig at Effects Associates, based out of the UK’s Pinewood Studios. “Once there, I was apprenticed to a number of veterans and set to various tasks. I might be shaping something on a lathe one day, then helping build a specially engineered rig the next.” During his tenure at Effects Associates, Corbould got firsthand exposure to many venerable old-school effects gags. “I learned the tricks for producing various atmospherics like rain and snow, and later on found out about the pyrotechnic side of things. My early career was one big learning curve, and all that provided knowledge that put me in good stead later. After 10 years or so, when you’ve worked with enough people to figure out the right and the wrong way to do things, you feel secure in starting to build on those established methods.” Corbould was a trainee on 1977’s The Spy Who Loved Me, devising “standalone pieces, like ski poles that turn into guns.” With only one exception, he has worked on every Eon-made James Bond feature since, learning from miniature and physical effects maestros Derek Meddings and John Richardson, VES. “On Moonraker there were lots of interesting jobs for John,” he recalls. “Bond’s boat required the addition of a hang glider that deployed from the roof, plus there were mines that launched out the back. 
Decades later, when I attended the London Museum ‘Bond in Motion’ exhibit, I found it rather funny that in looking at this boat again, I could see all these pieces of welding I’d done on it.”

AIRBORNE ADVENTURES
Down through the years, Corbould’s Bond assignments were a varied and ambitious bunch, including converting Aston Martins and Jaguars to four-wheel drive for Die Another Day’s ice chase and even articulating a four-story sinking house for the Venice-set climax of Casino Royale, which required a 100-ton rig. But during the ‘80s – a period when the Bond budgets, which had doubled on Moonraker, were more closely managed – he helped implement a number of ingenious yet relatively inexpensive solutions. For Your Eyes Only opens with Bond having to commandeer a runaway

TOP: Corbould orchestrated the sinking of a four-story house for the Venice-set climax in Casino Royale. A model of the Venice house is seen in the background. (Photo copyright © 2006 Danjaq LLC and United Artists Film Corporation. All rights reserved.) BOTTOM: Corbould on the set of Skyfall. (Photo courtesy of Chris Corbould. Copyright © 2012 Danjaq S.A., MGM and United Artists Film Corporation. All rights reserved.) OPPOSITE: Corbould with Lara Croft’s training robot S.I.M.O.N. from the original Lara Croft: Tomb Raider. The robot was an early example of how computer-generated VFX embellished what Corbould accomplished practically on stage with the articulated mockup. (Photo copyright © 2001 Paramount Pictures. All rights reserved.)
helicopter flying through a partially constructed building. “We built the copter to operate from a hydraulic cantilevered arm,” Corbould reveals, “and that attached to a track, so we could give the appearance that Bond was actually within a full aircraft at this location. “Then there was a whole separate issue for the follow-up,” Corbould continues, “when he impales a wheelchair-bound villain with the helicopter’s landing skid before dropping him down a chimney, all of which was also done practically and full-size. That required us to make a wheelchair that complied with all CAA [Civil Aviation Authority] regulations and specifications. By the time we were done, they could have legally flown that copter and wheelchair anywhere round the whole world!” As is often the case with the most reliable cinematic collaborators, Corbould believes that a good script has to serve as the primary motivator. “You always read it thinking about how you can make it better,” he notes, “but that read can really inform your ideas; they have to spring from that core.” His first supervisory assignment, on the Pierce Brosnan 007 debut GoldenEye, involved just such an embellishment in response to a director’s inquiry, which turned a planned motorcycle chase in Russia into a full-size tank careening through Moscow.

TOP TO BOTTOM: Corbould did welding on John Richardson’s boat in Moonraker, learning miniature and physical effects from maestros Derek Meddings and John Richardson, VES. Here is the boat at the National Motor Museum in the UK as part of the traveling “Bond in Motion” exhibition. (Photo copyright © 1979 United Artists. All rights reserved.) Timothy Dalton’s second time out as 007 in Licence to Kill climaxes with a sustained tanker truck chase involving massive conflagrations. 
While some of the pyrotechnics look like they were captured on the fly, almost documentary-style, in actuality Corbould prepped extensively, with controls in place to both stop and reset the careening vehicles. (Photo copyright © 1989 Danjaq, MGM and United Artists. All rights reserved.) Corbould has been responsible for many a mighty fireball, climaxing with a Spectre pyrotechnic event when Bond decimates Blofeld’s stronghold with a single well-placed shot. (Photo copyright © 2015 Danjaq, MGM, Columbia Pictures and Sony Pictures Entertainment. All rights reserved.)
Achieved partly on location, with the rest staged back in the UK, the tank chase was one of the more memorable set-pieces of the series, and it shows how, through careful planning, the impact of location work can be maximized without spending unduly on travel. “For A View to a Kill, we again did most of the finale at Pinewood as a blimp hits the Golden Gate Bridge,” he notes. “We only needed a few establishing shots on location, having built the whole underbelly of the vessel out of steel and hung it off a huge crane, so we were able to do this with full studio controls in place.”

VEHICULAR PYROMANIA
For 1989’s Licence to Kill, Corbould was charged with coordinating one of the most elaborate and sustained chases in the long-running Bond series, involving a number of tanker trucks, nearly all of which wind up exploding as Bond carries out an uncharacteristically personal vengeance on his quarry. “Initially, just finding the trailers and the tankers to shoot in Mexico was a problem,” Corbould admits. “We eventually had eight trailers and six tractor units, which had to be modified so they looked alike. There was also the matter of enabling stunt guys to do what they needed. That included creating a special release mechanism on one vehicle that permitted the driver to exit and get away to safety before a particularly large explosion.” At one point, Bond cuts a fully-loaded tanker loose from its cab, sending it careening down the side of a hill to slam into another tanker, causing them both to erupt in flames. When viewing the mayhem, one might conclude that the shot was executed with a ‘hope for the best’ freehand style, but in truth, Corbould carefully prepared and controlled every aspect of the tanker’s bumpy descent. “There was a link between the truck going up the hill and the trailer coming down it,” he reveals. “We had to sink 20 tons of concrete before attaching a pulley at the top of the hill, which ensured we could stop a move or reset as needed.” Corbould has been responsible for many a mighty fireball, climaxing with a Spectre pyrotechnic event that went right into the Guinness Book of World Records, when Bond decimates Blofeld’s stronghold with a single well-placed shot. “Again, it’s very much a matter of figuring out every aspect in advance,” he acknowledges. “We’ll do five or 10 or sometimes even 20 tests beforehand. 
Seventy-five percent of our job is testing, not just to make sure we get the look right – a factor that is also dependent on weather, as the biggest blasts look more colorful when shot in overcast conditions – but also to ensure it will all come off safely.” Corbould fulfilled a longstanding ambition with the Joker’s casual destruction of a hospital on Christopher Nolan’s The Dark Knight. “We had a very narrow window in which to shoot that one,” he says. “There was an industrial railway line 200 yards behind the hospital, and under no circumstances were they going to stop their trains, so we couldn’t wait for ideal weather conditions. Even so, I’d always wanted to blow up a building, and Chris afforded me that luxury.” He states that when on private property, testing with seismic monitors is essential, as is providing experts who can examine the data. “I can tell them that there are buildings 150 yards away and get a reliable answer about whether that will cause
TOP: The truck flip in The Dark Knight took place on a street in Chicago’s banking district that was barely the width of the trailer’s length. Operating within the confines of that narrow street, Corbould managed to flip a truck completely over without damaging any of the nearby bank buildings. (Photo copyright © 2008 Warner Bros. Pictures. All rights reserved.)
TOP: Christopher Nolan’s Inception put a spin on a hotel hallway battle. Corbould worked with engineers to devise a huge set capable of rotating at up to 6 rpm while still accommodating a remotely operated track-mounted camera. (Photo copyright © 2010 Warner Bros. Pictures. All rights reserved.) BOTTOM: At the behest of GoldenEye director Martin Campbell, Corbould upgraded a proposed motorcycle chase in Moscow to feature 007 commandeering a tank. Detailed storyboarding allowed production to shoot only a fraction of the sequence in Russia, with the rest filmed in the UK. (Photo copyright © 1995 MGM and United Artists. All rights reserved.)
a problem. You can then tailor the blast to the specific limitations of a location, which means addressing environmental concerns; if you blow something up near water, you can’t use materials that will pollute.”

BOYS WITH TOYS
Prefacing further discussion of his work on Nolan’s films, Corbould declares his admiration for certain types of directors. “I really like working with those who know how to balance the various aspects of effects work,” he states. “There’s one mindset to do as much as possible in post, and while I admire the incredible things that can be achieved through visual effects, if you can use them in conjunction with a live effect, that can truly produce some very memorable, incredible results. With the Bond films and with Chris, I think there is a ‘Wow!’ factor that really comes through with big stunts and in-camera effects. And now, not having to hide safety cables, knowing they can be painted out, makes the work even safer while upping the very ambitious look and scope.” He cites Tomb Raider’s training robot as an early example of how
70 • VFXVOICE.COM WINTER 2018
computer-generated VFX embellished what he could accomplish practically on stage with the articulated mockup, while also admitting there was a certain familiarity in aspects of the bot’s design, which clearly belies any thought that there could be a man inside providing movement. “It’s great fun to make up a robot. That one calls back in some ways to the robot I worked on [with Colin Chilvers] on Saturn 3,” he chuckles. “It seems like every director and production designer has an idea about what the machine should look like and how it should behave, and CGI helped us change the equation a bit on Tomb Raider.”

After meeting Nolan during prep for Batman Begins, Corbould was invited to view a model of the proposed new Batmobile, fashioned kitbash-style by Production Designer Nathan Crowley. “Chris asked, ‘Can you build that full-size?’ I said that I could, yeah, but there would have to be some compromises made. There was no front axle, so I had to change that, and then there was the challenge of getting it traveling 50 miles per hour. In the end, my guys got it up to 100 [mph] – and when the vehicle did its big jumps through space, it could land intact, allowing us to stage these maneuvers 30 times safely and without damage. It was only due to sheer determination on the part of my crew that it wound up so solidly built. I was very proud of what we got on camera, and to be honest, Chris seemed quite over the moon about the results.”

He was initially less than optimistic about fulfilling Nolan’s intent for the truck flip in The Dark Knight, owing to the location selected. “Chris chose a street in Chicago that was located in the banking district and it was barely the width of this trailer’s length, so we had to be 100% sure it was going to go exactly as planned, and not flip or skid sideways into any of these buildings. The day after we flipped the truck, Chris was upset because somebody got a picture of it happening. 
And I told him I was thrilled about that, because people would probably want to see the movie more after knowing that this kind of thing was done for real.” Another massive engineering challenge lay ahead for Corbould with his Oscar-winning turn on the director’s Inception. The fight
TOP: Beginning with Skyfall, Corbould added the title of 2nd unit director to his many skills, shooting the helicopter attack on Skyfall manor. Coordinating with VFX Supervisor Steve Begg, the on-screen excitement was a mix of full-size practical gags and miniature pyrotechnics involving large-scale copter and auto models. (Photo copyright © 2012 Danjaq S.A., MGM and United Artists Film Corporation. All rights reserved.) BOTTOM: A blimp hits San Francisco’s Golden Gate Bridge in View to a Kill. After establishing shots for the finale were done atop the bridge, Corbould and crew built a full underbelly for the villain’s blimp at Pinewood Studios, then suspended it from a huge crane above a partial bridge mockup. (Photo copyright © 1985 MGM/UA Entertainment Co. All rights reserved.)
PROFILE
TOP: Corbould realized a longtime goal – demolishing an entire building – for a scene when the Joker blows up a hospital in The Dark Knight. (Photo copyright © 2008 Warner Bros. Pictures. All rights reserved.) BOTTOM: Corbould shooting the helicopter attack on Skyfall manor. Coordinating with VFX Supervisor Steve Begg, the on-screen excitement was a mix of full-size practical gags and miniature pyrotechnics involving large-scale copter and auto models. (Photo copyright © 2012 Danjaq S.A., MGM and United Artists Film Corporation. All rights reserved.)
sequence in which gravity becomes fluid within a hotel room hallway looks marvelously photorealistic, owing in no small part to Corbould’s efforts to allow the filmmakers to achieve nearly all of the sequence in-camera. “The action takes place in a 10-foot-wide corridor running 30 feet in length,” he reports. “It required a close collaboration with electricians, using slip rings to power the apparatus, which turned at 6 rpm.” While that might sound like a modest rate of speed, it was anything but. “We found that the actors, including Joseph Gordon-Levitt, who was completely and utterly committed to pulling this sequence off, had to stop acting, because at that velocity, it seemed to become all about self-preservation!” Corbould also had to provide a means for the camera to negotiate the set in a sufficiently dynamic manner. “Chris and [Director of Photography] Wally Pfister [ASC] had requirements for the camera to move as well, so we mounted a track beneath the floor that allowed it to run remotely as everything got spun round.”

He feels strongly that his crew of regulars, some of whom have been with him for 20 years, are in many ways “the real stars.” With the Bond films requiring effects teams ranging from 70 to 120 crewmembers to complete the most ambitious sequences, Corbould relies on key team members. On Spectre, Lead Workshop Supervisor Kevin Herd helped Corbould tackle another building collapse, seen during the film’s pre-title sequence, while Senior Effects Technician Dan Homewood and Effects Designer Jason Leinster oversaw the construction effort for an underground train crash in Skyfall. That required a pair of carriages to be attached to an overhead monorail track spanning most of Pinewood’s 007 stage, with cables linking them to a tractor that brought the train up to speed for its crash through the expansive set.

Beginning with Skyfall, Corbould added the title of 2nd unit
director to his many skills. “I’ve wanted to try this for a while, and Sam Mendes trusted me enough to let me shoot the helicopter attack on Skyfall manor along with various blowings up. To have control over filming these action scenes is something I quite honestly adore, I just love it! Achieving the vision of the first unit director, and hearing him say, ‘That’s exactly what I was looking for!’ tells me that I’ve done my job.” The Mendes/Corbould teaming was repeated on Spectre, and Corbould will be directing second unit on Disney’s 2018 Christopher Robin project [helmed by Quantum of Solace director Marc Forster]. Beyond that, he is also slated to handle these duties on producer Barbara Broccoli’s next non-Bond film, The Rhythm Section.

“Back in the early days of CGI, people said everything would go that route, leaving us physical effects guys with only a few years left,” says Corbould. “But in the interim it seems we found a happy balance between special effects and CG. The ultimate goal remains to make something spectacular, so I believe in using whatever combination of tools works best to create that illusion. If that means CG for one particular aspect and practical for another, then by all means use them in concert, use them all.”
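For the technically curious, a back-of-envelope calculation suggests why 6 rpm felt anything but modest inside the corridor set. The geometry here is assumed, not taken from the production: rotation about the corridor’s long axis, with the walls roughly 5 feet (half the stated 10-foot width) from the center line.

```python
import math

# Rough physics sketch for a rotating corridor set.
# Assumed geometry (not production figures): rotation about the
# corridor's long axis, walls about 5 ft from the center.
RPM = 6.0
RADIUS_M = 5 * 0.3048  # 5 ft converted to meters

omega = RPM * 2 * math.pi / 60   # angular speed, rad/s
period_s = 60 / RPM              # seconds per full rotation
wall_speed = omega * RADIUS_M    # tangential speed at the wall, m/s

print(f"one full rotation every {period_s:.0f} s")
print(f"'down' migrates from floor to wall every {period_s / 4:.1f} s")
print(f"wall speed roughly {wall_speed:.2f} m/s")
```

Even at a wall speed under one meter per second, the entire sense of “down” completes a full circuit every 10 seconds, swapping floor for wall every 2.5 seconds, which goes some way toward explaining the self-preservation instinct Corbould describes.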
LEFT TO RIGHT: Corbould with his Batmobile for 2005’s Batman Begins. Apart from the engine and gearbox, everything was built from scratch. The Tumbler weighed 2.5 tons and could reach speeds in excess of 60 mph. It did big jumps through space and landed intact, allowing crew to stage maneuvers 30 times safely and without damage. (Photo copyright © 2005 Warner Bros. Pictures. All rights reserved.) Corbould’s beast of a Batmobile in Batman Begins. (Photo copyright © 2005 Warner Bros. Pictures. All rights reserved.) In For Your Eyes Only, Corbould enables Bond to impale a wheelchair-bound villain with a helicopter landing skid. “That required us to make a wheelchair that complied with CAA regulations,” says Corbould. “They could have legally flown that wheelchair round the whole world!” Shooting the underground train crash in Skyfall required a pair of carriages to be attached to an overhead monorail track spanning most of Pinewood’s 007 stage, with cables linking them to a tractor that brought the train up to speed for its crash through the expansive set. (Photo copyright © 2012 Danjaq S.A., MGM and United Artists Film Corporation. All rights reserved.)
INDUSTRY ROUNDTABLE
ANIMATION GAINING TECHNICAL FREEDOM TO EXPLORE NEW CREATIVE TERRAIN By TREVOR HOGG
Ever since Walt Disney incorporated sound into Steamboat Willie (1928) and utilized the multiplane camera to create depth of field for Snow White and the Seven Dwarfs (1937), creative aspirations driving technological innovation have been an integral part of the animation industry. When the Academy Awards presented the first Oscar for Best Animated Feature in 2001, it was in recognition of the medium maturing into a global cinematic art form with the ability to entertain and explore social concerns no matter the age of audience members. To find out what to expect in 2018, executives, producers and VFX supervisors at leading animation studios, along with filmmakers responsible for The Little Prince, The Breadwinner and The Smurfs: The Legend of Smurfy Hollow, were asked what they believe will be the creative and technical trends shaping the industry.
LEFT TO RIGHT: LAIKA uses stop-motion, computer animation and origami in Kubo and the Two Strings. (Image copyright LAIKA/Focus Features) Mark Osborne captures the essence of The Little Prince manuscript with stop-motion while the narration centers on a modern-day young girl in 2D computer animation. (Photo courtesy of Mark Osborne) Filmmaker Eva Cvijanović utilizes stop-motion animation in Hedgehog’s Home. (Image courtesy of National Film Board of Canada) TOP LEFT: Aardman Animation teamed with Google Spotlight Stories to produce Special Delivery for mobile devices. (Courtesy of Google Spotlight Stories) BOTTOM LEFT: The story of Elephant King appears as cut-outs while present-day Kabul unfolds in traditional 2D computer animation in The Breadwinner produced by Cartoon Saloon, Aircraft Pictures and Melusine Productions. (Image courtesy of GKIDS)
TOP RIGHT: Blue Sky Studios has some fun with Ferdinand in which a captured gentle bull tries to reunite with his family. (Image copyright © 2017 Twentieth Century Fox Film Corporation.) BOTTOM RIGHT: The sequel How to Train Your Dragon 2 won six Annie Awards including Best Animated Feature for DreamWorks Animation. (Image courtesy of DreamWorks Animation)
With Inside Out, Pixar Animation Studios combines cartoony as well as more grounded animation to reflect the inner turmoil of an uprooted 12-year-old girl. (Image copyright © 2015 Disney/Pixar) Rainmaker Entertainment relaunches the 24-year-old Reboot franchise with ReBoot: The Guardian Code. (Image courtesy of Rainmaker & Mainframe Studios) Dracula overturns the lives of characters on a cruise ship in Hotel Transylvania 3. (Image copyright © 2017 Sony Pictures Entertainment) Stephan Franck, director of The Smurfs: The Legend of Smurfy Hollow, created Silver, a vampire graphic novel currently being developed as a movie. (Image copyright © 2017 Dark Planet Comics)
“Virtual reality will see the big players knuckle down and continue to invest to improve the experiences and reduce hardware costs to generate demand, reversing the push from manufacturers to a pull from audiences.” —Heather Wright Executive Producer & Head of Partner Content, Aardman Animation
“VR/AR is being used and explored in the animation pipeline. There is a lot of excitement about the potential for intuitive and immersive production tools that could improve department workflows and flexibility.” —Kirk Garfield CG Supervisor, Blue Sky Studios
Heather Wright, Executive Producer and Head of Partner Content, Aardman Animation, United Kingdom
Look for growth in real-time animation engines for personalizing audience/customer interactions, as chatbots or AR use characters to develop emotional connections. A return to craft, with visible evidence of the artist in the work. Strong original work to create memorable cut-through in a noisy world. Fewer individual pieces at a higher quality, demanding brave clients. Virtual reality will see the big players knuckle down and continue to invest to improve the experiences and reduce hardware costs to generate demand, reversing the push from manufacturers to a pull from audiences.

Kirk Garfield, CG Supervisor, Blue Sky Studios, Greenwich, Connecticut
We see an increasing trend from our creative directors toward a more handcrafted approach, with stylized animation and pushed styling in our character designs. The desire for clean lines, specific shapes and characters that must perform a wide range of complex motion from all angles makes for extremely challenging character design and execution. The desire for specific styling also greatly pushes the limits of how we incorporate simulation of garments and hair, which is physically based. It is more difficult, but the effort is in pursuit of crafting a unique feel and look that serves to transport audiences ever more fully into our stories.

Another technical trend is striving for real-time, or close-to-real-time, interactivity for artists. As evidenced at SIGGRAPH, we are seeing a huge push to use the graphics card in all areas of our pipeline to achieve faster interactivity and greater levels of artistic iteration, which allows artists to be more creative and explore options not achievable on traditional schedules. Usually these advances have other associated costs like hardware upgrades [network, workstation, graphics cards]. 
It will be interesting to see how a studio’s systems infrastructure will need to adapt to real-time rendering from 300+ artists across the pipeline. Global illumination is now prevalent at other studios – we were, of course, first to use stochastic Monte Carlo raytracing – but while we have seen that push, we will also see a move to explore looks that depart from physically-based rendering, in an effort to create experiences very different from what audiences have grown accustomed to. VR/AR is also being used and explored in the animation pipeline. There is a lot of excitement about the potential for intuitive and immersive production tools that could improve department workflows and flexibility.

As the industry went fully digital [between the ‘90s and early 2000s], we could no longer hide issues behind film grain. When stereoscopic resurged [2003 to present], it became even harder to employ mono compositing techniques to paint problems away. The bar has been raised again with 4K and HDR. Not only do we all have to contend with increased render times and the resources they require, but the format also reveals issues or visual
anomalies we would not have perceived before. We continue to look for ways to innovate in how we mitigate problems before, during and after render. Advancements in machine learning and deep learning are becoming more exciting for solving problems like noise.

Nora Twomey, Co-Founder, Cartoon Saloon, Ireland
Animation is such a broad term. We look at films that are classified as live-action, but they’re animated films. The technology dates, and you can easily spot what makes a movie look like it comes from 10 years ago. At the time, you couldn’t tell. It was state of the art and looked so real. What’s always interesting to me is making things accessible to artists so they don’t have to think too technically. The more restrictions that are taken away from animators, the more interesting work we are going to see. Exciting times ahead for sure.

Dave Walvoord, VFX Supervisor, DreamWorks Animation, Los Angeles
DreamWorks will continue its push into highly scalable parallel computing in 2018. As we continue to work towards the ability to efficiently leverage arbitrary numbers of cores for animation, simulation and rendering, all of our images can increase in visual richness. Historically, Feature Animation adopted a rigid department-based pipeline that leveraged economy of scale for data management. Likewise, our artists became highly specialized to fit those rigid responsibilities. This limited our ability to solve creative challenges in innovative ways and to work outside the constraints of the pipeline. For How to Train Your Dragon 3 we have streamlined our tool set so that multiple departments use the same tools, and collaboration between departments has become easier than ever. Now we are asking ourselves where and how we would like to solve a problem, independent of pipeline constraints. Creatively, I look forward to an evolution in the look of animated films. 
Films like The Jungle Book and Moana challenge traditional audience expectations for what animation should look like. This, coupled with more powerful tools, challenges us as artists to further push our creative boundaries.
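Garfield’s point about stochastic Monte Carlo rendering and machine-learning denoising rests on a basic statistical fact: a Monte Carlo estimate converges slowly, with noise shrinking only as the square root of the sample count. The toy sketch below (a made-up one-dimensional “shading” integrand, not any studio’s renderer) illustrates the principle.

```python
import random

# Toy illustration of stochastic Monte Carlo sampling: estimate a
# pixel's value by averaging random evaluations of a shading function.
def estimate_pixel(shade, n_samples, rng):
    """Monte Carlo estimate: mean of n_samples random evaluations."""
    return sum(shade(rng.random()) for _ in range(n_samples)) / n_samples

# Hypothetical integrand for illustration only; the exact integral
# of 3u^2 over [0, 1) is 1.0, so estimates should cluster around 1.0.
shade = lambda u: 3 * u * u

rng = random.Random(42)
for n in (4, 64, 1024):
    est = estimate_pixel(shade, n, rng)
    print(f"{n:5d} samples -> estimate {est:.3f} (true value 1.000)")
```

Quadrupling the samples only halves the noise, which is why learned denoisers that recover a clean frame from a cheap, noisy estimate are attracting so much attention across the industry.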
“What’s always interesting to me is making things accessible to artists so they don’t have to think too technically. The more restrictions that are taken away from animators, the more interesting work we are going to see. Exciting times ahead for sure.” —Nora Twomey Co-Founder, Cartoon Saloon
“Creatively, I look forward to an evolution in the look of animated films. Films like The Jungle Book and Moana challenge traditional audience expectations for what animation should look like. This, coupled with more powerful tools challenges us as artists to further push our creative boundaries.” —Dave Walvoord VFX Supervisor, DreamWorks Animation
WINTER 2018 VFXVOICE.COM • 77
11/22/17 4:59 PM
INDUSTRY ROUNDTABLE
“Virtual reality will see the big players knuckle down and continue to invest to improve the experiences and reduce hardware costs to generate demand, reversing the push from manufacturers to a pull from audiences.” —Heather Wright Executive Producer & Head of Partner Content, Aardman Animation
76 • VFXVOICE.COM WINTER 2018
PG 74-81 INDUSTRY ROUNDTABLE.indd 76-77
“VR/AR is being used and explored in the animation pipeline. There is a lot of excitement about the potential for intuitive and immersive production tools that could improve department workflows and flexibility.” —Kirk Garfield CG Supervisor, Blue Sky Studios
Heather Wright, Executive Producer and Head of Partner Content, Aardman Animation, United Kingdom Look for growth in real-time animation engines for personalizing audience/customer interactions as chatbots or AR using characters to develop emotional connections. A return to craft with visible evidence of the artist in the work. Strong original work to create memorable cut-through in a noisy world. Less individual pieces at a higher quality that demand brave clients. Virtual reality will see the big players knuckle down and continue to invest to improve the experiences and reduce hardware costs to generate demand, reversing the push from manufacturers to a pull from audiences. Kirk Garfield, CG Supervisor, Blue Sky Studios, Greenwich, Connecticut We see an increasing trend from our creative directors for a more handcrafted approach with stylized animation, and pushed styling in our character designs. The desire for clean lines, specific shapes and characters that must perform a wide range of complex motion from all angles, makes for extremely challenging character design and execution. Also, the desire for specific styling greatly pushes the limits of how we incorporate simulation of garments and hair, which is physically based. It is more difficult, but the efforts are in pursuit of crafting a unique feel and look that serves to transport audiences ever more fully into our stories. Another technical trend is striving for more real-time, or close-to-real-time interactivity for artists. As evidenced at SIGGRAPH, we are seeing a huge push in using the graphics card in all areas of our pipeline to achieve faster interactivity, and greater levels of artistic iteration which allow artists to be more creative and explore options not able to be achieved in traditional schedules. Usually these advances have other associated costs like hardware upgrades [network, workstation, gfx cards]. 
It will be interesting to see how a studio’s systems infrastructure will need to adapt to real-time rendering from 300+ artists across the pipeline. Global Illumination is now in prevalent use at other studios – we were, of course, first to use stochastic Monte Carlo ray tracing – but alongside that push we will also see a move to explore looks that depart from physically-based rendering approaches, in an effort to create experiences that are very different from what audiences have grown accustomed to. Also, VR/AR is being used and explored in the animation pipeline. There is a lot of excitement about the potential for intuitive and immersive production tools that could improve department workflows and flexibility. As the industry went fully digital [between the ‘90s and early 2000s] we could no longer hide issues behind film grain. When stereoscopic resurged [2003 to present] it became even harder to employ mono compositing techniques to paint problems away. The bar has been raised again with 4K and HDR. Not only do we all have to contend with increased render times and the required resources, but it also reveals issues or visual
anomalies we would not have perceived before. We continue to look for ways to innovate how we mitigate problems before, during and after render. Advancements in Machine Learning and Deep Learning are becoming more exciting for solving problems like noise.

Nora Twomey, Co-Founder, Cartoon Saloon, Ireland
Animation is such a broad term. We look at films that are classified as live-action, but they’re animated films. The technology dates, and you can easily spot what makes a movie look like it comes from the era of 10 years ago. At the time, you couldn’t tell. It was state of the art and looked so real. What’s always interesting to me is making things accessible to artists so they don’t have to think too technically. The more restrictions that are taken away from animators, the more interesting work we are going to see. Exciting times ahead for sure.

Dave Walvoord, VFX Supervisor, DreamWorks Animation, Los Angeles
DreamWorks will continue its push into highly scalable parallel computing in 2018. As we work towards the ability to efficiently leverage arbitrary numbers of cores for animation, simulation and rendering, all of our images can increase in visual richness. Historically, feature animation adopted a rigid department-based pipeline that leveraged economy of scale for data management. Likewise, our artists became highly specialized to fit into those rigid responsibilities. This limited our ability to solve creative challenges in innovative ways and to work outside the constraints of the pipeline. For How to Train Your Dragon 3 we have streamlined our tool set so that multiple departments use the same tools, and collaboration between departments has become easier than it has ever been. Now we are asking ourselves where and how we would like to solve a problem, independent of pipeline constraints. Creatively, I look forward to an evolution in the look of animated films.
Films like The Jungle Book and Moana challenge traditional audience expectations for what animation should look like. This, coupled with more powerful tools, challenges us as artists to further push our creative boundaries.
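Garfield’s point above about machine learning for render noise refers to learned denoisers that predict a clean pixel from its noisy surroundings. A trained network is far beyond a magazine sketch, but the underlying idea can be illustrated with the crudest classical stand-in, a neighborhood average (everything here is illustrative; no studio tool works this simply):

```python
import random

def denoise(image, radius=1):
    """Average each pixel with its neighbors -- the simplest possible
    stand-in for a learned denoiser, which would instead predict an
    edge-preserving combination of the same neighborhood."""
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += image[ny][nx]
                        count += 1
            out[y][x] = total / count
    return out

# A flat grey patch "rendered" with uniform noise: averaging pulls
# every pixel back toward the true value, at the cost of blurring
# any real detail -- which is exactly why the learned version matters.
rng = random.Random(7)
true_value = 0.5
noisy = [[true_value + rng.uniform(-0.2, 0.2) for _ in range(16)]
         for _ in range(16)]
smooth = denoise(noisy, radius=2)
```

A production denoiser learns from auxiliary render data which neighbors to trust, rather than averaging blindly, so it can remove noise without erasing edges and texture.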
“What’s always interesting to me is making things accessible to artists so they don’t have to think too technically. The more restrictions that are taken away from animators, the more interesting work we are going to see. Exciting times ahead for sure.” —Nora Twomey Co-Founder, Cartoon Saloon
“Creatively, I look forward to an evolution in the look of animated films. Films like The Jungle Book and Moana challenge traditional audience expectations for what animation should look like. This, coupled with more powerful tools challenges us as artists to further push our creative boundaries.” —Dave Walvoord VFX Supervisor, DreamWorks Animation
WINTER 2018 VFXVOICE.COM • 77
INDUSTRY ROUNDTABLE
“Imagine a world where a group of artists crowd-fund an idea for an animated feature film. They use the money to access cutting-edge software and hardware resources. They build assets, animate, light their scenes and send them off to a distant render farm that is quietly teaching itself how to make images more beautiful, more quickly.” —Steve Emerson VFX Supervisor, LAIKA
“It is my greatest hope that creative and technical boundaries will continue to get pushed by everyone from the top down, not just the artists. I am thrilled to see excitement and open minds at the highest levels right now when it comes to innovative and creative new ideas.” —Mark Osborne Filmmaker
Steve Emerson, VFX Supervisor, LAIKA, Portland, Oregon In an age of consumption, animation professionals will need to create higher-quality content faster than ever before and figure out where to stash all the zeroes and ones. Artificial intelligence will enable us to create better images more efficiently while remote, cloud-based services will eliminate the need for expensive pipelines and hardware infrastructure. At LAIKA, because our performances and sets are captured in-camera, we use a live-action visual effects workflow. The only difference is scale and the fact that we capture performances one frame at a time over the course of days, weeks and sometimes even months. When you’re doing visual effects for live-action animation, it’s critical that all of the computer-generated elements in a shot feel photoreal. Photorealism is achieved by using rendering systems to simulate the real-world flow of light. The more light rays the better. However, simulating light rays is processor-intensive and there never seems to be enough rendering resources to move through a given week’s inventory. We have the option to render at lower quality and use fewer rays, but the resulting noise makes these elements unusable for the final film. Artificial Intelligence and Deep Learning are technical trends that will have an impact on this issue. Researchers at UC Santa Barbara, Disney and Pixar are creating software that trains a deep-learning model called a convolutional neural network. The system learns how to transform noisy images into renders that resemble those computed with more light rays. This technology will enable animation studios to create more high-quality content – faster and with fewer rendering resources. Animation studios will also leverage online, cloud-based resources to meet increasing demands for content. 
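Emerson’s tradeoff – fewer rays render faster but leave unusable noise – is the basic statistics of Monte Carlo integration: the error of an N-sample estimate falls off as 1/√N, so halving the noise costs roughly four times the rays. A tiny sketch of that relationship (toy statistics only, not renderer code):

```python
import random
import statistics

def estimate_pixel(num_rays, true_radiance=0.5, seed=0):
    """Monte Carlo estimate of one pixel's radiance: the mean of
    num_rays random samples whose expected value is the true value."""
    rng = random.Random(seed)
    samples = [rng.uniform(0.0, 2.0 * true_radiance) for _ in range(num_rays)]
    return sum(samples) / num_rays

def noise_level(num_rays, trials=200):
    """Standard deviation of the estimate across many repeated
    'renders' -- a stand-in for visible pixel noise."""
    estimates = [estimate_pixel(num_rays, seed=t) for t in range(trials)]
    return statistics.pstdev(estimates)

# Noise shrinks with the square root of the ray count: four times
# the rays buys only half the noise.
print(f"64 rays:  noise ~ {noise_level(64):.4f}")
print(f"256 rays: noise ~ {noise_level(256):.4f}")
```

This square-root law is why denoising is such a lever: recovering the last factor of two in noise by brute force quadruples render cost, while a post-process pays a flat fee.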
‘Studio-in-a-box’ services like Foundry’s Elara will enable storytellers to access software, data storage and rendering resources on demand from any web-connected device. Independent productions and even solo artists will have access to cutting-edge technology and limitless computing power without having to shoulder the burden of building a pipeline and purchasing expensive hardware infrastructure. Imagine a world where a group of artists crowd-fund an idea for an animated feature film. They use the money to access cutting-edge software and hardware resources. They build assets, animate, light their scenes and send them off to a distant render farm that is quietly teaching itself how to make images more beautiful, more quickly.

Mark Osborne, Filmmaker, New York City
I really believe that in an effort to get away from the clutter, animated feature films will have to continue to find ways to innovate, both stylistically and in storytelling. A lot of CGI films have been looking so similar lately that it’s increasingly harder to get audiences excited about animation – aside from the well-known brands. It is my greatest hope that creative and technical boundaries will continue to get pushed by everyone from the top
down, not just the artists. I am thrilled to see the excitement and open minds at the highest levels right now when it comes to innovative and creative new ideas. The mixed-media approach of The Little Prince was a huge challenge, and a perfect fit for what we were trying to do from a storytelling standpoint. Using stop-motion animation and CGI to represent two entirely different points of view was a very scary and exciting challenge. Everyone involved stepped up to make it work, and audiences loved that the film was so different and visually unique. It was always the hope that by doing the film with those different animation techniques it could push the boundaries and open minds and doors to all the amazing possibilities that animation offers storytellers.

Michael Fukushima, Executive Producer of English Program Animation Studio, National Film Board of Canada, Montreal
A notable creative trend I see is a continuing shift towards and love for stop-motion animation. The arrival of Wes Anderson’s new Isle of Dogs is a clear sign of this in the features world, but amongst short-form animation it’s even more pronounced. Curiously, it’s younger filmmakers who are fascinated by tactile animation, and I see more and more of it on the festival circuit. Our own Hedgehog’s Home, by Eva Cvijanović, has been racking up awards since it launched at Berlin in February, but it’s only one of several stop-mo shorts picking up festival awards everywhere. Indie features, primarily hand-drawn digital 2D, are becoming a viable alternative – albeit with limited distribution – to the big studio blockbusters. The National Film Board (NFB) had some wonderful success this past year with Window Horses, which we co-produced with filmmaker Ann Marie Fleming and her production company. Meanwhile, Ireland’s Cartoon Saloon launched The Breadwinner, there’s a recent film from the creators of Ernest and Celestine, and U.S.
distributor GKIDS has positioned three or four films for Oscar contention this year. It’s a good time to be thinking of feature animation in the low-budget, indie sphere. Technically, the exploration of VR space is continuing apace in animation, with Google Spotlight Stories really taking the lead there. The NFB is doing its share in experimenting with the form with younger filmmakers like Paloma Dawkins [Museum of Symmetry] and Jeff Barnaby [West Wind], but Spotlight Stories is getting veterans into the field and testing their storytelling limits. It’s all pretty exciting stuff that happily keeps me young at heart as a producer.
“Indie features, primarily hand-drawn digital 2D, are becoming a viable alternative – albeit with limited distribution – to the big studio blockbusters. ... It’s a good time to be thinking of feature animation in the low-budget, indie sphere.” —Michael Fukushima Executive Producer of English Program Animation Studio, National Film Board of Canada
“We’ll see an uptick in the development of new techniques that will make big impacts in efficiency for rendering, simulation and character animation. Rendering in particular produces a ton of data – imagine leveraging that to produce rendering algorithms that get faster every time you use them.” —Steve May Chief Technical Officer, Pixar Animation Studios
Steve May, Chief Technical Officer, Pixar Animation Studios, Emeryville, California
Machine learning is an overused buzzword right now, but through 2018 we will see more practical applications of machine learning for animation production. A lot of this will still be in the research phase, but I think we’ll see an uptick in the development of new techniques that will make big impacts in efficiency for rendering, simulation and character animation. Rendering in particular produces a ton of data – imagine leveraging that to produce rendering algorithms that get faster every time you use them. We’ll see the use of more real-time rendering and game engines to produce content. This will force us to rethink the traditional assembly-line-style pipeline common in feature animation and move us towards a pipeline that is more collaborative and interactive for our artists. Global illumination and path tracing are now ubiquitous in high-end animation production. By default, this can produce a visual style of animation that is photorealistic. I hope that we’ll see more artistic experimentation with global illumination to produce non-photorealistic looks and give directors and artists more creative options.

LEFT AND OPPOSITE PAGE: Character designs by Blue Sky Studios from the animated feature Spies in Disguise, featuring Will Smith as the cool and charming covert operative Lance Sterling and Tom Holland as Walter, the socially awkward gadget-inventing teenager. (Photo courtesy of Blue Sky Studios)

“It’s an exciting time because we don’t have to make a show for everyone. It’s better to be the first click for a smaller audience than a third click for a bigger one.” —Gregory Little SVP Content, Rainmaker & Mainframe Studios

Gregory Little, SVP Content, Rainmaker & Mainframe Studios, Los Angeles, California
My watch list is out of control. So is everyone else’s. To find fans, we need to go narrow and deep: understand what our specific audience wants and give them a pure visual dose from the first frame. It’s an exciting time because we don’t have to make a show for everyone. It’s better to be the first click for a smaller audience than a third click for a bigger one. Castlevania, the new show on Netflix by our sister company Frederator Studios, is a good example of this. To push our visual creativity, we’ve worked to free ourselves from technical constraints, then build what we need if necessary to achieve a look. This started two years ago on ReBoot: The Guardian Code, when we built an Unreal game engine pipeline to deliver the huge, AI-driven world of the show. It was torture at times but really paid off. The show looks amazing and we’re delivering it in 4K at a TV budget.
This is a real advantage for us, so much so that we’re moving to Redshift for our conventional CG pipeline. We’ve also launched a 2D group, and have a new project that’s a hybrid with hand-drawn characters in a live-action world. It’s total chaos! But our artists love the room it gives them creatively, and the work is more original and interesting. We want to keep fans once we’ve earned them, so we’re working really hard to build multi-platform engagement into shows when it makes sense. Here again, the game engine plays a big role. VR, AR, gaming, secondary content – it’s all much easier working on this platform, assuming you design it that way at the start. Bottom line, we’re having more fun than we’ve ever had. Kids’ content can be a grind sometimes, but the world has opened up. Serialization makes storytelling much more interesting. Buyers are desperate for shows that stand out and willing to spend money to get them. Let’s hope it lasts!
Pam Marsden, Head of Production, Sony Pictures Animation, Culver City, California
There are significant and obvious innovations, and smaller tweaks in existing technology, that shape our business and our work/life balance. One of the most significant trends is VR. Filmmakers are interested in this new, appealing format and seeking ways to be involved. VR will help change the way we develop and deliver stories. Another trend is driven by alternate ways to distribute our work. New platforms allow us to reach different, targeted audiences and tell different kinds of stories. Meanwhile, on a more personal level, cheap video conferencing and simple, secure file-transfer technology allows filmmakers to work from home, which means that we can work with an even more diverse group of artists. Our movies are now typically influenced by artists from all over the world. The nature of collaboration is evolving and becoming more international.

Stephan Franck, Filmmaker & Founder of Dark Planet Comics, Los Angeles, California
In my opinion, the game changer will continue to be the digital platforms distributing animated content, as they have broken down the rigid categories in which animated programs previously had to fit. Creatively, that may mean an increase in unique and atypical projects. Or it might just mean that animation will allow itself to tackle a broader choice of genres and subject matters and be unapologetic about it. I think the audience has been there for a while; we’re just now catching up.
“One of the most significant trends is VR. Filmmakers are interested in this new, appealing format and seeking ways to be involved. VR will help change the way we develop and deliver stories. Another trend is driven by alternate ways to distribute our work. New platforms allow us to reach different, targeted audiences and tell different kinds of stories.” —Pam Marsden Head of Production, Sony Pictures Animation
“In my opinion, the game changer will continue to be the digital platforms distributing animated content, as they have broken down the rigid categories in which animated programs previously had to fit.” —Stephan Franck Filmmaker/Founder, Dark Planet Comics
ANIMATION
THE JOURNEY OF COCO
CAPTURING THE HOLIDAY SPIRITS IN MEXICO’S DAY OF THE DEAD
By TREVOR HOGG
TOP: Concept art by Robert Kondo and production designer Harley Jessup of a Las Vegas-style show in the Land of the Dead that makes use of the skull motif.
Coco tells the story of a young boy who aspires to become a musician against the wishes of his family and embarks on a magical adventure in which he encounters a famous deceased relative. There was no doubt as to the setting of the 19th feature from Pixar Animation Studios. “The reason for Coco being set in Mexico is related to the idea of making a film based on the Day of the Dead,” explains Coco co-director and writer Adrian Molina. “We were inspired by the opportunity in animation to create this land of the ancestors and the emotional story you can tell that leads into the holiday.”

Research trips to museums, markets, plazas, workshops, churches, haciendas and cemeteries were vital in the development of the project. “Within weeks of me first pitching this story to John Lasseter [in 2011] we were all on planes heading down to Mexico,” recalls Coco director Lee Unkrich. “We traveled all over the country, from Mexico City to Oaxaca, down to small towns in the state of Michoacán, to document the look of these different urban and rural areas and give us inspiration for creating our world. It was also critical for us to spend time with different families as they celebrated Día de los Muertos, and observe their family traditions and dynamics.”

“We could have easily gone down a route like ‘Starbones’ coffee shops on every corner, but I wanted to keep everything grounded and influenced directly by Mexican architecture and life,” remarks Unkrich. “That being said, we have lots of hidden skeletons everywhere, as well as skull faces both within the architecture and in the way that these different towers in the Land of the Dead line up.”

The visual styles of the Land of the Living and the Land of the Dead contrast, says Unkrich. “Everything is flat and horizontal in this little town of Santa Cecilia in which Miguel lives. When he goes to the Land of the Dead we suddenly introduce all of these vibrant colors. It’s our version of Dorothy going to the Land of Oz.
We also designed the Land of the Dead to be vertical. It’s made up of tall towers.”
In the Land of the Dead are towers constructed by the newly arrived souls who build them upwards in the architectural style of their time. “At the base of these towers are pyramids and at the top there is modern construction going on,” remarks Supervising Technical Director David Ryu. “We created a diversity of buildings through time and stacked them up on top of each other to create this look. From an execution perspective these things are big. We had this system of taking these buildings, finding ones that match and instancing them together so that we only paid for it once; and doing that not only inside a single tower but across towers.” One of the elements that was halfway deployed on Pixar’s Cars 3 but used throughout Coco is a technology called Flat Sets which allows Presto to think of dense complicated sets as single entities. “You’re not loading a tower, another building, a vehicle and a headlight, as the system thinks about it as one thing,” Ryu says. “ Optionally, if an animator needed to go in and move any stuff they can click on a button, crack it open and make it real.” Various types of lights are found in the Land of the Dead from pin lights on buildings to street lamps to headlights to windows reflecting lights. “Having this many lights was hard,” admits Ryu. “We put these test scenes together and had to talk to the RenderMan guys on how to make this stuff scale. They partnered with Intel to optimize the core light picking algorithms. We worked with them to further optimize the process of figuring which 10 lights out of eight million are important to the thing that you’re actually rendering. It’s influenced by where you’re looking, how far away the lights are, if you are being blocked by shadow or not, how bright the lights are, and by the shape of the lights. 
The RenderMan guys went in and optimized that part of the code a ton for us.” “We had never done skeleton characters at Pixar before, so a lot of time was spent looking at how skeletons had been used in other animation, from early Disney cartoons to Ray Harryhausen to Tim Burton,” remarks Unkrich. “It was all done with an eye towards being as appealing as possible. At the same time, we had to create characters that could emote and people could care about. That’s what led us to make decisions like allowing our skeletons to have eyeballs. Eyes are the windows to the soul and I couldn’t imagine playing emotional scenes with a character that had empty eye sockets.” A decision was made to have lips and angular mouth shapes to suggest a hard surface. “We couldn’t stay with a separated jawbone and skull area look because we needed to be able to get some clear mouth shapes and be able to articulate dialogue,” recalls Supervising Animator Gini Santos. “The eyeballs and
“The reason for Coco being set in Mexico is related to the idea of making a film based on the Day of the Dead. We were inspired by the opportunity in animation to create this land of the ancestors and the emotional story that you can tell that leads into the holiday.” —Adrian Molina, Coco Co-director/Writer
TOP: Concept art by Armand Baltazar and John Nevarez of Miguel running away from the tomb of his idol, Ernesto de la Cruz. ABOVE: Concept art by Ernesto Nemesio of the signature towers built over the centuries by the recently deceased in the Land of the Dead.
WINTER 2018 VFXVOICE.COM • 83
11/22/17 4:59 PM
ANIMATION
THE JOURNEY OF COCO CAPTURING THE HOLIDAY SPIRITS IN MEXICO’S DAY OF THE DEAD By TREVOR HOGG
TOP: Concept art by Robert Kondo and production designer Harley Jessup of a Las Vegas style show in the Land of the Dead which makes use of the skull motif.
82 • VFXVOICE.COM WINTER 2018
PG 82-87 COCO.indd 82-83
Coco tells the story about a young boy who aspires to become a musician against the wishes of his family and embarks on a magical adventure where he encounters a famous deceased relative. There was no doubt as to the setting of the 19th feature from Pixar Animation Studios. “The reason for Coco being set in Mexico is related to the idea of making a film based on the Day of the Dead,” explains Coco co-director and writer Adrian Molina. “We were inspired by the opportunity in animation to create this land of the ancestors and the emotional story you can tell that leads into the holiday.” Research trips to museums, markets, plazas, workshops, churches, haciendas and cemeteries were vital in the development of the project. “Within weeks of me first pitching this story to John Lasseter [in 2011] we were all on planes heading down to Mexico,” recalls Coco director Lee Unkrich. “We traveled all over the country, from Mexico City to Oaxaca, down to small towns in the state of Michoacán, to document the look of these different urban and rural areas, and give us inspiration for creating our world. It was also critical for us to spend time with different families as they celebrated Día de los Muertos, and observe their family traditions and dynamics. “We could have easily gone down a route like ‘Starbones’ coffee shops on every corner, but I wanted to keep everything grounded and influenced directly by Mexican architecture and life,” remarks Unkrich. “That being said, we have lots of hidden skeletons everywhere as well as skull faces both within the architecture and the way that these different towers in the Land of the Dead line up.” The visual styles of Land of the Living and Land of the Dead contrast, says Unkrich. “Everything is flat and horizontal in this little town of Santa Cecilia in which Miguel lives. When he goes to the Land of the Dead we suddenly introduce all of these vibrant colors. It’s our version of Dorothy going to the Land of Oz. 
We also designed the Land of the Dead to be vertical. It’s made up of tall towers.”
In the Land of the Dead are towers constructed by the newly arrived souls who build them upwards in the architectural style of their time. “At the base of these towers are pyramids and at the top there is modern construction going on,” remarks Supervising Technical Director David Ryu. “We created a diversity of buildings through time and stacked them up on top of each other to create this look. From an execution perspective these things are big. We had this system of taking these buildings, finding ones that match and instancing them together so that we only paid for it once; and doing that not only inside a single tower but across towers.” One of the elements that was halfway deployed on Pixar’s Cars 3 but used throughout Coco is a technology called Flat Sets which allows Presto to think of dense complicated sets as single entities. “You’re not loading a tower, another building, a vehicle and a headlight, as the system thinks about it as one thing,” Ryu says. “ Optionally, if an animator needed to go in and move any stuff they can click on a button, crack it open and make it real.” Various types of lights are found in the Land of the Dead from pin lights on buildings to street lamps to headlights to windows reflecting lights. “Having this many lights was hard,” admits Ryu. “We put these test scenes together and had to talk to the RenderMan guys on how to make this stuff scale. They partnered with Intel to optimize the core light picking algorithms. We worked with them to further optimize the process of figuring which 10 lights out of eight million are important to the thing that you’re actually rendering. It’s influenced by where you’re looking, how far away the lights are, if you are being blocked by shadow or not, how bright the lights are, and by the shape of the lights. 
The RenderMan guys went in and optimized that part of the code a ton for us.” “We had never done skeleton characters at Pixar before, so a lot of time was spent looking at how skeletons had been used in other animation, from early Disney cartoons to Ray Harryhausen to Tim Burton,” remarks Unkrich. “It was all done with an eye towards being as appealing as possible. At the same time, we had to create characters that could emote and people could care about. That’s what led us to make decisions like allowing our skeletons to have eyeballs. Eyes are the windows to the soul and I couldn’t imagine playing emotional scenes with a character that had empty eye sockets.” A decision was made to have lips and angular mouth shapes to suggest a hard surface. “We couldn’t stay with a separated jawbone and skull area look because we needed to be able to get some clear mouth shapes and be able to articulate dialogue,” recalls Supervising Animator Gini Santos. “The eyeballs and
“The reason for Coco being set in Mexico is related to the idea of making a film based on the Day of the Dead. We were inspired by the opportunity in animation to create this land of the ancestors and the emotional story that you can tell that leads into the holiday.” —Adrian Molina, Coco Co-director/Writer
TOP: Concept art by Armand Baltazar and John Nevarez of Miguel running away from the tomb of his idol, Ernesto de la Cruz. ABOVE: Concept art by Ernesto Nemesio of the signature towers built over the centuries by the recently deceased in the Land of the Dead.
WINTER 2018 VFXVOICE.COM • 83
11/22/17 4:59 PM
ANIMATION
TOP: A sculpture created by Greg Dykstra of the alebrije Pepita which is a personal favorite of producer Darla Anderson. BOTTOM: Concept art by Robert Kondo of the Marigold Bridge with the final effect being executed by Dave Hale who needed to make the structure grounded but also magical.
84 • VFXVOICE.COM WINTER 2018
PG 82-87 COCO.indd 84-85
eyelids were treated as one malleable organic piece so we shaped the eye sockets to work like eyebrows in creating shapes of expressions.” “Skeletons are humanoid, so we were able to use underlying rigs for arms, legs, fingers, bones, eyes and mouths,” notes Ryu. “The skeletons can take off their arms and put their hands on their heads, and take off their heads and throw them to the next guy. That’s part of the fun of the world. We had to augment the human rig to be able to detach.” Some rules needed to be put in place to ensure an aspect of believability. “The bones could separate because of impact or excitement,” explains Santos. “That’s our skeleton version of squash and stretch. It was more difficult because we required a lot more control.” An automated tool called Kingpin was utilized to animate a ‘jiggle’ to the bones to give them a sense of looseness. Clothes worn by the skeletons come from the time period in which they lived. “Our cloth simulator went through a basic overhaul,” states Ryu. “Our tools team put in new collision algorithms and lots of pieces of new technology to make simulations of higher density cloth meshes against this skinny bone geometry robust. We ran a bunch of tests in the beginning when we had skeletons against a skirt with a petticoat and blouse on top of it, and stuff would blow up. After this overhaul that stuff was super stable which enabled us to push it.” “There are some scenes in the movie with tens of thousands of skeleton crowds in them and we have a Houdini-based system for simulating these crowds,” adds Ryu. “We built a system that could take each individual thing in a shot and analyze how big it ever gets onscreen. If it is super faraway it isn’t very important, but if it gets up close it’s super important. We auto-generated a bunch of levels of detail for every single crowd character for every single animation cycle. 
This system would figure out how important everything was onscreen and then automatically assign these levels of detail.”

“One of the characters that went through a few iterations, and that we had the most discussions about, was Mamá Imelda, Miguel’s great-great-grandmother, who we see in photos in the Land of the Living but in person in the Land of the Dead,” remarks Molina. “There was a lot of talk about what her character was motivated by, and what she was protecting in terms of her family and why she creates these rules that are trouble for Miguel. How she holds herself, the way she does her hair and the outfits she wears. We talked about a woman who was once vulnerable and got hurt because her heart was open – even down to the design of her dress and shoulders to give a sense of armoring up and of being strong for the sake of her family. There was also a lot of talk about what visually sells this character who is protecting her family.”

“Most of the effects in Coco are based on taking a cultural idiom of a holiday and turning that into a real thing, like a path on the ground [of marigold petals] that turns into this large bridge,” remarks Effects Supervisor Michael O’Brien. “Where the characters interact is designed like somebody walking through real petals. Then as you move away and see the structures of the bridge, those are designed to be more magical. There’s this inner light pulsing through it and a lot of motion happens ancillary to the trusses that are all intentional. The top surface is what’s given to animation, crowds and layout so they know what to shoot for. Then the bridge is grown around those structures. The trellises we would erode into the columns that were below to get the shapes that we wanted, and the glowing patterns are all added. There’s a dual-res noise that makes a nice wind pattern that tweaks the orientation of the petals along the edge and side of the bridge to give a sense that they’re solid but still able to move and have dynamics.”

The ghost effect consists of five to six layers, says O’Brien. “They’re all done subtly because they’re not meant to detract from the animation. The layers are highlighted a lot more when Miguel passes through somebody, with each having their own color. A glow happens around the face and another for the clothes. There are some wispy bits that came off of each layer that are dialed independently.” The canine companion of Miguel transitions into an alebrije (a fantastical creature). “That was difficult for us because it’s full screen when Dante transforms from a black dog into this colorful spirit animal with wings,” explains O’Brien.
TOP: Unlike most dogs that have appeared in Pixar movies, Dante is a Xolo, which does not have any hair. BELOW: The ghost effect, developed by Keith Klohn, consists of five to six layers that are accentuated when a ghost passes through another being. BOTTOM: Six different kinds of lights, numbering in the millions, populate the Land of the Dead, but rendering only those essential for a scene required new code written by the RenderMan team and a partnership with Intel.

“Everything is flat and horizontal in this little town of Santa Cecilia in which Miguel lives. When he goes to the Land of the Dead we suddenly introduce all of these vibrant colors. It’s our version of Dorothy going to the Land of Oz.” —Lee Unkrich, Coco Director

TOP: The Land of the Living was purposely designed to be less vibrant and horizontal to contrast with the brightly colored and vertically oriented Land of the Dead. BOTTOM: Skeletons appear in a Pixar movie for the first time in Coco, which required that the human rig be altered to allow the bones to be removed and reattached.

“We wanted the transition to feel like it’s welling up inside of him, not running across the surface. It was a tough look for us because you’re essentially trying to get a single surface scatter thing to feel like it’s starting to glow and feel natural. We drew a lot from the patterns that are painted on him and tried to have those drive the animation.” There are not many effects in the Land of the Living. “We did a lot of ambient things like candles, drinks on tables, and a bouquet of flowers that people carry [which was complicated].”

Among the voice cast for Coco are Edward James Olmos, Gael García Bernal, Benjamin Bratt, Cheech Marin, Ana Ofelia Murguía, Sofía Espinosa and Alanna Ubach. “I’m proud that we have an all-Latino cast,” states Producer Darla Anderson. “Lee knows what he wants in terms of creating the texture and acting in the film. It was an arduous and interesting experience as we have lots of characters and family members. We needed each one to have a unique voice. The voice cast infused their own spirit into the characters.”

“From the beginning it was important to me that the characters were really playing the musical instruments in the movie,” reveals Unkrich. “I wanted them to be playing the right chord, strumming strings in the right way, and to do that we did have to record a lot of the music well in advance and film lots of reference video of musicians playing the instruments so that the animators could make sure they created animation which was believable and accurate in how the music would be played.”

Along with a score composed by Michael Giacchino, Adrian Molina collaborated with Germaine Franco, Bobby Lopez and Kristen Anderson-Lopez to write songs that helped to push the story forward. “We wanted to do a broad variety of Mexican music,” notes Molina. “We had this opportunity to lean into Mexican and indigenous instruments to create music that says a lot about our story and characters in a unique way.”

“The look of the film is spectacular but the moments which are most core and beautiful are the quiet ones which take you by surprise,” observes Molina. “I hope that people feel the same way.” A scene on a gondola traveling through the Land of the Dead stands out to Ryu. “It’s a shot that’s all about how huge and beautiful the world is.” O’Brien is pleased with the unique effects. “The spectacle of the Marigold Bridge, the ghosts, Dante’s transition and the Final Death are all things that people haven’t seen before.” The quest for a guitar is a wonderful moment for Gini Santos. “Miguel and Hector visit this character named Chicharron. It’s the setting, how Hector sings this song, and the lighting is so beautiful. I feel like here in the United States we have our own take on Día de los Muertos. The film taught me a lot about what that celebration is to Mexicans.”

Anderson is pleased with the end result. “I’m proud of the team and what we’ve managed to do. Coco is unique, yet feels like it belongs in the pantheon of all the other Pixar films.”
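The light-picking process Ryu describes earlier in the article — estimating which handful of the eight million lights actually matter to the point being rendered, based on distance, brightness, occlusion and the light's size — can be sketched as a simple ranking heuristic. This is an illustrative sketch only: the function names, data layout and weighting formula below are assumptions for clarity, not RenderMan's actual optimized implementation.

```python
def light_importance(light, point, occluded):
    """Estimate one light's contribution to a shading point.

    The factors mirror those named in the article: distance to the
    light, its brightness, whether it is blocked by shadow, and its
    size. The exact weighting here is an illustrative assumption.
    """
    if occluded:  # a fully shadowed light contributes nothing
        return 0.0
    dx = light["pos"][0] - point[0]
    dy = light["pos"][1] - point[1]
    dz = light["pos"][2] - point[2]
    dist2 = dx * dx + dy * dy + dz * dz
    # inverse-square falloff, scaled by emitter brightness and area
    return light["intensity"] * light["area"] / max(dist2, 1e-6)

def pick_lights(lights, point, occlusion_test, k=10):
    """Keep only the k most important lights for this shading point."""
    scored = [(light_importance(l, point, occlusion_test(l, point)), l)
              for l in lights]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [l for score, l in scored[:k] if score > 0.0]
```

In a production renderer this culling runs per shading point over hierarchical light clusters rather than a flat list, which is what makes it scale to millions of emitters.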
TOP TO BOTTOM, LEFT TO RIGHT: Adrian Molina (Photo: Deborah Coleman/Pixar), Lee Unkrich (Photo: Deborah Coleman/Pixar), David Ryu (Photo: Jessica Lifland/Pixar), Gini Santos (Photo: Deborah Coleman), Michael O’Brien (Photo: Deborah Coleman/Pixar), Darla Anderson (Photo: Deborah Coleman/Pixar)

“We had never done skeleton characters at Pixar before, so a lot of time was spent looking at how skeletons had been used in other animation, from Ray Harryhausen to early Disney cartoons to Tim Burton. It was all done with an eye towards being as appealing as possible and at the same time we had to create characters that could emote and people could care about. That’s what led us to make decisions like allowing our skeletons to have eyeballs. Eyes are the windows to the soul and I couldn’t imagine playing emotional scenes with a character that had empty eye sockets.” —Lee Unkrich, Coco Director

TOP: A storyboard of the pivotal moment when Miguel Rivera strums the guitar that once belonged to his idol Ernesto de la Cruz. A signature motif of the skull is found in the design of the guitar head.
“Skeletons are humanoid, so we were able to use underlying rigs for arms, legs, fingers, bones, eyes and mouths. The skeletons can take off their arms and put their hands on their heads, and take off their heads and throw them to the next guy. That’s part of the fun of the world. We had to augment the human rig to be able to detach.” —David Ryu, Supervising Technical Director
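The crowd level-of-detail assignment Ryu describes — analyzing how big each character ever gets onscreen over a shot, then automatically picking a detail level — reduces at its core to a threshold lookup. The level names and cutoff values below are invented for illustration; Pixar's Houdini-based system is necessarily far richer.

```python
def assign_lod(max_screen_fraction,
               thresholds=((0.25, "hero"), (0.05, "mid"), (0.005, "low"))):
    """Map a character's largest on-screen footprint (as a fraction of
    frame height, measured over the whole shot) to a detail level.
    Threshold values and level names are illustrative assumptions.
    """
    for cutoff, level in thresholds:
        if max_screen_fraction >= cutoff:
            return level
    return "card"  # far-away extras get the cheapest representation

def assign_crowd_lods(characters):
    """characters: {name: max on-screen fraction from shot analysis}."""
    return {name: assign_lod(frac) for name, frac in characters.items()}
```

Because the analysis considers the maximum size a character ever reaches in the shot, a skeleton that walks from the background into close-up is built at full detail for the whole shot rather than popping between levels.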
VR/AR/MR TRENDS
AUGMENTED REALITY CATCHES A WAVE WITH APPLE, GOOGLE By CHRIS MCGOWAN
TOP: Pokémon Go, published by Niantic, is the free-to-play, location-based AR app for iOS and Android devices that took the world by storm in 2016. (Image copyright © Niantic)
Many of us have already found Pokémon in our homes and gardens. Soon, more will watch airships attack tiny outposts on dining room tables, view design specs appearing in the air at work, or see streams of information hovering before us as we walk down the street. The world is about to get more crowded.

Augmented reality (AR) superimposes text, graphics and audio upon a user’s view of the real world. It has gotten a boost from Apple’s recent launch of ARKit, a software development kit, and Google’s release of the competing ARCore; both will bring high-quality augmented reality to smartphones and tablets. Along with those firms, Facebook, Microsoft and other tech giants are actively involved in AR, as are a number of major film studios and producers. “The recent release of these technologies [ARKit, etc.] is rather important as they will serve as a bridge, an introduction to the concept of AR for both consumers and developers,” comments Vangelis Lympouridis, Founder of Los Angeles-based Enosis VR. “Using existing mobile platforms as a window to see and experience digital data in our physical spaces – or as augmented tools for physical tasks – will ease the way to our future integrations of digital devices with our senses.”

ARKit uses camera and sensor data to accurately track motion and the space around the iPhone or iPad, and lets creators lock digital content onto real-world objects such as a floor or tabletop. “It enables you to visually track the environment around you. It helps the camera make sense of the world around it,” comments Tuong Huy Nguyen, Principal Research Analyst for Gartner Inc.’s Consumer Technologies and Markets Research Group.
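At its core, locking digital content onto a real-world surface means re-projecting a world-anchored 3D point into screen pixels every frame as the tracked camera moves. The following is a deliberately simplified pinhole-camera sketch of that idea, not the actual ARKit or ARCore API (those are Swift/Objective-C and Java/Kotlin, and also handle rotation, calibration and drift correction); all names here are illustrative.

```python
def project_anchor(anchor, camera_pos, focal_px, width, height):
    """Project a world-space anchor point into pixel coordinates of a
    pinhole camera at camera_pos looking down +z (no rotation, for
    simplicity). Returns None when the point cannot be drawn.
    """
    x = anchor[0] - camera_pos[0]
    y = anchor[1] - camera_pos[1]
    z = anchor[2] - camera_pos[2]
    if z <= 0:
        return None  # anchor is behind the camera: nothing to draw
    u = width / 2 + focal_px * x / z   # perspective divide
    v = height / 2 + focal_px * y / z
    if 0 <= u < width and 0 <= v < height:
        return (u, v)  # draw the virtual object at this pixel
    return None  # anchor is off-screen this frame
```

The tracking Nguyen describes is what supplies an accurate, continuously updated camera pose to feed into a projection like this, so the overlaid object appears fixed to the tabletop rather than floating with the phone.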
Nguyen adds that Apple’s dive into AR will have a huge impact, as there is an “Apple effect” that results in “increased interest in adoption across the board,” as tends to happen with Apple’s “introduction of anything.” Apple is “not necessarily the first but they are the first to go out there and say, ‘This is a thing and this is why it’s cool.’ They have a pretty strong base of engaged users.” Pokémon Go is one AR app that many kids and parents already know. Published by Niantic, the free-to-play, location-based game for iOS and Android devices took the world by storm in 2016 and Niantic claims that it has been downloaded more than 650 million times. It features virtual creatures (Pokémon) that appear on a smartphone or tablet screen as if they were in the player’s real-world location. The creatures are located, captured and trained. “It changed user behavior and expectations considerably. Prior to Pokémon Go what reason would you have, or expectation would you have, of holding your phone up to a scene other than taking a picture? I would say not much else,” comments Nguyen. “What Pokémon did is that now you recognized that here is a thing your device can do. I can hold it up and it will superimpose some kind of information on top of the real world. I would say that application was by far the most influential [AR app until now].” A few years prior, Nearest Tube was another popular AR app. Developed in England by Acrossair for the iPhone and released in 2009, it helped tourists and locals locate the nearest London Underground stations. Another landmark came a few years later
with the debut of Google Glass. While the AR headset in the shape of eyeglasses wasn’t a success, it paved the way for the better-received Glass Enterprise Edition, which appears to have great practicality as a workplace tool. Nguyen observes that “Google Glass, despite what the critics have said, played an enormous role. Prior to that, AR wasn’t a thing for consumers or businesses. After that, the [bar] was raised for everybody. It’s one of the biggest reasons we are where we are today.” “We feel AR will affect gaming and most other industries in significant ways. Unlike VR, AR is a complete paradigm shift in how to make and play games,” comments Mike Levine, CEO of Happy Giant, a social-mobile game developer and publisher that is releasing the tactical strategy game HoloGrid: Monster Battle. “It’s going to be very interesting to see what kind of experiences people gravitate to.” Happy Giant worked together with Tippett Studio to create HoloGrid. The game features monsters designed by Phil Tippett, VES, who created the creatures in Star Wars’ famous Holochess scene 40 years ago. “What we did was use photogrammetry to scan his monsters [and] turn them into digital 3D models, which we rigged and animated to use in the game,” comments Levine. Speaking of Holochess, it will soon also come to life with AR. Disney is launching the AR title Star Wars: Jedi Challenges, which will include the Holochess game, lightsaber battles and strategic
TOP: Happy Giant, a social-mobile game developer and publisher, is releasing the tactical strategy game HoloGrid: Monster Battle. Happy Giant worked with Tippett Studio to create HoloGrid, a game that features monsters designed by Phil Tippett, VES, who created the creatures in Star Wars’ famous Holochess scene 40 years ago. (Images copyright © Happy Giant and Tippett Studio) BOTTOM: Disney is launching the AR title Star Wars: Jedi Challenges, which will include the Holochess game, lightsaber battles and strategic combat, packaged with a Lenovo Mirage AR headset, a tracking beacon and lightsaber controller. The headset is compatible with select iOS and Android smartphones. (Images copyright © 2017 Lenovo and Disney. Star Wars © & ™ 2017 Lucasfilm Ltd. All rights reserved.)
“The recent release of these technologies [ARKit, etc.] is rather important as they will serve as a bridge, an introduction to the concept of AR for both consumers and developers. Using existing mobile platforms as a window to see and experience digital data in our physical spaces — or as augmented tools for physical tasks — will ease the way to our future integrations of digital devices with our senses.” —Vangelis Lympouridis, Founder, Enosis VR
“Google Glass, despite what the critics have said, played an enormous role. Prior to that, AR wasn’t a thing for consumers or businesses. After that, the [bar] was raised for everybody. It’s one of the biggest reasons we are where we are today.” —Tuong Huy Nguyen, Principal Research Analyst, Gartner Inc. TOP: Microsoft’s HoloLens AR/MR headset demos the AR version of the hugely popular Minecraft game as well as a motorcycle visualization. (Images copyright © Microsoft) BOTTOM: Meta’s AR headset (Photo copyright © Meta)
combat. It will be packaged with a Lenovo Mirage AR headset, a “tracking beacon” and a “lightsaber controller.” The headset is compatible with select iOS and Android smartphones. In terms of other headsets, Microsoft has sold the HoloLens AR/MR headset since 2016, and has shown a demo AR version of the hugely popular Minecraft game on it. Avegant and Meta are among others offering AR headsets. IDC (International Data Corp.) predicts the combined AR and VR headset market will hit 92 million units (61 million of those for consumers) by 2021. “Consumer-focused AR headsets are still some way off, as most people will first experience AR through the screen on their phone,” says Tom Mainelli, Program Vice President, Devices and AR/VR at IDC, on the company’s website. AR smartglasses include the aforementioned Google Glass (“Google Glass Enterprise Edition” in its 2.0 version) and models from Epson, ODG and Vuzix. Facebook and Apple are developing smartglasses, while mysterious startup Magic Leap has everyone wondering what it’s
up to. Based in Plantation, Florida, Magic Leap is funded by such heavyweights as Google, Warner Bros., Alibaba, Qualcomm and the VC firm Kleiner Perkins Caufield & Byers. It is reported to have raised more than $1.4 billion in capital, have more than 1,000 employees, and be valued at several billion dollars without having released a product. Magic Leap appears to be creating a new type of wearable computer device – smartglasses for augmented reality and mixed reality. The device reportedly projects an image directly onto the retina, but that is currently unconfirmed, as is the product’s launch date. Peter Jackson and Fran Walsh’s (Lord of the Rings) augmented-reality studio Wingnut AR showed off an impressive demo in June at Apple’s Worldwide Developers Conference in San Jose and at the SIGGRAPH convention in Los Angeles. In the demo, airships attacked a tiny bustling town in a scenario that was superimposed on a tabletop on the stage. The AR piece was created with ARKit and Epic Games’ Unreal Engine software, and demonstrated on an iPad. During the Apple WWDC demo, Alasdair Coull, Creative Director of Wingnut AR and former R&D Head at Weta Digital, said, “We’re on a remote outpost in a desolate world where supplies are scarce.” He circled around the table, viewing the impressive scene from all sides and angles. “Wouldn’t it be cool to have airship battles in the midst of your own living room?” he asked. Along with Pokémon Go, some current or upcoming AR entertainment app titles include Touch Press’s The Very Hungry Caterpillar, Climax Studios’ ARise and Next Games’ The Walking Dead: Our World. Snapchat and others have also delved into the
TOP: Magic Leap’s AR whale demo teaser. This well-funded startup is creating a new type of wearable computer device – smartglasses for augmented reality and mixed reality. The device reportedly projects an image directly onto the retina. (Image copyright © Magic Leap) BOTTOM: Lenovo’s Mirage headset with lightsaber and tracking beacon for Disney’s Star Wars: Jedi Challenges for AR. The headset is compatible with select iOS and Android smartphones. (Images copyright © 2017 Lenovo and Disney. Star Wars © & ™ 2017 Lucasfilm Ltd. All rights reserved.)
“We feel AR will affect gaming and most other industries in significant ways. Unlike VR, AR is a complete paradigm shift in how to make and play games. It’s going to be very interesting to see what kind of experiences people gravitate to.” —Mike Levine, CEO, Happy Giant
“To create a pure play AR entertainment app will require many attempts before we see wins. Pokémon Go is clearly a huge win, but it wasn’t a pure AR win. Location was a big factor in the attach rate for that app. The innovations will be amazing to watch over the next several decades. But the impact of both AR and VR on narrative and game stories is going to be profound.” —Anthony Batt, Co-founder, Wevr TOP AND OPPOSITE PAGE: Wingnut AR’s demo shows an airship attacking a tiny bustling town in a scenario superimposed on a tabletop on a stage. The AR piece was created with ARKit and Epic Games’ Unreal Engine software, and demonstrated on an iPad. (Images copyright © Wingnut AR)
whimsical side of AR with popular face-filter apps. Wevr’s co-founder Anthony Batt comments, “Generally, I think AR will make an immediate impact on the companies behind today’s successful apps. For example, Airbnb may let owners put AR notes in parts of their housing as a simple web app interface. Airbnb users could simply point their app at a corner of the space and get instructions or information. Apps like that will be helpful, but they will not be ‘entertainment.’ To create a pure play AR entertainment app will require many attempts before we see wins. Pokémon Go is clearly a huge win, but it wasn’t a pure AR win. Location was a big factor in the attach rate for that app. The innovations will be amazing to watch over the next several decades. But the impact of both AR and VR on narrative and game stories is going to be profound.” “AR is already popular and has grown significantly for the past 12 to 18 months. [We’re] seeing enterprises using it as a tool for their employees. Even though AR has been around for a long time, the iteration we’re talking about now is new and so the market is new. And we’re still in discovery mode,” says Nguyen.
“What we did was use photogrammetry to scan [Phil Tippett’s] monsters [and] turn them into digital 3D models, which we rigged and animated to use in the game.” —Mike Levine, CEO, Happy Giant
Tom Szirtes, Creative Technologist for London-based design studio Mbryonic, offers a different perspective. “My personal belief is that for AR adoption the issue isn’t so much limitations of tracking spaces but one of form factors.” It is important, he says, to transition to “affordable wearable AR.” Jon Peddie, President of Tiburon, California-based Jon Peddie Research and author of Augmented Reality: Where We Will All Live, adds, “The bottom line is that AR is very similar to electricity. It is not a thing; you can’t go buy 13 pounds of AR. And like electricity it is totally application-dependent for a description. An electric fan is an app that uses electricity. Pokémon is an app.” Lympouridis adds, “We are currently in the early days of this amazing technology. AR has the power to drastically transform every aspect of our lives. In contrast with VR that introduces parallel universes to our existence, AR expands our current reality and creates a fusion of data with our senses. When established, our everyday experience will become an amalgam of our physical, sensorial and technological layers, affecting once and for all how we live our lives, navigate our structures and recall our memories.”
IMMERSIVE MEDIA
VR180: NEW FORMAT AIMS TO EXPAND IMMERSIVE MARKET By CHRIS MCGOWAN
TOP AND BOTTOM: VR180 titles can be experienced immersively with Google Daydream View and other gear. (Image copyright © Google)
The expression “doing a 180” just took on a new meaning. In June, Google announced VR180, a new format that the company hopes will boost the production of immersive media and expand its potential market. VR180 basically cuts the visual range of 360-degree video in half and promises to lower costs significantly for both production and viewing. Despite the “VR” in its name, VR180 isn’t virtual reality, although it is immersive to the extent of filling one’s forward field of vision with images. It may just prove that in terms of immersion, sometimes a little is enough. In his June 16 blog post “Hot and Cold: Heat Maps in VR,” YouTube Product Manager Frank Rodriguez wrote about 360-degree videos, stating: “Surprisingly, people spent 75% of their time within the front 90 degrees of a video.” So, perhaps VR180 will suffice for the Super Bowl or a Bruno Mars concert, while 360 will be sensational for a Serengeti safari or a visit to Vienna. Not all content is 360-worthy, let alone VR-worthy with complex interactivity. VR180 isn’t as exciting as virtual reality, but it may draw in a mass market for immersive media in short order. However, some developers see it as an old-school “180” (reversing direction) on the road to virtual reality. Or, maybe, as an unnecessary move sideways. The key point may be that if VR180 proves as easy to shoot and edit as claimed, it will quickly become a somewhat-immersive, mass-market format that everyone can use – one that will provide YouTube and other channels with a vast amount of home-made and professional content. In 360-degree video, the viewer is inside a sphere of visual and audio information; wherever you look, you are inside the filmed or simulated environment. VR180 presents just the front half of that sphere. The image fills a person’s field of view, so there is a sense of immersion; you can see what’s in front of you, in 3D, but not what’s behind you. Take a look back and there’s no there there. 
Behind you is just blackness – a void. So, because it has fewer pixels to present, VR180 potentially offers higher resolution visuals than a comparable 360-degree video. Or, depending on resolution choices, VR180 videos could take up less bandwidth and space. In addition, VR180 cameras can
potentially be cheap and easy to use; according to Google, VR180 videos can be shot much like regular videos. Projects should be less time-consuming – and do without the 360 post-production process of “stitching” (combining various video footage from omnidirectional cameras or multi-camera rigs) to create the spherical visuals. Filmmakers behind the camera also won’t have to worry about appearing in the shot. Once a VR180 title is completed, it can be viewed as a flat video on desktop and mobile, or experienced immersively with Google Cardboard, Google Daydream or PSVR headsets, as well as other gear that’s on its way. The Google Daydream division is working with hardware companies Lenovo Group, YI Technology and LG to create cameras designed for shooting VR180 that “will hit the shelves this winter,” according to YouTube’s Frank Rodriguez. Lenovo has announced that it will be the first of these companies to launch a VR180 camera and that it will “fit in your pocket and is meant to be as user-friendly as a point-and-shoot.” The Lenovo camera will create 180-degree videos, shoot half-sphere still photos, and stream live in the VR180 format. It will support a stereoscopic view when the footage is viewed on a VR headset. Lenovo is aiming the camera at consumers and says it will cost the same as other
TOP LEFT: A VR180 demo video on YouTube. (Image copyright © YouTube and Google) TOP RIGHT: Lucid VR’s LucidCam is a 3D 180-degree photo/video camera with a lightweight, pocketable design. It supports images and videos in 2K, 3K and 4K. With an optional waterproof case, the LucidCam can film underwater up to 12 meters in depth. (Image copyright © Lucid VR) BOTTOM: NextVR sports broadcasting. (Image copyright © NextVR)
“This is more like a natural progression towards a broader adaptation of VR and a mature way to understand and contextualize the ecology of different uses and experiences of the new medium. … Especially when experiencing live events such as concerts and sports, 180-degree videos solve a lot of technical issues with a minor sacrifice in immersion.” —Vangelis Lympouridis, Founder, Enosis VR
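The resolution-versus-bandwidth tradeoff of halving the sphere is easy to quantify. A quick back-of-the-envelope in Python, assuming a 3840x1920 equirectangular 360-degree frame (an illustrative figure, not a vendor specification):

```python
# Back-of-the-envelope comparison of 360-degree vs VR180 pixel budgets.
# Assumes an equirectangular 360 frame of 3840x1920 - a common "4K 360"
# layout - purely for illustration.

full_360 = (3840, 1920)            # covers 360 deg x 180 deg
px_per_degree = full_360[0] / 360  # horizontal angular resolution

# VR180 covers only 180 degrees horizontally, so at the SAME angular
# resolution it needs half the pixels (and roughly half the bandwidth)...
vr180_same_detail = (full_360[0] // 2, full_360[1])
savings = 1 - (vr180_same_detail[0] * vr180_same_detail[1]) / (
    full_360[0] * full_360[1]
)
print(f"pixel savings at equal detail: {savings:.0%}")  # 50%

# ...or, spending the SAME pixel budget on half the field of view,
# it doubles the horizontal pixels per degree.
vr180_same_budget_ppd = full_360[0] / 180
print(f"px/deg: 360 = {px_per_degree:.1f}, VR180 = {vr180_same_budget_ppd:.1f}")
```

This is the whole of the format’s pitch in two numbers: the same sensor and bitrate deliver either half the data or twice the apparent sharpness, at the cost of the rear hemisphere.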
IMMERSIVE MEDIA
TOP RIGHT: From Clark’s “Peak Magnetic: VR180 Experience – Fuji Rock Festival” (Image copyright © Clark, Warp Records, YouTube) BOTTOM: From “Charlie Puth in London” VR180 on YouTube. (Image copyright © Charlie Puth, Atlantic Records and Google)
point-and-shoot cameras. Lucid VR is also active in this area and raised large sums through IndieGoGo crowdfunding in 2015 to develop its camera. The LucidCam is a 3D 180-degree photo/video camera with a lightweight, pocketable design. It supports images and videos in 2K, 3K and 4K. The device is touted as user-friendly and is on sale now. With an optional waterproof case, the LucidCam can film underwater up to 12 meters in depth. Oculus Co-founder Jack McCauley is one of the heavyweights on the firm’s board of advisors.

In the near future, Google will loan developers VR180 cameras from one of its “YouTube Spaces” in nine cities worldwide. Google is also setting up a “VR180 certification program” for VR180 cameras. Z CAM was one of the first to sign up and will also launch a VR camera soon. In addition, YouTube has worked closely with Adobe to make sure the video-editing application Premiere is compatible with VR180.

VR180 demos are already available on YouTube’s site. Titles include Clark’s “Peak Magnetic,” “The Poppy VR180 Experience,” “JammJam at Tower Records,” “Red Carpet Live Stream in VR180: 2017 Teen Choice Awards,” “Charlie Puth in London,” “Coachella VR180 Tour,” “Future Islands Interview: Coachella 2017,” “Dreamcar Interview: Coachella 2017” and “Merrell Twins Live VR180 Broadcast.” There is even a “Baking Soda and Vinegar Easy Science Experiments for Kids” with NBA star Kevin Durant.

NextVR, based in Newport Beach, California, has already been filming some NBA games, NFL highlights and other sporting fare with its own 3D, 180-degree system. It requires a Google Daydream View or Samsung Gear VR headset and compatible smartphone for playback.

But not everyone is excited about the new format. “VR180 is a very practical compromise to get around the current issues of bandwidth on 360 videos, but really it’s just a widescreen stereoscopic format,” says Tom Szirtes, Creative Technologist at Mbryonic, a London-based digital design studio. He adds, “Personally, fixed point, immersive linear films are of limited interest to me as a creator of virtual realities.”

Tuong Huy Nguyen, Principal Research Analyst for Gartner, Inc., had this to say about VR180: “It just seems a little limiting to me. It’s basically like panorama, but really close to your face. I don’t want to completely dismiss it right away. VR and 3D interaction is a new and tricky area to tackle. Maybe this is a necessary steppingstone as we figure out this new three-dimensional interface.”

However, Vangelis Lympouridis, Founder of Los Angeles-based Enosis VR and creator of the “Bohemian Rhapsody” VR experience, has a more optimistic view about the nascent format. “This is more like a natural progression towards a broader adaptation of VR and a mature way to understand and contextualize the ecology of different uses and experiences of the new medium. It is true that [in 360 video] after the initial ‘Wow’ moment of being able to watch content in any possible direction including the back, most of the viewers limit their focus into the frontal plane anyway. Especially when experiencing live events such as concerts and sports, 180-degree videos solve a lot of technical issues with a minor sacrifice in immersion.”

Lympouridis thinks VR180 could find success in the realm of music. “Immersive music would definitely benefit from VR180 as more and more content will be produced using existing tools and a much lighter production pipeline. Directors, musicians and any performer will be able to design and capture their stages and augment them with VFX and traditional scene compositing techniques without the complexities of 360 video, both in terms of technical sophistication as well as storytelling.”

It seems likely that VR180 will work just fine for many uses, and will be the format of choice for non-industry people (i.e., most everyone) who want to shoot semi-immersive, panoramic home videos, or consumers who don’t need 360-degree coverage of what they’re watching. Even if VR180 is limited in the VR sense, and would be better called “180video,” it could also greatly expand the market for immersive media and serve as a “gateway” medium that will inspire new users to plunge further into immersive realms. Quite probably, immersive media will take many forms and there is room for them all.

TOP: From “Merrell Twins Live VR180 Broadcast” VR180. (Image copyright © Merrell Twins and YouTube) BOTTOM: From Makala Cheung’s “Weak” VR180. (Image copyright © Makala Cheung, Yellow Rhinestone Records, YouTube)
TOP: “JammJam at Tower Records” and other VR180 titles are available on YouTube. (Image copyright © JammJam and Google)

Virtual Vocabulary

Virtual reality is a phrase that means different things to different people, some of whom don’t like the term being applied to 360-degree video – or to VR180, for that matter. Many products available now or coming soon share the VR label but vary greatly in terms of immersion and interactivity. To add to the complexity, we also have AR (augmented reality) and MR (mixed reality). So, how can we distinguish between these terms?

The term “virtual reality” in the traditional sense was popularized by Jaron Lanier around 1987, referring to a three-dimensional simulated world that is immersive and interactive. But according to much current common usage, “immersive” is the key point and “interactive” is not so essential – as in the case of 360-degree videos. In them, you can pan around the immersive setting, but you can’t interact with it. About the format, Gartner Inc. Principal Research Analyst Tuong Huy Nguyen comments, “We consider it virtual reality as well. It is a recreated scene. You’re not physically there. You’re virtually somewhere else.”

Lanier’s work in VR built on the ideas of pioneers like Morton Heilig (the “Sensorama” machine) and Ivan Sutherland (the first HMD, or head-mounted display, in the late ’60s). Meanwhile, in popular culture, the idea of virtual reality became associated with that of cyberspace (networked VR), which was conceptualized in science-fiction author Vernor Vinge’s 1981 novella True Names and William Gibson’s novel Neuromancer (Gibson coined the term “cyberspace”). And, in the 1982 movie Tron, the protagonist enters a virtual reality called “the grid” that is inside a video game.

Lanier co-founded VPL Research, the first company to sell VR products. They developed their own HMD, a data glove (which had position-sensitive sensors) and a data suit (which had sensors for detecting body movement). VPL implemented the first usage of avatars (representations of users within such systems). In the years since, others have continued VR research and launched related products, with varying degrees of success.

In full virtual reality, the computer-generated environment (virtual world) responds to an individual’s actions. Currently, this involves head tracking (through the HMD) and sometimes hand and body tracking with sensors. In room-scale VR, users can move around a physical space and their movements are reflected in the virtual world. Presence is when users feel like they really are in the artificial environment. In the future, some forms of virtual reality will offer visuals, audio and the sense of touch in the virtual world. Haptic devices deliver tactile feedback and further heighten presence. Some experiences will feature shared worlds, or virtual environments shared by multiple participants.

“I don’t feel the term ‘virtual reality’ is a great term to accurately capture everything that is happening within the immersive creative community. It is simple, but also an over-simplified way to describe the many types of profound, immersive work that are being produced,” says Anthony Batt, Co-founder of the VR studio Wevr.

Augmented Reality (AR) overlays text, graphics and audio onto a user’s view of the real world. For example, you are in your living room, and text or even zombies are superimposed on your dining room table, on the walls, or simply in space before you. Pokémon Go is an AR game in which virtual creatures (Pokémon) appear on a smartphone or tablet screen as if they were in the player’s real-world location. Much future AR will be experienced through smartglasses – special glasses that will display digital content atop your real-world view.

Mixed Reality (MR) is a merging of real and virtual worlds, wherein the digital objects are aware of the real-world environment and can interact with it. “There’s a dynamic interaction of the digital content with the world around you,” observes Gartner’s Nguyen. This term has its detractors, with some saying it is just a subset of AR.

360-degree video (“360 video” or simply “360”) offers immersion but limited interactivity. These are videos in which views in every direction are recorded at the same time (up, down and all around) with omnidirectional cameras or multi-camera rigs. The different filmed areas are merged in a method called video stitching. When viewing a 360-degree video, the user controls the viewing direction. For some people, a 360-degree video qualifies as virtual reality because you can view a virtual world through a headset and feel immersed. However, others argue that in true VR a user can move around and interact with the virtual world.

VR180 is like a 360-degree video cut in half, showing just the front half of what is being presented. When viewed with an HMD, it can fill one’s FOV (field of view) and feel immersive. It promises to be easy and cheap to film and upload.

Enosis VR Founder Vangelis Lympouridis thinks we’ll have more terms in the future to capture the many variations of VR. “It is very funny to see how the term VR was overused and even misused by the industry and consumers over the past few years. VR is actually the medium, such as the radio or the TV, and doesn’t really tell us anything about what it is that we are going to experience. As a term it already carries strong historical connotations from academia and the tech industry, and it will be very liberating to see new terms that describe the actual experience of VR within different contexts such as gaming, music, medical, education, etc., rather than piling endless different experiences under one single term.”
VFX VAULT
REVISITING THE RAMPAGING CG ANIMALS OF JUMANJI By IAN FAILES
By 1995, many Hollywood films had embraced the use of CGI to help tell more elaborate tales. The industry had been spurred along in this way by the success of digital effects in films such as Terminator 2: Judgment Day, Jurassic Park and The Mask. Joe Johnston’s Jumanji, starring Robin Williams as a man who, as a child, was lost in a magical jungle after playing a mysterious board game, is one of those films that took full advantage of the state of CGI in filmmaking – from Industrial Light & Magic (ILM) in particular – using it to depict living, breathing jungle animals. However, with digital effects only then in their infancy, Johnston still employed a great deal of practical effects in Jumanji, including full-scale creatures made and puppeteered by Amalgamated Dynamics, Inc. and miniatures crafted by ILM. A new take on the board game adventure – Jumanji: Welcome to the Jungle, directed by Jake Kasdan – has just hit cinemas, so VFX Voice took the opportunity to ask Johnston and ILM’s Animation Supervisor at that time, Kyle Balda, about how the CG creatures in the original film were brought to life.

CAN THIS BE DONE?
Jumanji’s digital visual effects were ambitious. A herd of rhinos would be seen crashing through the corridors of a house. A stream of wild animals would pound through the center of a small town – this scene included the iconic shot of an elephant crushing a car. A lion would need to leap down stairs. And wild monkeys would tear up a kitchen, and then drive a car. Johnston was not sure any of these shots would be possible, especially since many of the creatures had hairy or feathery features, which was a major challenge in VFX at the time. “I had no confidence at first that it would work at all,” the director says. “Early tests on the lion’s mane were not confidence builders, but the team kept digging down on the tech to make it work, and I think it paid off well in the end. If you compare it to where CGI is now with textures like hair and feathers and other subtle skin surfaces, the Jumanji stuff is fairly primitive, even crude in places, but audiences are forgiving when they are engaged in the story, and hopefully they were. “I think the biggest challenge with the CGI in Jumanji was its place in time,” Johnston adds. “If the film had been made five years earlier, before the groundbreaking CG of Jurassic Park, everything would have been done with animatronics, puppets and wire and rod removal. Five years later everything would probably have been CGI.
TOP LEFT: Former ILM Animation Supervisor Kyle Balda at work on animating a CG monkey. Reference footage of monkeys revealed largely sedated creatures. “We needed the monkeys to be messing up the kitchen and throwing teacups, hitting stuff, and stealing televisions,” says Balda. “We tried to do something more like, what would little hooligan teenagers be like if you just let them loose?” (Photo courtesy of Industrial Light & Magic) BOTTOM LEFT: The CG monkeys wreak havoc in the house and then also take control of police vehicles. Dealing with furry creatures at the time was a breakthrough for ILM. (Photo copyright © 1995 TriStar Pictures. Courtesy of Industrial Light & Magic)
“We were at a point where we could do some of the creatures in CGI that would have been the most difficult, but we couldn’t afford to do them all,” he explains, “so the spiders, the crocodile and some of the vines became animatronic. The lion and pelican had to exist in both worlds. The monkeys were all CG. The bats were CG except for one shot where it lands on young Sara’s shoulder.”

THE MAGIC OF ILM
ILM’s work on Jumanji was led by Visual Effects Supervisor Stephen Price, who sadly passed away during production. Ken Ralston later came on board as Visual Effects Supervisor. Despite several digital effects productions under its belt, the studio still had a relatively small computer graphics department. “When I started in the department there were about 30 people,” recalls Animation Supervisor Kyle Balda, who had already worked at ILM on The Flintstones and The Mask before Jumanji. “Just a few years later there were well over 800 people.”

The sudden rise of CGI in movies was not without its challenges. Firstly, computers were expensive – and slow, as were some of the animation techniques (this was before the full adoption of inverse kinematics). “The update speeds on the computer were really, really slow,” outlines Balda, “and as a result we would be working on one shot for four weeks that would last seven seconds. These days, from an animator’s point of view, that’s kind of what you’d be expected to do in a week or 10 days.”

Still, Balda retains fond memories of his time on Jumanji, particularly arising from the close-knit quarters in which the computer graphics department worked in ILM’s then-offices in San Rafael. “In our room we had technical directors and a couple of other animators,” says Balda, who has since gone on to direct films such as Minions and Despicable Me 3. “So you could talk to everybody who was doing every part of the film. People were also working on different movies. The guy behind me was working on Dragonheart while I started working on Jumanji. It was cool to be able to see what was going on elsewhere, and to get a little bit of inspiration from that.”

APPROACH TO ANIMATION
For Jurassic Park, ILM’s animators had looked to plenty of present-day animals for reference, but they certainly had some artistic license in coming up with dinosaur walk and run cycles. Jumanji’s
TOP TO BOTTOM: This image breakdown shows the original plate through to wireframe and final render of a scene where the rhinos are racing through (and destroying) much of the house. (Photo copyright © 1995 TriStar Pictures. Courtesy of Industrial Light & Magic)
100 • VFXVOICE.COM WINTER 2018
“We were at a point where we could do some of the creatures in CGI that would have been the most difficult, but we couldn’t afford to do them all,” he explains, “so the spiders, the crocodile and some of the vines became animatronic. The lion and pelican had to exist in both worlds. The monkeys were all CG. The bats were CG except for one shot where it lands on young Sarah’s shoulder.”

THE MAGIC OF ILM
ILM’s work on Jumanji was led by Visual Effects Supervisor Stephen Price, who sadly passed away during production. Ken Ralston later came on board as Visual Effects Supervisor. Despite several digital effects productions under its belt, the studio still had a relatively small computer graphics department. “When I started in the department there were about 30 people,” recalls Animation Supervisor Kyle Balda, who had already worked at ILM on The Flintstones and The Mask before Jumanji. “Just a few years later there were well over 800 people.” The sudden rise of CGI in movies was not without its challenges. Firstly, computers were expensive – and slow, as were some of the animation techniques (this was before the full adoption of inverse kinematics). “The update speeds on the computer were really, really slow,” outlines Balda, “and as a result we would be working for four weeks on one shot that would last seven seconds. These days, from an animator’s point of view, that’s kind of what you’d be expected to do in a week or 10 days.” Still, Balda retains fond memories of his time on Jumanji, particularly of the close-knit quarters in which the computer graphics department worked at ILM’s then-offices in San Rafael. “In our room we had technical directors and a couple of other animators,” says Balda, who has since gone on to direct films such as Minions and Despicable Me 3. “So you could talk to everybody who was doing every part of the film. People were also working on different movies. The guy behind me was working on Dragonheart while I started working on Jumanji. It was cool to be able to see what was going on elsewhere, and to get a little bit of inspiration from that.”

APPROACH TO ANIMATION
“I think the biggest challenge with the CGI in Jumanji was its place in time. If the film had been made five years earlier, before the groundbreaking CG of Jurassic Park, everything would have been done with animatronics, puppets and wire and rod removal. Five years later everything would probably have been CGI.” —Joe Johnston, Director, Jumanji

TOP TO BOTTOM: This image breakdown shows the original plate through to wireframe and final render of a scene where the rhinos are racing through (and destroying) much of the house. (Photo copyright © 1995 TriStar Pictures. Courtesy of Industrial Light & Magic)

TOP: Visual Effects Supervisor Ken Ralston (far left) oversees a miniature shoot for a scene in which the house splits in two. (Photo courtesy of Industrial Light & Magic) BOTTOM: The resulting house-split shot. (Photo copyright © 1995 TriStar Pictures. Courtesy of Industrial Light & Magic)

For Jurassic Park, ILM’s animators had looked to plenty of present-day animals for reference, but they certainly had some artistic license in coming up with dinosaur walk and run cycles. Jumanji’s animation team, on the other hand, had to ensure their wildlife closely matched the real thing. “In every scene,” notes Johnston, “the thing I stressed the most was that the creature needed to look and behave as realistically as possible, which isn’t always what you think it should be. The animators watched a lot of nature documentaries to study each animal’s idiosyncrasies, but there wasn’t a lot of screen time to take advantage of what they had learned. That kind of realism is a lot easier to achieve with a stampede than it is with a kitchen full of demonic monkeys.”

After visiting a zoo for reference, Balda found that simply replicating the motion of real animals was not quite going to work in the film; animators would need to exaggerate this motion in their CG models. “When we were looking at the elephants, we took something like a 29-frame cycle of video footage and stabilized it so that you could see it looping, but the elephant would be walking in place. So the starting point was, ‘Let’s just try rotoscoping it just to see what that movement is all about. What’s happening? What’s the cadence and the foot order?’

“And, of course,” adds Balda, “rotoscoping always looks very cold when you finish it because it doesn’t have any exaggeration. It’s like
a facsimile of life itself. So you would need to exaggerate the up-and-down movement of the elephant. You would make the stride a little bit bigger than it is in life. And only by exaggerating does it feel like you compensate for the thing that’s missing, and then it starts to look more alive.”

How the Elephant Shot Was Captured

LEFT TO RIGHT: On set in Vancouver (standing in for New Hampshire), the film’s special effects team rigged a car – a Chevy Caprice – to smash inwards, as if trampled by the elephant. ILM animators meticulously puppeteered wireframe versions of the elephant, matched to a proxy version of the car, to coincide with the practical crushing. The final rendered shot, complete with other CG animals in the stampede. Says Kyle Balda: “I remember, a couple of years after the film came out, being on an aeroplane, and there were these two old ladies sitting behind me watching Jumanji. One of them leaned over to the other and said, ‘How did they get the elephant to do that?’ From an animator’s perspective, that’s the best thing you can hear.”

Photos copyright © 1995 TriStar Pictures. Courtesy of Industrial Light & Magic.

“I had no confidence at first that it would work at all. ... If you compare it to where CGI is now with textures like hair and feathers and other subtle skin surfaces, the Jumanji stuff is fairly primitive, even crude in places, but audiences are forgiving when they are engaged in the story, and hopefully they were.” —Joe Johnston, Director, Jumanji

THE LEGACY OF JUMANJI

Johnston cites the privilege of working with Robin Williams as a highlight of his Jumanji experience. “People would ask me if Robin would go wild and just start ad-libbing lines during the scene. But he once told me that he was happy to be in a film where he couldn’t go crazy because of all the visual effects he had to interface with. He would occasionally ask for another take or two to try something different, but it was always in the context of the script.” And although Jumanji is considered a landmark film in terms of digital visual effects, Johnston remains adamant that then – and now – VFX should only ever serve the story they’re helping to tell. “I’m a firm believer in the unwritten rule that a film shouldn’t have one more effect than is necessary to tell the best version of that story,” Johnston says. “Filmmakers, and I include myself in the list of the guilty, sometimes get enamored with how well an effect is working and start trying to convince themselves – and the studio – that more is better, which is seldom the case. Fortunately, budgets are often the things that keep us honest.”

ABOVE: Director Joe Johnston on the set of Jumanji. (Photo copyright © 1995 Columbia/TriStar Pictures)
[ VES SECTION SPOTLIGHT: NEW YORK ]
Thriving VES Section Celebrates VFX Metropolis By NAOMI GOLDMAN
TOP: Empire Award winner Darren Aronofsky flanked by VES leadership at the 2017 celebration. From left to right: Eric Robertson, Co-Chair NY Section, VFX Supervisor and Co-Founder of Mr. X Gotham; Andrew Bly, Co-Chair NY Section, Member VES Board of Directors and Co-Founder of The Molecule; Aronofsky; Dan Schrecker, former Chair of the NY Section, member VES Board of Directors and VFX Supervisor; and Eric Roth, VES Executive Director. BOTTOM: The high-caliber panel at “Magic on the Small Screen: VFX Episodic,” a VES NY-LA collaboration.
As a flourishing global honorary society, the Visual Effects Society owes much of its growing international presence to its thriving network of Sections, which galvanize their regional VFX communities while advancing the reach and reputation of the Society and the industry worldwide.

Founded in 2011, the VES New York Section hit the ground running and now has a membership topping 200. Thanks to enthusiastic leadership and a penchant for pioneering impactful new events, the Gotham City arm of the VES has quickly become known for its robust programming, entrepreneurial spirit and commitment to cultivating burgeoning Sections across the globe. The Section’s membership is largely composed of VFX practitioners working in feature films and television (a huge growth sector in New York). It enjoys strong support from companies invigorating the region, including Alkemy X, Big Film Design, Blue Sky Studios, BrainStorm Digital, East Side Effects, CineSysOceana, Method Studios, Phosphene, Shade and Zoic, as well as those helmed by the Section’s current leadership, The Molecule and Mr. X Gotham.

At its core, the New York Section has created a diverse roster of programs and special events – designed to serve its members by fostering professional development, education, networking and recognition of the vast spectrum of visual effects talent in the region – while adding its unique flair in sync with the rhythm of the city. In 2015, the New York Section became the first to stage a regional VES Awards celebration – an innovative extension of the annual international ceremony in Los Angeles designed to recognize local visual effects talent. It was an instant success and has since become a must-attend annual affair at Brooklyn’s renowned venue, The Green Building.
“The New York VES Awards Celebration was envisioned as a VES Awards after-after party, allowing us to celebrate the finest achievements in visual effects artistry – from New York’s VFX community to the VES Award winners around the globe,” says Andrew Bly, New York Section Co-Chair and Co-Founder of The Molecule. “I wasn’t sure that the New York crowd would get dressed up for this VFX night out, but just the opposite happened. The event was embraced, and a lot of local talent was thrilled to participate. It felt really good to be able to provide this important spotlight for so many talented artists and innovators.”

The Section established The Empire Award to recognize a New York-based visual effects professional who has made enormous contributions to the field of visual arts and filmed entertainment and whose work embodies the spirit of New York. The inaugural award, presented by Tina Fey, went to award-winning titles designer and Visual Effects Supervisor Randy Balsmeyer – the go-to choice for New York filmmakers including Spike Lee, Woody Allen and Martin Scorsese. In 2016, the award was bestowed upon Academy Award-winning director and Co-Founder of Blue Sky Studios Chris Wedge, best known for
his work helming Ice Age, Robots, Epic and Monster Trucks. In 2017, the VES Empire Award went to acclaimed visionary director Darren Aronofsky, known for his provocative films including Pi, Requiem for a Dream, Black Swan and this season’s mother! Plans for this year’s 4th annual celebration are well underway.

Another rich program the Section launched in 2016 is the Master Class Series, a collection of seminars and workshops offering an intimate learning experience. Recent programs have included a session on personal finance for VFX professionals and freelancers, bringing together a financial planner, a mortgage broker and an analyst from CNBC; a seminar with colorists to help bridge the gap between people in the same pipeline who might not otherwise have an opportunity to interact; and a program featuring visual effects supervisors sharing their valuable insights from the front line. “This is perhaps our most exciting endeavor moving forward,” says Eric Robertson, New York Section Co-Chair and Co-Founder of Mr. X Gotham. “It delivers on our value proposition to serve our members in supporting their growth as visual effects professionals while growing our community.”

An ongoing roster of special events has complemented these signature programs, thanks in great part to robust partnerships with other leading entertainment organizations. In 2015, the VES New York Section and HBO, in association with the Post New York Alliance and the School of Visual Arts, hosted a vibrant discussion with women in the visual effects industry, “Breaking into VFX.” Moderated by postPerspective Editor-in-Chief Randi
“We would love to take what we’ve been able to do for the film and TV arm and engage the entire spectrum of talent – professionals working in video games, commercials, animation – so that we are all connected and benefit from valuable resources and enrichment programs.” —Andrew Bly, New York Section Co-Chair and Co-founder of The Molecule
TOP LEFT: A packed house at “Magic on the Small Screen: VFX Episodic,” co-presented by the VES NY and LA Sections. TOP RIGHT: VES New York members at the VR demo at the 2nd Annual Awards Celebration. MIDDLE LEFT: Academy Award-winning director and Co-Founder of Blue Sky Studios Chris Wedge, recipient of the 2016 Empire Award. MIDDLE RIGHT: Award-winning titles designer Randy Balsmeyer receives the inaugural Empire Award at the 1st Annual VES New York Awards Celebration. (Photo credit: Jon Stulich) BOTTOM: VES New York members at the 2nd Annual Awards Celebration.
“This [Master Class Series of seminars and workshops] is perhaps our most exciting endeavor moving forward. It delivers on our value proposition to serve our members in supporting their growth as visual effects professionals while growing our community.” —Eric Robertson, New York Section Co-Chair and Co-founder of Mr. X Gotham
TOP LEFT: VES Executive Director Eric Roth, Empire Award recipient Randy Balsmeyer and former Chair of the New York Section Dan Schrecker at the 2015 Awards celebration. (Photo credit: Jon Stulich) TOP RIGHT: VES New York members at the 1st Annual VES New York Awards Celebration. (Photo credit: Jon Stulich) BOTTOM LEFT AND RIGHT: VES New York members at the 2nd Annual VES New York Awards Celebration.
Altman and keynoted by Marvel Studios Executive Vice President of Physical Production Victoria Alonso, the program focused on the achievements and career paths of the panelists, as well as the challenges and changing opportunities for young women trying to find their way into the business of VFX and animation.

The School of Visual Arts (SVA), a creative and cultural center in Midtown Manhattan, is also a frequent collaborator with VES New York. Many screenings are held in an SVA theater, and the school regularly donates facilities for seminars. “Our theater managers established an open-door policy for VES to screen the most recent feature films to our members,” says John McIntosh, former Chair of SVA’s Computer Art, Computer Animation & Visual Effects Department and member of the New York VES Section. “The beauty of this partnership is that our students have the opportunity to see great films for free. The result is amazing. VES members are given the best seats in the house and with each screening more than 500 students see the very best in visual effects.”

The Section also invites members of the broader local visual effects and animation community to its screenings and other events, including its popular pub night networking receptions and Summer Party. The result is that VES New York events are strong networking opportunities for members and future members alike. “Of everything we do, the biggest thing has been helping our members to network with their peers,” says Bly. “People have actually started getting jobs out of our parties. In fact, I’ve lost some interns because of our parties!”

In building its foundation, the New York Section Managers acknowledge the vital role that peer-to-peer mentoring from more established Sections played in its rapid evolution. “David Tanaka from the San Francisco Bay Area Section was enormously generous in sharing advice from their growth trajectory – the challenges and lessons learned that contributed to their success,” says Bly. “We are proud to have created an organization that was well organized and driven from the outset. And given our experience in program planning and membership recruitment, we now have the opportunity and rewarding responsibility to pass on our knowledge to support the development of other regional VES hubs,” adds Robertson.

The dynamics of the visual effects business landscape have also propelled visual effects practitioners to gravitate to the New York Section and assume positions of leadership. “The release of the VES whitepaper in 2013 on the state of the global industry, labor and pipeline concerns is what called me to help remedy those issues,” states Dan Schrecker, VFX Supervisor and former VES New York Section Chair. “I didn’t know what the answers were, but it was clear to me that the VES was far and away the best forum to talk about these things. We realized the best thing we could do was focus on building the community in New York and serve its needs.”

Looking forward, the New York Section aims to continuously adapt to keep pace with the rapidly evolving visual effects industry. “Looking at the evolution of visual effects over the past 20 years, I’m proud of what the VES has achieved and for the role we are playing in bringing the New York community together,” says Sarah Dowland, inaugural VES New York Section Chair. “We’ve made enormous progress since the Section launched and it’s gratifying to see how our original vision has been realized.”

“We would love to take what we’ve been able to do for the film and TV arm and engage the entire spectrum of talent – professionals working in video games, commercials, animation – so that we are all connected and benefit from valuable resources and enrichment programs,” adds Bly.
LEFT TO RIGHT: VES New York Co-Chair Andrew Bly kicks off the 2nd Annual VES New York Awards Celebration. Director Darren Aronofsky enjoys networking with members of the VES New York Section. Visionary director Darren Aronofsky receives the Empire Award at the 3rd Annual VES New York Awards Celebration.
WINTER 2018 VFXVOICE.COM • 107
11/22/17 5:02 PM
[ VES SECTION SPOTLIGHT: NEW YORK ]
“This [Master Class Series of seminars and workshops] is perhaps our most exciting endeavor moving forward. It delivers on our value proposition to serve our members in supporting their growth as visual effects professionals while growing our community.” —Eric Robertson, New York Section Co-Chair and Co-founder of Mr. X Gotham
TOP LEFT: VES Executive Director Eric Roth, Empire Award recipient Randy Balsmeyer and former Chair of the New York Section Dan Schrecker at the 2015 Awards celebration. (Photo credit: Jon Stulich) TOP RIGHT: VES New York members at the 1st Annual VES New York Awards Celebration. (Photo credit: Jon Stulich) BOTTOM LEFT AND RIGHT: VES New York members at the 2nd Annual VES New York Awards Celebration.
106 • VFXVOICE.COM WINTER 2018
Altman and keynoted by Marvel Studios Executive Vice President of Physical Production Victoria Alonso, the program focused on the achievements and career paths of the panelists, as well as the challenges and changing opportunities for young women trying to find their way into the business of VFX and animation.

The School of Visual Arts (SVA), a creative and cultural center in Midtown Manhattan, is also a frequent collaborator with VES New York. Many screenings are held in an SVA theater, and the school regularly donates facilities for seminars. “Our theater managers established an open-door policy for VES to screen the most recent feature films to our members,” says John McIntosh, former Chair of SVA’s Computer Art, Computer Animation & Visual Effects Department and member of the VES New York Section. “The beauty of this partnership is that our students have the opportunity to see great films for free. The result is amazing. VES members are given the best seats in the house, and with each screening more than 500 students see the very best in visual effects.”

The Section also invites members of the local visual effects and animation community to its screenings and other events, including its popular pub night networking receptions and Summer Party. The result is that VES New York events are strong networking opportunities for members and future members alike. “Of everything we do, the biggest thing has been helping our members to network with their peers,” says Bly. “People have actually started getting jobs out of our parties. In fact, I’ve lost some interns because of our parties!”

In building its foundation, the New York Section Managers acknowledge the vital role that peer-to-peer mentoring from more established Sections played in the Section’s rapid evolution. “David Tanaka from the San Francisco Bay Area Section was enormously generous in sharing advice from their growth trajectory – the challenges and lessons learned that contributed to their success,” says Bly. “We are proud to have created an organization that was well organized and driven from the outset. And given our experience in program planning and membership recruitment, we now have the opportunity and rewarding responsibility to pass on our knowledge to support the development of other regional VES hubs,” adds Robertson.

The dynamics of the visual effects business landscape have also propelled practitioners to gravitate to the New York Section and assume positions of leadership. “The release of the VES whitepaper in 2013 on the state of the global industry, labor and pipeline concerns is what called to me to help remedy those issues,” states Dan Schrecker, VFX Supervisor and former VES New York Section Chair. “I didn’t know what the answers were, but it was clear to me that the VES was far and away the best forum to talk about these things. We realized the best thing we could do was focus on building the community in New York and serve their needs.”

Looking forward, the New York Section aims to continuously adapt to keep pace with the rapidly evolving visual effects industry. “Looking at the evolution of visual effects over the past 20 years, I’m proud of what the VES has achieved and for the role we are playing in bringing the New York community together,” says Sarah Dowland, inaugural VES New York Section Chair. “We’ve made enormous progress since the Section launched and it’s gratifying to see how our original vision has been realized.”

“We would love to take what we’ve been able to do for the film and TV arm and engage the entire spectrum of talent – professionals working in video games, commercials, animation – so that we are all connected and benefit from valuable resources and enrichment programs,” adds Bly.
LEFT TO RIGHT: VES New York Co-Chair Andrew Bly kicks off the 2nd Annual VES New York Awards Celebration. Director Darren Aronofsky enjoys networking with members of the VES New York Section. Visionary director Darren Aronofsky receives the Empire Award at the 3rd Annual VES New York Awards Celebration.
WINTER 2018 VFXVOICE.COM • 107
[ VES NEWS ]
VES Summit & Honors Celebration
By NAOMI GOLDMAN

VES Hall of Fame
By NAOMI GOLDMAN
In October, the VES hosted a special recognition program at the 9th Annual VES Summit to honor distinguished individuals with the Founders Award, Fellows distinction, Honorary Membership and Lifetime Membership designations, and to celebrate the inaugural VES Hall of Fame inductees.
TOP LEFT: Summit keynote speaker Ava DuVernay flanked by her A Wrinkle in Time Visual Effects Supervisor Rich McBride and Deadline Hollywood Senior Editor Dominic Patten. TOP RIGHT: VES Lifetime Membership honoree Chuck Finance. MIDDLE RIGHT: VES Board Chair Mike Chambers introducing the 2017 VES Fellows – Joe Letteri, VES, John Richardson, VES and Lynda Ellenshaw Thompson, VES. MIDDLE LEFT: From left: VES Executive Director Eric Roth, Summit Co-Chair Jeff Barnes, Founders Award recipient Toni Pace Carstensen, Summit Co-Chair Rita Cahill, VES Board Chair Mike Chambers. BOTTOM LEFT: VES Honorary Membership recipient Bob Burns (second from the left) with VES Hall of Fame inductees Dennis Muren, VES, Phil Tippett, VES and Doug Trumbull, VES. Photo credits: Tony Ducret (Evening Photographer – Hall of Fame, Awards); Marlon Rivas (Summit Photographer – Ava DuVernay)
108 • VFXVOICE.COM WINTER 2018
The Summit program concluded with the introduction of the inaugural members of the VES Hall of Fame. “The VES Hall of Fame represents a class of exceptional artists and innovators who have had a profound impact on the field of visual effects,” says Mike Chambers, VES Board Chair. “We are proud to pay homage to those who have helped shape our shared legacy and continue to inspire future generations of VFX practitioners.”

Living legends Ed Catmull, VES, John Knoll, Syd Mead, Dennis Muren, VES, Phil Tippett, VES and Doug Trumbull, VES were on hand to receive their honors among the capacity industry crowd at the Sofitel Hotel Beverly Hills (honoree Roger Corman was unfortunately unable to attend). Joining the celebration were dozens of proud family members accepting recognition for Hall of Fame inductees who are no longer with us. The posthumous VES Hall of Fame honorees are: Robert Abel, Linwood Dunn, ASC, Peter Ellenshaw, Jim Henson, Ub Iwerks, Grant McCune, Georges Méliès, Willis O’Brien, Carlo Rambaldi, Joe Viskocil, Petro Vlahos, Albert Whitlock, Stan Winston and Matthew Yuricich.
CLOCKWISE: Hall of Fame honoree Dennis Muren, VES. Hall of Fame honoree John Knoll. Hall of Fame honoree Syd Mead with VES Board Chair Mike Chambers. Three generations of family representing deceased Hall of Fame honoree Matthew Yuricich.
WINTER 2018 VFXVOICE.COM • 109
[ VES NEWS ]
OPPOSITE TOP: Hall of Fame honoree Doug Trumbull, VES with VES Board Chair Mike Chambers. OPPOSITE BOTTOM: VES Hall of Fame honorees from left: Brian Henson (accepting Hall of Fame on behalf of Jim Henson), Dennis Muren, VES, Honorary Member Bob Burns, Phil Tippett, VES, John Knoll, Doug Trumbull, VES, Ed Catmull, VES, Syd Mead and VES Board Chair Mike Chambers.
TOP: VES Executive Director Eric Roth, VES Board member Brooke Breton, Hall of Fame honoree John Knoll, director John Landis, VES Board Chair Mike Chambers, VES Board 2nd Vice Chair Kim Lavery, VES and Board Secretary and Summit Co-Chair Rita Cahill. ABOVE LEFT: Hall of Fame honorees Doug Trumbull, VES and Syd Mead with VES Board member Richard Winn Taylor II, VES. ABOVE RIGHT: Hall of Fame honoree Phil Tippett, VES. BOTTOM RIGHT: Hall of Fame honoree Ed Catmull, VES.
110 • VFXVOICE.COM WINTER 2018
WINTER 2018 VFXVOICE.COM • 111
[ FINAL FRAME ]
In a Galaxy 40 Years Ago...
ILM Effects Camera Operator Dennis Muren, VES takes a light reading during the filming of the Death Star miniature for Star Wars: Episode IV – A New Hope (1977).
Photo courtesy of ILM and Lucasfilm Ltd.
112 • VFXVOICE.COM WINTER 2018