VFXVOICE.COM
SPRING 2018
A WRINKLE IN TIME: COSMIC VFX
FROM VFX TO DIRECTING • AARDMAN’S EARLY MAN: STONE AGE VFX • NONNY DE LA PEÑA: VR PIONEER • DOUBLE NEGATIVE: 20TH ANNIVERSARY • VFX VAULT: WILLOW • 2018 VES AWARDS & WINNERS
[ EXECUTIVE NOTE ]
Welcome to the Spring issue of VFX Voice! Coming off our capacity-crowd 16th Annual VES Awards, which took place on February 13 in Beverly Hills, California, this issue celebrates the artists and innovators from across the globe who created outstanding visual effects in film, TV, animation, commercials, gaming and special venues.

In this issue we examine VFX trends – effects pros moving behind the camera into the director’s chair and the compelling art of VFX breakdowns. We look at three of the season’s most anticipated films – Annihilation, Early Man and juggernaut A Wrinkle in Time. We highlight VFX powerhouse Double Negative upon its 20th anniversary, go inside the gaming boom and spotlight VR at the intersection of technology and advocacy in our special profile of “The Godmother of VR,” Nonny de la Peña. Add to that VES News and a profile of the VES New Zealand Section.

Our first year publishing VFX Voice has been a wonderful experience, and the magazine’s success has surpassed our expectations thanks to your enthusiastic support. We are very excited to bring you another year of this dynamic addition to the VES legacy. Moving forward, we at VFX Voice are planning more articles that dive deep into issues of significant importance to our community, while continuing to deliver extraordinary visuals, in-depth profiles and insightful stories that highlight the exceptional artists and innovators all across our global community.

In addition to the print edition, we encourage you to access the full digital version at VFXVoice.com, where you can also catch our web exclusives – new and timely stories between issues that you can only find online. You can also get updates on VFX Voice features – as well as VES and VFX industry news – by following us on Twitter at @VFXSociety. We hope you will share VFX Voice with your colleagues to help us grow the community, and continue to send us your thoughts and ideas.
Thank you again to our readers, writers and advertisers for your partnership!
Mike Chambers, Chair, VES Board of Directors
Eric Roth, VES Executive Director
2 • VFXVOICE.COM SPRING 2018
[ CONTENTS ]

FEATURES
8 VFX TRENDS: FROM VFX TO DIRECTING – Effects pros are moving behind the camera.
16 VFX TRENDS: VFX BREAKDOWNS – The art of making a making-of video.
24 PROFILE: NONNY DE LA PEÑA – A visionary storyteller’s mission to foster empathy.
30 FILM: ANNIHILATION – Chilling VFX begins in suburbia, ends in psychedelia.
36 COVER: A WRINKLE IN TIME – The challenge of translating effects into feelings.
44 ANIMATION: EARLY MAN – Aardman Animations’ stop-motion legacy continues.
50 THE 16TH ANNUAL VES AWARDS – Celebrating the best in VFX.
58 VES AWARDS WINNERS – Photo gallery.
64 ROUNDTABLE: VFX MOVIEMAKING – Industry leaders discuss trends and issues.
72 COMPANY PROFILE: DOUBLE NEGATIVE – 20th anniversary of a VFX powerhouse.
78 VFX VAULT: WILLOW – A classic showcase of ILM’s VFX might.
84 GAMES: SYSTEM BOOM – Thriving with a diversity of interactive options.

DEPARTMENTS
2 EXECUTIVE NOTE
88 VES SECTION SPOTLIGHT: NEW ZEALAND
92 VES NEWS
94 V-ART: ALEX NICE
96 FINAL FRAME: DOUGLAS TRUMBULL, VES

ON THE COVER: Oprah Winfrey as celestial being Mrs. Which in A Wrinkle in Time. (Image copyright © 2018 Walt Disney Pictures. All rights reserved.)
SPRING 2018 • VOL. 2, NO. 2
VFXVOICE
Visit us online at vfxvoice.com

PUBLISHER
Jim McCullaugh, publisher@vfxvoice.com

EDITOR
Ed Ochs, editor@vfxvoice.com

CREATIVE
Alpanian Design Group, alan@alpanian.com

ADVERTISING
advertising@vfxvoice.com
Maria Lopez, mmlopezmarketing@earthlink.net

MEDIA
media@vfxvoice.com

CIRCULATION
circulation@vfxvoice.com

CONTRIBUTING WRITERS
Ian Failes, Naomi Goldman, Andrew Hayward, Trevor Hogg, Kevin H. Martin

ADVISORY COMMITTEE
Rob Bredow, Mike Chambers, Neil Corbould, Debbie Denise, Paul Franklin, David Johnson, Jim Morris, VES, Dennis Muren, ASC, VES, Sam Nicholson, ASC, Eric Roth

VISUAL EFFECTS SOCIETY
Eric Roth, Executive Director

VES BOARD OF DIRECTORS

OFFICERS
Mike Chambers, Chair
Jeffrey A. Okun, VES, 1st Vice Chair
Kim Lavery, VES, 2nd Vice Chair
Dan Schrecker, Treasurer
Rita Cahill, Secretary

DIRECTORS
Jeff Barnes, Andrew Bly, Brooke Breton, Kathryn Brillhart, Emma Clifton Perry, Bob Coleman, Dayne Cowan, Kim Davidson, Debbie Denise, Richard Edlund, VES, Pam Hogarth, Joel Hynek, Jeff Kleiser, Neil Lim-Sang, Brooke Lyndon-Stanford, Tim McGovern, Kevin Rafferty, Scott Ross, Barry Sandrew, Tim Sassoon, David Tanaka, Bill Taylor, VES, Richard Winn Taylor II, VES, Susan Zwerman

ALTERNATES
Fon Davis, Charlie Iturriaga, Christian Kubsch, Andres Martinez, Daniel Rosen, Katie Stetson, Bill Villarreal

Tom Atkin, Founder
Allen Battino, VES Logo Design

Visual Effects Society
5805 Sepulveda Blvd., Suite 620
Sherman Oaks, CA 91411
Phone: (818) 981-7861
visualeffectssociety.com

VES STAFF
Nancy Ward, Program & Development Director
Chris McKittrick, Director of Operations
Jeff Casper, Manager of Media & Graphics
Ben Schneider, Membership Coordinator
Colleen Kelly, Office Manager
Jennifer Cabrera, Administrative Assistant
P.J. Schumacher, Controller
Naomi Goldman, Public Relations
Follow us on social media.

VFX Voice is published quarterly by the Visual Effects Society. Subscriptions: U.S. $50; Canada/Mexico $60; all other foreign countries $70 a year. See vfxvoice.com. Advertising: Rate card upon request from publisher@vfxvoice.com or advertising@vfxvoice.com. Comments: Write us at comments@vfxvoice.com. Postmaster: Send address change to VES, 5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411. Copyright © 2018 The Visual Effects Society. Printed in the U.S.A.
VFX TRENDS
EFFECTS PROS PIVOT FROM VFX TO DIRECTING
By IAN FAILES

Over the years, several high-profile directors have started out in visual effects, including James Cameron, David Fincher and Neill Blomkamp. A number of well-established visual effects supervisors have also transitioned to directing, including Stefen Fangmeier (Eragon), Eric Brevig (Journey to the Center of the Earth, Yogi Bear) and Hoyt Yeatman (G-Force). Of course, VFX artists and supervisors tend to be well-placed to break into directing, with intimate knowledge of the production process and frequent involvement in story development, live-action shooting and post-production. The move is perhaps now even more accessible, thanks in part to the wider availability of digital camera systems and filmmaking software. VFX Voice spoke to three visual effects artists – Hasraf Dulull, Victor Perez and Miguel Ortega – who are in the early stages of pivoting from visual effects into the world of directing, about their journeys.

HASRAF DULULL
Starting out in videogame cinematics before moving into compositing and then visual effects supervision and producing in London, Hasraf “Haz” Dulull says his time in VFX has been his ‘film school’. He now has several indie shorts and features behind him, beginning with his 2013 short Project Kronos and, more recently, the feature The Beyond. Project Kronos scored well online and landed the filmmaker a manager in Hollywood. Although some writing gigs followed for Paramount and Twentieth Century Fox, Dulull still needed to support himself with continued VFX work while pitching projects. “What I quickly learned was that it doesn’t matter how many short films you make or how many awards you win, you are still a first-time director when pitching for your debut feature film,” he says.
That spurred Dulull on to self-fund The Beyond, calling on frequent past collaborators to help. He also relied on his technical knowledge to acquire suitable equipment, with assistance from Blackmagic and Adobe. Ultimately, a private investor, impressed with what Dulull had produced, provided additional funds. The film was picked up for worldwide distribution by Gravitas Ventures and was released in January.

During production on The Beyond and a second feature, Origin Unknown, Dulull credited his effects experience with helping him get the best VFX out of a limited budget. He also took advantage of previs and VFX shortcuts to plan things out. “On both films I used VFX to quickly visualize shot compositions for the actors to know what was going on. I would do those on my laptop using Photoshop and sometimes Maya to help block out scenes, so that when I did the walk-through in the morning on set I had my laptop and played back the ideas to the DP and crew and also to the actors – which proved to be very helpful on a tight shoot where everyone needs to be on the same page.”

“The other hat I was wearing was the producer hat,” adds Dulull, “and one thing I had to embrace was the sheer amount of paperwork!”

Dulull has now been signed to a talent agency, APA, and has set up his own production company, Haz Film, to tackle film and TV projects. The director believes any VFX artist can embark on the same dream. “The big advice I would give anyone looking to make their first film is to surround yourself with the best possible people you can get, and on top of that listen to them. Throw your ego out of the window, as every decision made or action taken on every step of the film is for one motive only – to serve the film, nothing else.”
OPPOSITE TOP: Director Hasraf Dulull on the set of his film, The Beyond. OPPOSITE BOTTOM: A scene from The Beyond. The film tells the story of a group of astronauts, modified with advanced robotics, who travel through a newly discovered wormhole. TOP: During shooting of The Beyond, astronaut actors wore simple suits with tracking markers that would be replaced with CG elements. BOTTOM LEFT: Echo director Victor Perez, left, on the motion-control greenscreen set. BOTTOM RIGHT: Perez was able to capitalize on a pre-existing relationship with motion-control specialists Stiller Studios in Sweden in order to direct his short film, Echo.
VICTOR PEREZ
Also with several years of visual effects compositing and supervision experience is new director Victor Perez. His film credits include The Dark Knight Rises, Rogue One, Harry Potter and the Deathly Hallows, Pirates of the Caribbean: On Stranger Tides and 127 Hours. Perez is also well known for his work with The Foundry’s NUKE. He recently branched out to directing with the short Echo, an experimental film about a girl who awakes in an undetermined location and sees her mirrored reflection 10 seconds ahead of her own time. It’s a film where, says Perez, prior visual effects knowledge was always going to be key. “For Echo, the process of post-production was a killer: 11,738 frames, 3K resolution, all-over greenscreen with full CG environments, no motion blur because of a limitation of the tech (so I had to create the motion blur of the plates using third-party software), creating a synthetic mirror reflection with a synchronized set of motion-control rigs – never attempted before – all done in the ‘free’ time of me and my friends.”

Live action for Echo was filmed in one day – actually a 22-hour stretch – at Stiller Studios in Sweden, a specialist motion-control studio. With his prior VFX experience, Perez says he considered himself very well prepared to handle the shoot, but it was still incredibly challenging. “As it’s a pioneering technology, every time we found an issue we had to think how
“Delegate, don’t allow the project to overwhelm you because you are a controlling maniac. Delegate to others, others of your trust; that is a dangerous leap of faith. How can you trust someone if you don’t know him or her?” —Victor Perez
OPPOSITE BOTTOM: Ball’s RUIN short, a CG-animated race through a post-apocalyptic landscape, quickly caught the attention of studios and fans alike. (Image courtesy of Wes Ball) BOTTOM LEFT: A scene from Wes Ball’s Maze Runner: The Death Cure, the final film in the trilogy. The film’s lead visual effects vendor is Weta Digital. (Image copyright © 2017 Twentieth Century Fox Film Corporation. All rights reserved.) BOTTOM RIGHT: Director Wes Ball on the set of The Maze Runner. (Image copyright © 2013 Twentieth Century Fox Film Corporation. All rights reserved.)
Wes Ball: From VFX Newcomer to Major Film Director
One director who has successfully made the transition from visual effects and animation to major feature films is Wes Ball, now with three VFX-heavy movies under his belt in the Maze Runner trilogy, based on the James Dashner novels. Ball gained attention with his fully CG-animated short RUIN, set in a post-apocalyptic universe. Twentieth Century Fox then brought the director on board to helm the first Maze Runner film in 2014. The third movie in the trilogy, The Death Cure, was released in February. Ball discussed his experience moving from indie VFX and animation to large-scale features.

VFX Voice: What was your first film, VFX or animation project ever?

Wes Ball: I guess, technically, it was a little short in high school I did with friends called The Bot. I know, super-creative title. It involved a student alone in high school hallways being chased by a mysterious flying robot. Nothing much happened, but it was my first attempt at marrying CG animation into a video plate. I used Lightwave 3D when it had just become a standalone application
from its Video Toaster days. Fun stuff. My first animation project was a short called A Work in Progress. This was my thesis film in film school. It was my first attempt at full CG filmmaking. It can be found on YouTube for anyone interested.

VFX Voice: How did RUIN help you in terms of learning about creating worlds, directing and pitching projects?

Ball: RUIN was a pretty big deal for me. It basically launched my career. It came from the culmination of several previous short films and VFX projects and pitching other movies around that time. I had a very broad sense of a big story that RUIN came from; I just attempted to bite off a small piece from the opening of that story. It definitely helped show people I knew how to move a camera and make something exciting to watch. And it certainly helped having a much larger story for RUIN when I wound up in various producer and studio offices around town. But I should say, I didn’t originally intend RUIN to be a short to ‘pitch’ or ‘sell’. I genuinely just wanted to make something and get it out of my head. I figured something
good may come out of it, but I never dreamed it would be the thing that ultimately gave me a shot at directing a studio movie.

VFX Voice: What things do you feel you learned in VFX and animation that helped you when you started taking on The Maze Runner, and into the other films?

Ball: It was probably just a sense of how VFX are generally pulled off. What the tricky parts are of making a shot work. I never wanted to just shoot something and let the ‘VFX guys’ figure it out. I tried to use what knowledge I had to set us up for good shots. Things like not taking roto for granted. Things like making any given shot count by not just spraying and praying with the camera. I tried to design the VFX aspects just as precisely as scene dialogue.

VFX Voice: Do you have time to still get on the box and do any VFX shots or designs on your movies?

Ball: I still try to get on the box for every film I’ve done since the first Maze Runner. I enjoy flexing those muscles, although I admit my skills have gotten pretty rusty compared to the artists that work on my movies now. But yeah, I’ve had a few – admittedly very simple – shots on every movie I’ve directed so far. Hoping to keep that tradition alive as long as I don’t ruin my own movie.

VFX Voice: What advice would you give to visual effects artists or animators who are looking to get into directing?

Ball: There’s no one way to get into directing. Don’t get caught up in the ‘steps’ or how others have done it. If you want to direct, go do it. Make your own stuff and make people see you should be directing. Also, don’t let VFX drive your projects. Keep story and character in mind. It’s important. If you have the ability to do great VFX work, make sure that’s not what’s driving the project; make sure those skills are servicing some interesting concept or story or character. It’s easy to fall into that place of showing off to an audience instead of engaging and exciting them.
to solve it on the fly, without the possibility of asking anybody.”

What greatly benefited Perez and his team during the shoot was past knowledge from on-set supervision and post work. He brought those lessons directly into production. One of the big ones was ‘don’t fix it in post’. “When you are on set, time is precious, so you tend to be in a trance state of rush, and as a VFX artist you could oversimplify your own job and think that doing certain tasks in post could be easier than doing them in the moment, as you have no time to waste. But sometimes spending one minute on set could save a huge amount of hours in post.”

“Also, with Echo I learned another lesson: you cannot get everything under control, no matter who you are, your experience or involvement in the project, even if you are paying the bills. Delegate, don’t allow the project to overwhelm you because you are a controlling maniac. Delegate to others, others of your trust; at the same time, that is a dangerous leap of faith. How can you trust someone if you don’t know him or her?”

MIGUEL ORTEGA

“The biggest problem was that we didn’t know what the hell we were doing! This was our first time dealing with dialogue, an ensemble cast, even Model T cars. Even on a VFX level we had never done an 80-person all-CG room with cloth simulations and hair. The other big problem was of course just financial. We didn’t work for three-plus years; our savings were completely drained.” —Miguel Ortega

TOP: This step-by-step breakdown reveals the design, shooting and post-production work required for an establishing shot in The Ningyo. BOTTOM: The Ningyo director Miguel Ortega describes his short film as a “26-minute Faustian tale about Cryptozoology.”
“The big advice I would give anyone looking to make their first film is to surround yourself with the best possible people you can get, and on top of that listen to them. Throw your ego out of the window, as every decision made or action taken on every step of the film is for one motive only – to serve the film, nothing else.” —Hasraf “Haz” Dulull
TOP: One of the CG creatures designed and built for The Ningyo.
Pursuing a career in directing can involve significant sacrifice. Miguel Ortega and his partner, Tran Ma, know that more than most. The visual effects artists, who had worked at studios such as Digital Domain and Sony Pictures Imageworks, embarked on their own short film, The Ningyo, with little funding but a lot of determination. Along the way, they would film key scenes in their own modified house for months, crowdfund extra resources, and then work on the VFX themselves, along with many helpful associates. The result was a four-year journey that produced a compelling story – about the search for the elusive mythical Ningyo creature – and a proof of concept for a feature film or even a television series.

Ortega directed his actors sometimes in the living room or on a stairway – almost any way he could – knowing that he and his collaborators had the VFX wherewithal to extend environments and add the necessary CG creatures. “VFX allowed me to know what I could cheat and what I couldn’t,” says Ortega. “We only built what was absolutely needed. I actually wish we would have built less in some areas. I recommend people watch the making-of video we created, because it really shows how we used every inch of our house as a set. None of this would work without VFX. The experienced actors were terrified, I will say, seeing our cheap set-ups.”

Ortega had no prior directorial experience, but he drew on the support of filmmaking friends and his VFX history to get shots done. Nevertheless, he admits it was a major learning curve. “The biggest problem was that we didn’t know what the hell we were doing! This was our first time dealing with dialogue, an ensemble cast, even Model T cars. Even on a VFX level we had never done an 80-person all-CG room with cloth simulations and hair. The other big problem was of course just financial. We didn’t work for three-plus years; our savings were completely drained.”

But Ortega did pull off the project, and has regularly wowed audiences in presentations about the short by revealing its innovative shooting techniques and extensive visual effects work. And although Ortega does say “there isn’t a day when I don’t sit and wonder if I’m on my way to being a total failure,” he has been in regular discussions with interested parties about The Ningyo and possible future films.
14 • VFXVOICE.COM SPRING 2018
VFX TRENDS
IMAGE ENGINE AND THE ART OF THE VFX BREAKDOWN By IAN FAILES
A good visual effects breakdown video, one showing how a shot or sequence was put together, can easily go viral. But these making-of reels serve many other purposes as well. They are used for awards submissions, in marketing for films, TV shows and home entertainment releases, and of course to promote the visual effects studios and artists behind the magic. There’s an art to making a VFX breakdown that can go viral, or help win awards, and visual effects studios spend significant time combining expertise from their production and marketing departments to develop the best showcase possible. VFX Voice asked one studio, Image Engine in Vancouver, how their team approaches breakdowns. Crews from different areas of the company shared their process, with examples of making-of videos from Logan, Power Rangers, Fantastic Beasts and Where to Find Them and Kingsglaive: Final Fantasy XV.

GETTING STARTED
TOP: These frames from Image Engine’s breakdown for CG-animated film Kingsglaive: Final Fantasy XV showcase the character animation and destruction simulations for a fight sequence the studio completed. OPPOSITE TOP: “When we put together the breakdown reel for our work on the film, we really wanted to make it have the same high energy and ‘epic-ness’ of the shots,” says editor Jeremy Szostak. OPPOSITE MIDDLE: “The breakdowns we made were a huge collaboration between the comp and editorial team,” notes Szostak. “Comp created thousands of layers for the editorial team to use for simpler 2D breakdowns. They then worked on the more complex 3D camera moves and animation for more complicated work.” OPPOSITE BOTTOM: Adds Szostak: “One section in particular was a combination of both of these techniques, where we use a custom 3D camera to ‘travel’ between several different shots and sequences that we worked on, on each shot stopping to do a 2D or 3D breakdown and then continuously moving back into the custom camera which travels to the next shot.”
The first decision to make in putting together a VFX breakdown is who to involve. At Image Engine, that is typically the visual effects supervisor, the visual effects executive producer, the marketing manager and the editorial department. But due to the nature of breakdowns – which often show the layers of elements that go into a shot – the compositing team and other departments can also be heavily involved. “Many times these breakdowns require us to create additional assets and renderings from the existing source material and as such they can be a little mini-production unto themselves,” explains Image Engine Visual Effects Executive Producer Shawn Walsh. Then it’s a matter of deciding what’s important about the work that needs to be showcased. “We consider that these reels are not only for the purposes of promotion, but also in order to recognize the hard work of our teams by submitting for awards season,” notes Walsh.
Stepping Through a Logan Breakdown Scene

For one action scene in Logan, a stunt double played the title character during filming and was then replaced with a photoreal digital Hugh Jackman, his head created by Image Engine. “When we created the breakdowns for Logan our goal was simply to show the extent to which our team went towards creating a completely believable digital Hugh Jackman,” outlines Image Engine’s Shawn Walsh. “It was important to us that anyone viewing the breakdowns had a sense for the pore-level detail that we put into the asset.” Here’s a look at some example frames from the limo sequence of Image Engine’s breakdown:

Plate: The original photography featuring the stunt performer wearing a few tracking markers.
“Picture-in-picture breakdowns and cutting to surrounding finals in context are just some of the ways that we try to keep the breakdowns more interesting.” —Ted Proctor, Editor, Image Engine
For this Power Rangers VFX breakdown, Image Engine sought to demonstrate the process involved in integrating their CG character, the robot assistant Alpha 5, into live-action scenes. (Power Rangers images copyright © 2017 Lions Gate Entertainment Inc. All rights reserved.)
Clean plate: Parts of the plate around the head are painted to deal with the movement of the CG head during the shot.

CG elements: Image Engine’s gray-shaded Hugh Jackman head, which also included simulated hair.

Lighting pass: The render pass out of Solid Angle’s Arnold showcased the use of sub-surface scattering for Hugh Jackman’s skin.

Final composite: The final composited frame with added motion blur.

In the breakdown, each of these frames wiped down in turn. This part of the reel also included run-throughs of the shot (forward and in reverse), and a quick side-by-side comparison of the before and after of the driving shot.
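The layer order described above follows the standard premultiplied “over” operation used throughout compositing. As a rough illustration only – this is not Image Engine’s pipeline, and the pixel values are invented – combining a CG element with a clean plate looks something like this:

```python
import numpy as np

def over(fg_rgb, fg_alpha, bg_rgb):
    """Premultiplied 'over': the foreground occludes the background by its alpha."""
    return fg_rgb + bg_rgb * (1.0 - fg_alpha)

# Toy one-pixel example: a half-opaque gray CG element over a blue clean plate.
cg = np.array([0.4, 0.4, 0.4])     # premultiplied CG element color
alpha = 0.5                        # coverage of the CG element
plate = np.array([0.0, 0.0, 1.0])  # clean plate behind the element
final = over(cg, alpha, plate)
print(final)  # [0.4 0.4 0.9]
```

In a real shot this happens per pixel across every layer of the stack, which is exactly what a breakdown reel peels apart for the viewer.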
STORYTELLING ABOUT THE STORY
So how do you then choose what goes into a breakdown reel? It can be subjective, suggests Image Engine Visual Effects Supervisor Joao Sita, who also notes that even before a show is complete there is much consideration about whether a particular shot may go into a making-of. “We start with a basic selection of the shots that represent the best work done or the shots in which we had a specific story point to tell,” he says. “Usually the VFX supervisor along with the department supervisors will browse through the shots and select them for promo, and from there we come up with an approach of how this will be put together.” Telling a story with the breakdown is often a key goal. Image Engine Visual Effects Supervisor Martyn “Moose” Culpitt, who oversaw the studio’s digital double and other VFX work for Logan, put together the breakdowns for that film himself. “I really wanted to tell the whole story and process involved so that everyone could understand the complexity of what we did. For me it’s not just looking at pretty pictures, it’s actually being able to understand the entire process behind them.” However, how much emphasis is placed on telling a story or just showcasing ‘cool’ shots can depend on who the breakdown reel is intended for, according to Sita. “If we are presenting a breakdown for future clients, you might show the whole process involved in the shot or sequence from concept to builds and so forth. This allows for a greater understanding of the process and how that process might affect their experience working with us.” On the other hand, says Sita, “If it’s a presentation for a bigger audience in which you can simplify how you present the work,
“When we created the breakdowns for Logan our goal was simply to show the extent to which our team went towards creating a completely believable digital Hugh Jackman.” —Shawn Walsh, Visual Effects Executive Producer, Image Engine All Logan images copyright © 2017 Marvel. TM and © 2017 Twentieth Century Fox Film Corporation. All rights reserved.
“I really wanted to tell the whole story and process involved so that everyone could understand the complexity of what we did. For me it’s not just looking at pretty pictures, it’s actually being able to understand the entire process behind them.” —Martyn Culpitt, Visual Effects Supervisor, Image Engine
Part of Image Engine’s Fantastic Beasts and Where to Find Them breakdown, which showcases the bluescreen stage plates, in-progress shots, and the final CG creature renders and compositing. (Fantastic Beasts and Where to Find Them images copyright © 2016 WBEI. Publishing Rights © J.K.R. All rights reserved.)
most of the time befores and afters are enough, along with some on-set type of information, such as how it was shot, camera type, and brushing in the whole process in a more superficial way.”
LOGISTICS, PROBLEMS AND SOLUTIONS
On most Image Engine VFX breakdowns, the editorial department gets involved early – the department also deals with key logistical issues like simply finding the right footage and layers, and making creative decisions about whether any new or additional layers might be required to add extra storytelling elements to a reel. That first part, finding material, can be time-consuming enough, especially collecting archived material or re-rendering elements, but it is more often than not confounded by the dreaded issue of studio permissions. Reels regularly need to be shown to the original filmmakers and/or film distributors before finalization. Music is also an incredibly important part of a VFX breakdown. It can often define the beats and rhythm of the reel, but of course requires clearances. According to Image Engine’s team, it’s best to know what might come up later and deal with it in the present. “At times the logistical aspects of demo reels can seem overwhelming,” admits Walsh. “From security approvals, to whether or not we are able to show an actor’s face, to whether our breakdown is illustrative of the overall marketing campaign for the film – we can face many delays for exclusions of material that we would otherwise like to use. Over the years we’ve gotten savvy about getting to know who at the studio will ultimately help us shepherd the approval process so that
we can capture the broadest amount of material from our work possible.”

“At times the logistical aspects of demo reels can seem overwhelming. From security approvals, to whether or not we are able to show an actor’s face, to whether our breakdown is illustrative of the overall marketing campaign for the film – we can face many delays for exclusions of material that we would otherwise like to use.” —Shawn Walsh, Visual Effects Executive Producer, Image Engine

WHAT TO INCLUDE, HOW TO INCLUDE IT
Collecting final shots, layers, b-roll, individual renders and all the other elements for a breakdown will hopefully be a matter of trawling through the material produced during a production. But for extra elements that form part of a breakdown reel, editorial at Image Engine tends to look to the compositing department. “The comp team will handle any breakdowns that require actual 3D work or delving into the actual scene files,” states Image Engine editor Jeremy Szostak. “An example of this would be a breakdown in which we stop on a frame and the camera pulls away from a model and spins around giving us a 360-degree view of the work that was done for a shot. For simpler wipes the comp team will supply layers of a shot to the editorial team and they will build the breakdown in their NLE [non-linear editing].” There are several ways that a typical shot might be shown being ‘made’ in a breakdown. Simple A over B wipes are common, as are re-times and freeze-frames to swap between layers (Walsh says Image Engine has typically relied on transitions that “don’t distract from the goal of the breakdown or aren’t totally out of keeping with the source material”). “Sometimes wipes can become too repetitive and the breakdowns become boring regardless of the complexity of the layers that are being transitioned to,” details Image Engine editor Ted Proctor. “We try to use some variation whenever possible to make the breakdowns less repetitive. Picture-in-picture breakdowns and cutting to surrounding finals in context are just some of the ways that we try to keep the breakdowns more interesting.” “Anything that helps the viewer to understand the process is valid and ultimately it comes down to the ‘how much time’ you can actually put on this before the company resources are allocated in different shows,” adds Sita. “Often times rendering additional elements, turntables, additional camera moves help with the dynamic of the reel as well as keeping it visually interesting.”
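The “A over B wipe” the editors describe is conceptually simple: one frame is progressively revealed over another. As a generic sketch – not tied to any particular NLE or to Image Engine’s tools, with the frame arrays invented for illustration – a left-to-right wipe between two frames can be written as:

```python
import numpy as np

def horizontal_wipe(frame_a, frame_b, progress):
    """Reveal frame_b over frame_a from left to right.

    progress 0.0 shows only frame_a; 1.0 shows only frame_b.
    """
    h, w = frame_a.shape[:2]
    edge = int(w * progress)           # x position of the wipe line
    out = frame_a.copy()
    out[:, :edge] = frame_b[:, :edge]  # everything left of the line comes from B
    return out

# Two tiny 4x8 'frames': A is all black, B is all white; wipe half-way across.
a = np.zeros((4, 8, 3))
b = np.ones((4, 8, 3))
mid = horizontal_wipe(a, b, 0.5)
print(mid[0, :, 0])  # [1. 1. 1. 1. 0. 0. 0. 0.]
```

In practice the “B” side is usually not a single frame but one of the many layers the comp team hands over, and the wipe line is animated across the shot’s duration.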
A common goal among the Image Engine team is making sure the result is comprehensible. Visual effects can be complex and can sometimes involve fast-moving cameras. Slowing the action down and pin-pointing a key moment can help the viewer understand both the shot and how it was made. “For a very dynamic shot with a lot of movement or animation happening,” explains Szostak, “you usually want to freeze on a frame and then do your breakdown on a frozen image. In contrast, if the shot is a locked off camera and a large wide of a big environment, it might work better to just let the shot play out in real-time and have the wipes happen simultaneously.”

TOOLS AND TRICKS OF THE TRADE
The tools Image Engine uses to actually put the breakdowns together vary. Editorial relies on familiar non-linear editing programs, with NUKE and Maya used for compositing and generating any extra 3D elements. Culpitt notes that the most complex breakdowns remain mainly in NUKE. “For Logan, for example,” says Culpitt, “we had one big breakdown that I actually did all the editing in NUKE myself and then rendered one long clip for editorial to add music to. That specific breakdown took a few weeks to create, as I had to source many different pieces from the show and even get the crew to render new passes for me.” Ultimately, a VFX breakdown needs to sum up the spectacle of a particular shot, or surprise the viewer by perhaps revealing how much of a shot involved visual effects. It’s often more than people realize. “First and foremost, a breakdown’s most important purpose is to show the extent of the work that was done on a shot,” asserts Szostak. “Once that is fully achieved, then we start to think about how to build the edit of the actual breakdown reel, which continues to forward this goal while also being enjoyable to watch.” Visual effects breakdown videos discussed in this article can be seen on Image Engine’s Vimeo page: https://vimeo.com/imageengine
PROFILE
NONNY DE LA PEÑA: PIONEERING VR AND IMMERSIVE JOURNALISM By NAOMI GOLDMAN
All photos courtesy of Emblematic Group
TOP: Nonny de la Peña
What if you could get inside the headlines and be one with the story? What if you could go beyond observing things on the street and experience emotion as if a situation is happening to you? Bearing witness to a violent attack, feeling the power of gunfire, being in the room with prisoners in jail cells and patients in clinics – these are real-life stories being brought to life through the dynamic field of virtual reality journalism. Nonny de la Peña, named “The Godmother of Virtual Reality” by The Guardian and Engadget and one of the 20 most influential Latinas in tech by CNET, is a pioneer of virtual reality and immersive journalism – a phrase she coined to describe the groundbreaking application of VR for social impact. As the founder and CEO of Emblematic Group, a leading VR/AR company that focuses on creating empathetic engagement, she has collaborated with PBS Frontline, Wall Street Journal, The New York Times, Planned Parenthood, the True Colors Fund and other organizations to create impactful virtual reality experiences depicting real-life events. Her VR projects include Across the Line, which helps viewers understand what some women go through to access reproductive health services, After Solitary, which takes viewers inside the Maine State Prison to experience a harrowing story of solitary confinement, and Out of Exile, which uses VR to draw attention to the plight of homeless LGBTQ youth. Other projects have explored Guantanamo Bay Prison, experiences of refugees and, most recently, the impact of climate change on the landscape of Greenland. VFX Voice experienced several of Emblematic’s projects in their Santa Monica, California studio and sat down with de la Peña to talk about the value of interactivity and her mission as a visionary storyteller seeking to foster empathy.

VFX Voice: You have a reputation as a boundary breaker who has changed the face of multimedia journalism. 
Tell us about your trajectory from mainstream news journalist to using VR for social impact.

de la Peña: As a correspondent for Newsweek and then a traditional documentary filmmaker, I was always interested in conveying stories about the underdogs or people you might not have access to in your day-to-day life. Much of journalism is about reporting the facts, the scene on the ground, describing the reactions and aftermath of often vexing situations. As journalists we work to generate critical engagement, but I wanted to go beyond the reporting and focus on the experience wherein people could virtually transport themselves. In 2004, I wrote and directed a documentary entitled Unconstitutional: The War on Our Civil Liberties, which examined the subjugation of American civil liberties following the events of 9/11. It had a big segment on Guantanamo Bay Prison, as I felt that the media was not adequately covering Gitmo. I was awarded a grant from The MacArthur Foundation and Bay Area Video Coalition to translate an existing documentary project into digital media. How do you report on a destination where you don’t have access? That was the catalyst. We built a virtual but accessible version in Second Life in 2007 (and later rebuilt it for our premiere at Unity Moscow
Museum) that threw people into detainee positions as they heard tormenting encounters with guards, and had them face the concept of habeas corpus. That technology is something I embraced and advanced to allow for that frame-breaking user experience. I think that anything that exposes you to differing viewpoints or culture or lifestyle is important and enriches our understanding of the human condition. So I moved further into negotiating the boundaries between conveying the news and creating opportunities for people to fully immerse themselves – sometimes as a victim, sometimes as a witness, or sometimes as the protagonist, but in all cases getting a real-life view from the ground.

“What if I could present you with a story that you would remember with your entire body and not just your mind? With VR, I can put you in a scene in the middle of a story you normally see on the news and lessen the gap between actual events and personal experience.” —Nonny de la Peña

VFX Voice: You’ve addressed many of the world’s most pressing social issues. Give us some insight into your most recent projects and what you were hoping they would accomplish.

de la Peña: Across the Line was produced in partnership with the Planned Parenthood Federation of America and premiered at the 2016 Sundance Film Festival as a walk-around experience. It helps viewers understand what some women go through to access abortion services. We wanted to place viewers in the shoes of a patient entering a health center to experience what they encounter and depict the toxic environment that many healthcare providers, health center staff and patients must endure to provide or access care on a daily basis. A true multimedia project, we used real audio gathered at protests, scripted scenes and documentary footage. We created a 360-degree version so the piece could be distributed as widely as possible. Out of Exile: Daniel’s Story is a powerful reminder of the kind of hostility faced by so many in the LGBTQ community. 
In this piece we wanted to shine a light on the fact that 40% of homeless youth in America identify as LGBTQ, with the majority coming from communities of color. We take you into the story of Daniel Ashley Pierce as he is confronted about his sexual orientation by his family in a “religious intervention,” an event that turns dramatic and violent. It was important to us to create a powerful reminder of the kind of hostility faced by so many in the LGBTQ community, but also end with a sense of optimism as Daniel and others share their personal accounts of triumphing over despair. We used videogrammetry to create holograms of Daniel and his peers and recreate the event using video captured by Daniel at the time. It was a challenging and emotionally raw experience because we were asking Daniel to recount the brutal attack, blow by blow,
TOP TO BOTTOM: Across the Line helps viewers understand what some women go through to access reproductive health services. Viewers are placed “in the shoes of a patient entering a health center to experience what many health-care providers, health center staff and patients must endure to provide or access care on a daily basis,” says de la Peña. The multimedia project premiered at the 2016 Sundance Film Festival as a walk-around experience.
SPRING 2018 VFXVOICE.COM • 25
3/5/18 4:14 PM
PROFILE
NONNY DE LA PEÑA: PIONEERING VR AND IMMERSIVE JOURNALISM By NAOMI GOLDMAN
All photos courtesy of Emblematic Group
TOP: Nonny de la Peña
24 • VFXVOICE.COM SPRING 2018
PG 24-29 NONNY.indd 24-25
What if you could get inside the headlines and be one with the story? What if you could go beyond observing things on the street and experience emotion as if a situation is happening to you? Bearing witness to a violent attack, feeling the power of gunfire, being in the room with prisoners in jail cells and patients in clinics – these are real-life stories being brought to life through the dynamic field of virtual reality journalism. Nonny de la Peña, named “The Godmother of Virtual Reality” by The Guardian and Engadget and one of the 20 most influential Latinas in tech by CNET, is a pioneer of virtual reality and immersive journalism – a phrase she coined to describe the groundbreaking application of VR for social impact. As the founder and CEO of Emblematic Group, a leading VR/AR company that focuses on creating empathetic engagement, she has collaborated with PBS Frontline, Wall Street Journal, The New York Times, Planned Parenthood, the True Colors Fund and other organizations to create impactful virtual reality experiences depicting real-life events. Her VR projects include Across the Line, which helps viewers understand what some women go through to access reproductive health services, After Solitary, which takes viewers inside the Maine State Prison to experience a harrowing story of solitary confinement, and Out of Exile, which uses VR to draw attention to the plight of homeless LGBTQ youth. Other projects have explored Guantanamo Bay Prison, experiences of refugees and, most recently, the impact of climate change on the landscape of Greenland. VFX Voice experienced several of Emblematic’s projects in their Santa Monica, California studio and sat down with de la Peña to talk about the value of interactivity and her mission as a visionary storyteller seeking to foster empathy. VFX Voice: You have a reputation as a boundary breaker who has changed the face of multimedia journalism. 
Tell us about your trajectory from mainstream news journalist to using VR for social impact. de la Peña: As a correspondent for Newsweek and then a traditional documentary filmmaker, I was always interested in conveying stories about the underdogs or people you might not have access to in your day-to-day life. Much of journalism is about reporting the facts, the scene on the ground, describing the reactions and aftermath of often vexing situations. As journalists we work to generate critical engagement, but I wanted to go beyond the reporting and focus on the experience wherein people could virtually transport themselves. In 2004, I wrote and directed a documentary entitled Unconstitutional: The War on Our Civil Liberties, which examined the subjugation of American civil liberties following the events of 9/11. It had a big segment on Guantanamo Bay Prison, as I felt that the media was not adequately covering Gitmo. I was awarded a grant from The MacArthur Foundation and Bay Area Video Coalition to translate an existing documentary project into digital media. How do you report on a destination where you don’t have access? That was the catalyst. We built a virtual but accessible version in Second Life in 2007 (and later rebuilt it for our premiere at Unity Moscow
“What if I could present you with a story that you would remember with your entire body and not just your mind? With VR, I can put you in a scene in the middle of a story you normally see on the news and lessen the gap between actual events and personal experience.” —Nonny de la Peña Museum) that threw people into detainee positions as they heard tormenting encounters with guards, and had them face the concept of habeas corpus. That technology is something I embraced and advanced to allow for that frame-breaking user experience. I think that anything that exposes you to differing viewpoints or culture or lifestyle is important and enriches our understanding of the human condition. So I moved further into negotiating the boundaries between conveying the news and creating opportunities for people to fully immerse themselves – sometimes as a victim, sometimes as a witness, or sometimes as the protagonist, but in all cases getting a real-life view from the ground. VFX Voice: You’ve addressed many of the world’s most pressing social issues. Give us some insight into your most recent projects and what you were hoping they accomplish. de la Peña: Across the Line was produced in partnership with the Planned Parenthood Foundation of America and premiered at the 2016 Sundance Film Festival as a walk-around experience. It helps viewers understand what some women go through to access abortion services. We wanted to place viewers in the shoes of a patient entering a health center to experience what they encounter and depict the toxic environment that many healthcare providers, health center staff and patients must endure to provide or access care on a daily basis. A true multimedia project, we used real audio gathered at protests, scripted scenes and documentary footage. We created a 360-degree version so the piece could be distributed as widely as possible. Out of Exile: Daniel’s Story is a powerful reminder of the kind of hostility faced by so many in the LGBTQ community. 
In this piece we wanted to shine a light on the fact that 40% of homeless youth in America identify as LGBTQ, with the majority coming from communities of color. We take you into the story of Daniel Ashley Pierce as he is confronted about his sexual orientation by his family in a “religious intervention,” an event that turns dramatic and violent. It was important to us to create a powerful reminder of the kind of hostility faced by so many in the LGBTQ community, but also end with a sense of optimism as Daniel and others share their personal accounts of triumphing over despair. We used videogrammetry to create holograms of Daniel and his peers and recreate the event using video captured by Daniel at the time. It was a challenging and emotionally raw experience because we were asking Daniel to recount the brutal attack, blow by blow,
TOP TO BOTTOM: Across the Line helps viewers understand what some women go through to access reproductive health services. Viewers are placed “in the shoes of a patient entering a health center to experience what many health-care providers, health center staff and patients must endure to provide or access care on a daily basis,” says de la Peña. The multimedia project premiered at the 2016 Sundance Film Festival as a walk-around experience.
SPRING 2018 VFXVOICE.COM • 25
PROFILE
“I moved further into negotiating the boundaries between conveying the news and creating opportunities for people to fully immerse themselves – sometimes as a victim, sometimes as a witness, or sometimes as the protagonist, but in all cases, getting a real-life view from the ground.” —Nonny de la Peña
“I think a lot of the power comes from what happens after you take off the goggles and have to confront your feelings with others around you. People are pushed out of their comfort zones and cannot look away from difficult situations. I think that sense of empathy drew others to the field.” —Nonny de la Peña
sharing with us who hit him and how, so that we could reenact the event with motion capture and stay as true to the actual experience as possible. Having his words as a guidepost is a really important part of our process to instill the experience with integrity and authenticity. And on the heels of the United States’ withdrawal from the Paris Climate Agreement, we created Greenland Melting in collaboration with PBS Frontline and NOVA. The piece provides a rare, up-close view of the icy Arctic scenery that’s disappearing faster than predicted. We used a 360 Ozo camera clamped onto the edge of a helicopter, allowing us to create that vantage point so that users are viewing the landscape as if they are in the helicopter themselves.
the treatment of these women and healthcare workers – and that is significant in terms of creating a meaningful sense of understanding by literally walking in someone else’s shoes.
VFX Voice: You’ve said that ‘as the world becomes increasingly global and our online and offline lives become increasingly integrated, it is critical to convey stories that create empathy and preserve our humanity.’ What can virtual reality teach us about empathy?
TOP: Out of Exile: Daniel’s Story uses VR to draw attention to the plight of homeless LGBTQ youth. De la Peña calls Daniel Ashley Pierce’s story a “powerful reminder of the kind of hostility faced by so many in the LGBTQ community.” But the VR ends “with a sense of optimism as Daniel and others share their personal accounts of triumphing over despair.” Videogrammetry was used to create holograms of Daniel and his peers and recreate the event using video captured by Daniel at the time. OPPOSITE PAGE: Greenland Melting explores the impact of climate change on Greenland, providing a rare, up-close view of the rapidly disappearing arctic landscape. A 360 Ozo Camera clamped onto the edge of a helicopter gives users the vantage point of viewing the landscape as if they’re in a helicopter themselves.
de la Peña: What if I could present you with a story that you would remember with your entire body and not just your mind? With VR, I can put you in a scene in the middle of a story you normally see on the news and lessen the gap between actual events and personal experience. In 2012, we premiered the first virtual reality experience at the Sundance Film Festival, Hunger in Los Angeles. In the piece, animated characters stand in line outside a soup kitchen that has run out of food as real audio plays – and you watch as a man collapses in a diabetic seizure. We had no idea what to expect. But the emotion was overwhelming. People were crying their eyes out in a way I haven’t seen since then on any of our projects. I think a lot of the power comes from what happens after you take off the goggles and have to confront your feelings with others around you. People are pushed out of their comfort zones and cannot look away from difficult situations. I think that sense of empathy drew others to the field. It was during this project that I introduced Alejandro (Academy Award-winning director Alejandro González Iñárritu) to the medium, and I’m really inspired by the powerful immersive works he has created. Here’s another good example. Planned Parenthood took the Across the Line piece into communities with a strong anti-abortion stance and after seeing it, many came out feeling that women should not have to experience that kind of hateful vitriol. It didn’t change their views on abortion per se, but in some cases it opened their eyes to
VFX Voice: What is your approach to starting a VR project? de la Peña: At the beginning of any project, I think about how I would experience a situation with my whole body, using all of my senses. I literally close my eyes and place myself right into the scene. I walk around the scene and think about what I would feel and hear and see and what emotions would be running through my body as the story unfolds around me. I am always thinking about the spatial nature of the narrative and try to capture and absorb the situation. Think about how your experience changes if you see something through a car window vs. being right there on the street. Or how your point of view changes if you’re scared or hungry or tired. If you’re a bystander or a victim or a perpetrator of a crime. Telling people about something you have heard or watched is just less visceral than describing it first-person – and that sense of presence is what we strive for. In creating After Solitary, we used photogrammetry. We took a huge number of photos of a solitary cell from every conceivable angle and then stitched them together. Then we used volumetric capture, photographing with multiple cameras and stitching the frames together. We captured video of an actual inmate and then dropped him into the solitary cell with you so that he and you are moving together around the cell as he tells his story in that confined space. Viewers come out with a very raw understanding of that restricted environment and its psychological impact. VFX Voice: Are there best practices for VR journalism? What’s an example of the balancing act you managed between VR technology and authenticity? de la Peña: Maintaining authenticity and integrity is of the utmost importance as we create these virtual experiences. Our projects are guided by real-life accounts, and having captured audio and video footage from real events is a critical marker for our simulated environments and animated characters.
It’s something I’m always thinking about as we negotiate boundaries.
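The photogrammetry process de la Peña describes – many overlapping photos of the cell resolved into a single 3D space – rests on one geometric idea: a feature seen from two cameras with known positions pins down a 3D point. The sketch below is purely illustrative (it is not Emblematic Group's pipeline, and the function name is hypothetical); it triangulates a point as the midpoint of the shortest segment between two sight rays.

```python
# Illustrative sketch of the geometric core of photogrammetry:
# recover a 3D point from the same feature sighted by two cameras
# with known positions. Production tools do this for thousands of
# matched features across hundreds of photos.

def triangulate(origin_a, dir_a, origin_b, dir_b):
    """Midpoint of the shortest segment between two 3D rays
    (each ray: camera position + viewing direction to the feature)."""
    def sub(u, v): return tuple(x - y for x, y in zip(u, v))
    def dot(u, v): return sum(x * y for x, y in zip(u, v))
    def along(p, d, t): return tuple(x + t * y for x, y in zip(p, d))

    w = sub(origin_a, origin_b)
    a, b, c = dot(dir_a, dir_a), dot(dir_a, dir_b), dot(dir_b, dir_b)
    d, e = dot(dir_a, w), dot(dir_b, w)
    denom = a * c - b * b          # near zero => rays are parallel
    t_a = (b * e - c * d) / denom  # parameter of closest point on ray A
    t_b = (a * e - b * d) / denom  # parameter of closest point on ray B
    p_a, p_b = along(origin_a, dir_a, t_a), along(origin_b, dir_b, t_b)
    return tuple((x + y) / 2 for x, y in zip(p_a, p_b))

# Two cameras, two meters apart, both sight a feature at (1, 1, 5).
point = triangulate((0, 0, 0), (1, 1, 5), (2, 0, 0), (-1, 1, 5))
```

With perfect sight rays the two closest points coincide at the feature itself; with real, noisy photographs they do not, which is why the midpoint (and many redundant views) is used.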
For example, in Greenland Melting we needed to drop in holograms of NASA scientists. The tricky question – what should they wear? We didn’t want them to dress in parkas since they were shot against a greenscreen, and we thought that wearing a parka would suggest the experts were actually shot at the glacier. It’s details like that that can raise questions about journalistic integrity, and we take painstaking measures to address them properly. We are currently working on a paper on best practices, and I have recently joined The Aspen Institute and Knight Foundation’s Knight Commission on Trust, Media and Democracy to confront important questions around responsible immersive journalism and journalistic integrity, given the subjectivity of experience and editorial control. One of our priority goals is to identify the perennial and emerging values and social obligations that should guide those who produce, distribute and consume news and information to ensure a functioning democracy. VFX Voice: What’s next for you and Emblematic Group? de la Peña: We have another project in the works with PBS Frontline on cross-contamination of DNA that we’re excited to share. And we just got a $250,000 grant to start building a platform where people can film their own stories. Called REACH, it’s a platform for hosting and distributing 3D models of locations that news organizations can use to create innovative and cost-effective “walk-around” VR content. You film the story and we will provide the volumetric locations that can be dropped in. And something fun and totally different – we’re working on SLASH Presents Trashed! A VR Game of Epic Hotel Destruction. So we’re building a game MC’d by legendary rocker Slash that includes a lot of crazy antics, from cars being driven into swimming pools to throwing TVs over balconies. And of course it will also be multiplayer, because, come on, you can’t nail a bed to the ceiling without your friends, right?
TOP AND BOTTOM: Hunger in Los Angeles was the first virtual reality experience to premiere at the Sundance Film Festival, in 2012. In the piece, animated characters stand in line outside a soup kitchen that has run out of food as real audio plays – and you watch as a man collapses in a diabetic seizure. “We had no idea what to expect,” says de la Peña. “But the emotion was overwhelming. People were crying their eyes out in a way I haven’t seen since then on any of our projects.”
VFX Voice: What excites you about the future of immersive journalism? de la Peña: I’m excited by the potential of VR to change the way that mainstream media tell stories. I think that as this area evolves and technology becomes ever more accessible, journalists will more organically pivot towards VR as the best way to tell a particular kind of emotional, place-based story. And I’m excited that one day you’ll be interviewing Virtual Nonny – a fantastic avatar that I’ll puppeteer from home! TOP TO BOTTOM: After Solitary takes viewers inside the Maine State Prison to experience a harrowing story of solitary confinement. Photos were taken of a solitary cell from every conceivable angle and then stitched together. The filmmakers, says de la Peña, “captured video of an actual inmate and then dropped him into the solitary cell with you so that he and you are moving together around the cell as he tells his story in that confined space. Viewers come out with a very raw understanding of that restricted environment and its psychological impact.” At bottom is the inmate’s living space upon leaving prison. He can’t shake the idea of living in something other than a small, confined space, so he spends most of his post-prison time in his bedroom.
“At the beginning of any project, I think about how I would experience a situation with my whole body, using all of my senses. I literally close my eyes and place myself right into the scene. I walk around the scene and think about what I would feel and hear and see and what emotions would be running through my body as the story unfolds around me. I am always thinking about the spatial nature of the narrative and try to capture and absorb the situation.” —Nonny de la Peña
FILM
AN ALTERED REALM OF BEING (AND BEINGS) HAUNTS ANNIHILATION By KEVIN H. MARTIN
All images copyright © 2017 Paramount Pictures. All rights reserved.
Adapted from an acclaimed science-fiction novel by Jeff VanderMeer, Annihilation posits an unusual form of invasion – one in which a section of our world, dubbed ‘Area X,’ has become altered, with natural laws being rewritten, owing to an apparently extraterrestrial influence. A biologist played by Natalie Portman is chosen to lead the twelfth expedition into the affected area, but once inside, the team experiences technical malfunctions while encountering strange and dangerous lifeforms. In interviews, writer/director Alex Garland has stated Annihilation begins in suburbia and ends in psychedelia, which sounds like an inspired prescription for any cinematic hero’s journey. The filmmaker re-teamed with a number of his Ex Machina collaborators, including VFX Supervisor Andrew Whitehurst, lead VFX house Double Negative and Milk VFX, plus special makeup effects by Tristan Versluis. Other collaborators on the film include Union VFX and Nvizible, with Plowman Craven & Associates handling LIDAR surveys and scanning.
Whitehurst, who began his feature career with the second Lara Croft Tomb Raider film, worked for Framestore before joining DNEG, where he subsequently amassed credits on the Harry Potter saga and Hellboy II: The Golden Army. He supervised 3D work on Scott Pilgrim vs. the World before supervising Ex Machina, which won critical acclaim and earned his team a Best Visual Effects win at the Oscars. Garland first broached the subject of Annihilation with Whitehurst as work on Ex Machina was finishing up, but there was a lengthy gestation before the project was to go before the cameras. “He sent me his first draft for Annihilation a year prior to pre-production. I was halfway up a mountain shooting Spectre when he sent the draft,” Whitehurst recalls. “Development was a very fluid process, with the key players discussing artistic as well as nuts-and-bolts aspects while Alex continued to refine the script. Some of the creatures changed significantly as a result of this input as he moved from first draft to final.” For Whitehurst, one of the best parts of this collaboration was being able to build on the existing relationships among the creative principals. “We knew Alex, director of photography Rob Hardy and the people in the art department from the last go-round, so all the getting-to-know-you conversations were out of the way, allowing us to hit the ground running,” he relates. “This facilitated the free flow of ideas and working out the visual ideas that merited exploration. Alex and Rob drove a lot of early conversations about visual ideas, based upon camera movement and composition. It’s absolutely necessary to decide some things up front, and the camera part of that was crucial, but we built a flexibility into the process that let all of us take advantage of new ideas on the day.” The way storyboards were used reflects that ability to adapt to new inspirations on set and even during post, while also allowing the production to make the most of the available dollars.
“While we had storyboarded a lot,” notes Whitehurst, “it wasn’t a prescriptive process that locked us into anything. It was often more about getting us thinking about how we would structure shots. On set, it might occur to Alex that the camera needed to be over there to ensure the right information in frame. And when you don’t have limitless funds, every single artist has to make sure the available budget all turns up onscreen. It was great for us that Alex could view a creature in grayscale without texturing that was just a playblast over the live action and evaluate it well enough to approve the move. “The fact our director could draw also helped enormously,” continues Whitehurst. “Using iPads with Procreate, I could send him an idea and get a file back with a drawing he made on top of mine, then enter that into the Shotgun database for artists and let them know this is what we want, without a ton of iterations.” This was of great significance when it came to the design of various creatures encountered within Area X, which Whitehurst characterizes as carrying a “meditative oddness and beauty” along with a highly original element. “I’ve often described the film to people as Andrei Tarkovsky’s Stalker meets The Thing,” he elaborates, “though there are elements of David Cronenberg’s The Fly as well. The concept
OPPOSITE TOP: A view of the Shimmer. OPPOSITE BOTTOM: A biologist played by Natalie Portman finds herself in a decidedly un-Earthlike dominion in Alex Garland’s Annihilation. In addition to the now-traditional CGI work, Double Negative (DNEG) also made use of a number of practically-shot visual aberrations to enhance the bizarre qualities of Area X. TOP AND MIDDLE: The team takes a closer look at how the impact of Area X on terrestrial biology has mutated a gator-like critter. As was the case on the previous DNEG/Garland project, Ex Machina, a seamless hand-off between visual and practical effects was mandated throughout.
“It was great for us that [writer/director] Alex [Garland] could view a creature in grayscale without texturing that was just a playblast over the live action and evaluate it well enough to approve the move.” —Andrew Whitehurst, Visual Effects Supervisor
SPRING 2018 VFXVOICE.COM • 31
3/5/18 4:14 PM
FILM
AN ALTERED REALM OF BEING (AND BEINGS) HAUNT ANNIHILATION By KEVIN H. MARTIN
All images copyright © 2017 Paramount Pictures. All rights reserved.
30 • VFXVOICE.COM SPRING 2018
PG 30-34 ANNIHILATION.indd 30-31
Adapted from an acclaimed science-fiction novel by Jeff VanderMeer, Annihilation posits an unusual form of invasion – one in which a section of our world, dubbed ‘Area X,’ has become altered, with natural laws being rewritten, owing to an apparently extraterrestrial influence. A microbiologist played by Natalie Portman is chosen to lead the twelfth expedition into the affected area, but once inside, the team experiences technical malfunctions while encountering strange and dangerous lifeforms. In interviews, writer/director Alex Garland has stated Annihilation begins in suburbia and ends in psychedelia, which sounds like an inspired prescription for any cinematic hero’s journey. The filmmaker re-teamed with a number of his Ex Machina collaborators, including VFX Supervisor Andrew Whitehurst, lead VFX house Double Negative and Milk VFX, plus special makeup effects by Tristan Versluis. Other collaborators on the film include Union VFX and Nvisizble, with Plowman Craven & Associates handling LIDAR surveys and scanning.
Whitehurst, who began his feature career with the second Lara Croft Tomb Raider film, worked for Framestore before joining DNEG, where he subsequently amassed credits on the Harry Potter saga and Hellboy II: The Golden Army. He supervised 3D work on Scott Pilgrim Vs. the World before supervising Ex Machina, which won critical acclaim and earned his team a Best Visual Effects win at the Oscars. Garland first broached the subject of Annihilation with Whitehurst as work on Ex Machina was finishing up, but there was a lengthy gestation before the project was to go before the cameras. “He sent me his first draft for Annihilation a year prior to pre-production. I was halfway up a mountain shooting Spectre when he sent the draft,” Whitehurst recalls. “Development was a very fluid process, with the key players discussing artistic as well as nuts-andbolts aspects while Alex continued to refine the script. Some of the creatures changed significantly as a result of this input as he moved from first draft to final.” For Whitehurst, one of the best parts of this collaboration was being able to build on the existing relationships among the creative principals. “We knew Alex, director of photography Rob Hardy and the people in the art department from the last go-round, so all the getting-to-know-you conversations were out of the way, allowing us to hit the ground running,” he relates. “This facilitated the free flow of ideas and working out the visual ideas that merited exploration. Alex and Rob drove a lot of early conversations about visual ideas, based upon camera movement and composition. It’s absolutely necessary to decide some things up front, and the camera part of that was crucial, but we built a flexibility into the process that let all of us take advantage of new ideas on the day.” The way storyboards were used reflects that ability to adapt to new inspirations on set and even during post, while also allowing the production to make the most of the available dollars. 
“While we had storyboarded a lot,” notes Whitehurst, “it wasn’t a prescriptive process that locked us into anything. It was often more about getting to thinking about how we would structure shots. On set, it might occur to Alex that the camera needed to be over there to ensure the right information in frame. And when you don’t have limitless funds, every single artist has to make sure the budget available does all turn up onscreen. It was great for us that Alex could view a creature in grayscale without texturing that was just a playblast over the live action and evaluate it well enough to approve the move. “The fact our director could draw also helped enormously,” continues Whitehurst. “Using iPads with Procreate, I could send him an idea and get a file back with a drawing he made on top of mine, then enter that into the Shotgun database for artists and let them know this is what we want, without a ton of iterations.” This was of great significance when it came to the design of various creatures encountered within Area X, which Whitehurst characterizes as carrying a “meditative oddness and beauty” along with a highly original element. “I’ve often described the film to people as Andrei Tarkovsky’s Stalker meets The Thing,” he elaborates, “though there are elements of David Cronenberg’s The Fly as well. The concept
OPPOSITE TOP: A view of the Shimmer. OPPOSITE BOTTOM: A biologist played by Natalie Portman finds herself in a decidedly un-Earthlike dominion in Alex Garland’s Annihilation. In addition to the now-traditional CGI work, Double Negative (DNEG) also made use of a number of practically-shot visual aberrations to enhance the bizarre qualities of Area X. TOP AND MIDDLE: The team takes a closer look at how the impact of Area X on terrestrial biology has mutated a gator-like critter. As was the case on the previous DNEG/Garland project, Ex Machina, a seamless hand-off between visual and practical effects was mandated throughout.
“It was great for us that [writer/director] Alex [Garland] could view a creature in grayscale without texturing that was just a playblast over the live action and evaluate it well enough to approve the move.” —Andrew Whitehurst, Visual Effects Supervisor
SPRING 2018 VFXVOICE.COM • 31
FILM
TOP LEFT: The twelfth expedition sent to probe Area X stands before a ‘Shimmer’ field encompassing the phenomena. DNEG was lead VFX house on Annihilation, conceptualizing visual manifestations and mutations encountered within, as well as the Shimmer itself. TOP RIGHT: Live-action plate of the same shot. MIDDLE LEFT: Among the mutations encountered by the team is a strangely mutated gator. Alterations to earthly lifeforms were developed at DNEG, then further refined through sketches made by director Alex Garland, with Procreate employed to facilitate the review process. MIDDLE RIGHT: Plate shot reveals the physical representation of the mutated alligator. Whenever possible, these were used both for lighting reference and to provide the cast and camera operators with a presence that facilitated performance and composition of shots. BOTTOM LEFT: Another view of the CG gator as it approaches the shack inhabited by the exploratory team. BOTTOM RIGHT: Photographing the prop version of the creature in the water provided excellent reference for mimicking how an animal interacts with water. OPPOSITE TOP ROW: Final comp of the gator in closeup. OPPOSITE MIDDLE ROW: Strange new denizens of Area X … or are they scarecrows? OPPOSITE BOTTOM ROW: More examples of how Area X is introducing unusual changes to the earthly denizens inhabiting that realm.
is that the animal DNA is being messed with, so we looked at references of tumors and cancers to see what effects those have on how biology functions. Then that got rolled back into our creature design. In a science-fiction film, you can push things quite a long way into the realm of the fantastic, but for good cinematic storytelling, the way a thing behaves in a real-world setting has to make it recognizable for audiences to latch onto it and accept it, so grounding these animals in reality was always a factor for us.” As a result, physical presences were shot on set for nearly every incarnation of creature depicted. “Even though this created more work for VFX, in terms of having to paint things out, what we gained in terms of lighting and composition from having a physical object on set was tremendous,” Whitehurst maintains. “Then there’s the obvious benefit for our actors having something to work with, but all the wonderful bits of interactivity you get with light falling on a form that is properly framed in the shot can be even more important to the success of the effect.” One sequence involves a large creature in a confined space interacting with the investigating team. “If we’d shot this without a physical creature, the scene would have had no power,” states Whitehurst. “But with production having a big guy there,
presenting in a roughly correct form – with the right mass and presence, which is so much superior to just the old ball-on-a-stick-for-eyeline approach – it allowed us to choreograph the scene. Rob used very directional lighting in the scene, which was casting strong shadows on the floor, so that was another aspect we were able to keep when we dressed in the CG.” The physical on-set presence also informed the editorial effort. “When Alex and editor Barney Pilling are cutting the scene, they can make a finer edit than if we’d only been shooting empty frames. I’ve noticed on VFX films that there is a strong tendency for scenes to be cut faster and faster when there’s nothing yet comped into the frame; this is because the thing that should be making the scene interesting is absent. So to keep pace in the edit, the instinct is to cut too quickly. But if you can see something actually happening, sensing the true rhythm of the scene and being able to build on that in a sensible fashion is much more straightforward. Then, when we drop in animatics of our work on top, it doesn’t change the whole feel of the sequence or trigger a re-think, because the correct feeling has been visible in there all along.” Everything taking place within the Shimmer – the territory encompassed by Area X – was a concern for the VFX teams. “There
was a ton of environmental work required to add the necessary strangeness,” says Whitehurst. “There’s a psychedelic multi-dimensional element that appears toward the end that had to be thought out and designed. There’s very little dialogue during that experience, and we had to make a lot of creative decisions before shooting it. Then it became necessary to revisit those decisions after evaluating the performances of the actors. It was important to find exactly what we could take from those performances to inform the effects we’d later create. In one instance, the performance was so strong and engaging that it suggested an effect far different than what we had imagined delivering. So being light on our feet and flexible was the key to dealing with this aspect.” Translating difficult concepts into cinematic reality in a credible fashion is part and parcel of fantasy filmmaking, but that was even more difficult on Annihilation given that this is happening here on Earth in a modern setting. “There is a presence encountered in the film, but its physical manifestation is difficult for anybody to wrap his head around in any kind of rational way,” Whitehurst reveals. “With how it is created, the form is mathematical; but when you see it, the thing comes off as being psychedelic.” Even with all the challenges of depicting never-before-seen
vistas, Double Negative was able to rely primarily on established technologies and approaches. “Without a really massive budget, we had to be very smart about how we invested our resources,” he states. “We were a bit leery of putting the resources into creating custom software for specific looks, just because that look might not wind up surviving. Mostly we used standard packages. And to be honest, when using Houdini, you can do a lot within the software, so you don’t find yourself writing custom code that would be necessary in other circumstances. We also leveraged a lot of development from past DNEG shows, pulling those into our pipeline.” The company also does what it can to anticipate needs during the digital intermediate process, especially with the advent of HDR. “We deliver DPXs that have to be able to handle the roll-off on the highlights with HDR. Just on a QC basis, we always look at everything from four stops up to four stops under, so that when stuff gets pushed around in the DI, we know it won’t get pushed so far that it breaks.” As a matter of course, the VFX house nearly always includes practically-shot elements to enhance natural effects. But this time out, the practical aspect expanded in a way that was fun and visually rewarding. “To explore the psychedelic aspect during camera testing, we spent a day in prep when everybody brought in objects with interesting optical properties. We got some unusual lights
TOP ROW: Even the extensive efforts by the art department on location were enhanced with VFX. BOTTOM ROW: Some of the greenery within the Shimmer is anything but just green. DNEG employed standard packages, including Houdini for animation, while also leveraging R&D from past productions.
“The fact our director could draw also helped enormously. Using iPads with Procreate, I could send him an idea and get a file back with a drawing he made on top of mine, then enter that into the Shotgun database for artists and let them know this is what we want, without a ton of iterations.” —Andrew Whitehurst, Visual Effects Supervisor

and, using strange lenses, experimented with how we could get various prismatic and flaring effects, and wound up shooting a few hours’ worth of material. We sifted through that much later and made selects, assembling a library of these odd optical phenomena. These got dressed into frames to add natural/organic glints in an aesthetically-driven, rather than physically-driven, way, so a lot of the weirdness you see in terms of light effects, while added digitally, was photographed optical elements.” Whitehurst tends not to differentiate between traditional effects and digital when it comes to creativity. “I’ve always had a foot in both camps, so whether you’re writing code or moving two prisms next to a strobe light while shooting high-speed, it’s all good, going towards the end of creating beautiful imagery that helps make the story. I’m not going to lie; that’s one of the best parts of the job. That, and the surprises along the way – the emergent property of the efforts from everybody working on the film – they ultimately always result in something far different than what I first imagined when starting a project. Everybody’s efforts combine, and you hope it turns out to be greater than the sum of its parts.”
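The four-stops-over to four-stops-under QC sweep Whitehurst describes can be sketched in a few lines. This is a minimal illustration under stated assumptions, not DNEG's pipeline: it assumes linear-light float pixel data (one photographic stop being a factor of two in linear light), and the function names `expose` and `qc_stop_sweep` and the clip threshold are hypothetical.

```python
import numpy as np

def expose(img_linear, stops):
    """Scale a linear-light image by a number of photographic stops.

    One stop is a factor of two in linear light: +4 stops multiplies
    pixel values by 16, -4 stops divides them by 16.
    """
    return img_linear * (2.0 ** stops)

def qc_stop_sweep(img_linear, limit=4, clip_point=1.0):
    """Sweep a frame from `limit` stops under to `limit` stops over,
    reporting per stop the fraction of pixels pushed past `clip_point`."""
    report = {}
    for stops in range(-limit, limit + 1):
        pushed = expose(img_linear, stops)
        report[stops] = float((pushed > clip_point).mean())
    return report
```

A shot whose highlights clip somewhere inside the sweep range is the kind of element that, in Whitehurst's terms, could "break" once pushed around in the DI.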
COVER
A WRINKLE IN TIME: ‘THE EMOTIONAL VISUAL EFFECTS SHOW’ By TREVOR HOGG
American filmmaker Ava DuVernay (Selma) was presented with an opportunity to produce a Hollywood blockbuster with the adaptation of A Wrinkle in Time. The young adult fantasy tale revolves around a 13-year-old girl traveling through time and space to rescue her imprisoned physicist-father from an expanding evil presence, with help from her genius brother, a classmate and three celestial beings. Budgeted at $103 million and featuring a cast that includes Oprah Winfrey, Reese Witherspoon, Mindy Kaling, Gugu Mbatha-Raw, Michael Peña, Storm Reid, Zach Galifianakis and Chris Pine, the film confronted DuVernay with extensive visual and special effects for the first time. As a result, she hired Visual Effects Supervisor Rich McBride (The Revenant) to oversee the digital wizardry and Special Effects Supervisor Mark Hawker (Terminator Genisys) to guide the practical trickery. “Directors say that they have a willingness to learn and understand the process. In Ava’s case she was honest in wanting to do so,” notes McBride. “I showed her all of the different pieces and components that go into a visual effects shot. You might have effects passes or animation that is grey-shaded or in wireframe.” DuVernay did have a limit when it came to learning about the intricacies. “When building creatures, you have to build the skeletal structure and the muscle systems. Ava was funny because she never wanted to know about the insides of the creatures!” McBride was brought in early to the process of rethinking the book authored by Madeleine L’Engle. “Ava and I have a good line of communication with one another that we’ve built up. During the post-production phase I was comfortable showing her work that was unfinished but was also in pieces. I’d say, ‘Don’t look at this but at the composition of the background.
How do you feel about the movement of this particular aspect here?’ I could usually focus Ava fairly well, which was useful because I felt like I could put a lot more work in front of her than I would normally do with a director.” A solid foundation of concept art was generated by the production art department led by Production Designer Naomi Shohan (The Lovely Bones). “There were definitely some things that came into the fantasy realm that we didn’t quite have any visual concepts for,” states McBride. “That usually fell to the vendor, whether it was ILM or MPC, or even to outside concept artists, to say, ‘When Mrs. Which first comes into the backyard she is in what they described as a vaporous form.’ We went through a lot of iterations of concept art and used different effects passes to start building things up. In some cases, we pulled things from our original concept work, like the design of our flowers on [fantasy planet] Uriel.” Storyboards and previs were produced to get a better sense of what the action was going to be for some scenes. “Ava didn’t always adhere strictly to those. We covered a lot of the same action with multiple cameras – on average three, sometimes four – which can be tricky for visual effects.” “One of our worlds ended up being pieces of set with a lot
of bluescreen around it,” reveals McBride. “But we did do extensive shooting in New Zealand for our exterior Uriel work. We ended up doing a lot of roto work so that Ava could get the takes that she wanted. Also, working with child actors you are oftentimes restricted on the time of day and how long you’ve got with them, so you’re moving quickly and having to make quick calls on the day to say, ‘Don’t worry about putting the bluescreen up here. We’re going to manage without it.’ Those were things that we had to be flexible with.” No major technology developments were needed to complete the shots. “We have heavy effects simulations. The creature work is straightforward Maya. Each of the vendors used their own proprietary packages for any of their animation and rendering. For the most part compositing was done across the board using NUKE.” A key element of the story is the ability of the characters to travel through a tesseract, or wormhole, to go from one world to the next. “We had definitely looked at Interstellar and what has been done in past movies as far as traveling through space and time,” remarks McBride. “That was our big challenge. Ava wanted something that wasn’t a space tube or porthole. It had to be somewhat grounded in the natural world yet look magical. Our effect is that we’re almost distorting the world around them, rippling them with wave patterns that distort the physical world itself.” Sine waves were a big influence. “The movement that we ended up going with had a more languid, beautiful, liquid-light look.” Creatures had to be reimagined, such as the Centaur-like being from the novel. “We were transforming one of the Missuses, so we wanted it to be something of the world that they were in. We went with the idea that this was a lush planet, so we have a flying leaf creature.” “For the monster on Camazotz we opted for something that was more about the environment coming after the kids,” remarks McBride.
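The finished tesseract effect was far more elaborate, but the core idea of a sine-driven wave pattern distorting the frame can be illustrated with a simple image warp. This is a hedged sketch, not the production technique: it assumes a NumPy image array, and the function name `ripple_displace` and its parameters are invented for illustration.

```python
import numpy as np

def ripple_displace(img, amplitude=4.0, wavelength=32.0, phase=0.0):
    """Shift each pixel row sideways by a sine of its vertical position,
    producing a rippling wave-pattern distortion of the frame."""
    height = img.shape[0]
    ys = np.arange(height)
    # Per-row horizontal offsets that follow a sine wave down the frame.
    shifts = np.rint(amplitude * np.sin(2.0 * np.pi * ys / wavelength + phase)).astype(int)
    out = np.empty_like(img)
    for y in range(height):
        out[y] = np.roll(img[y], shifts[y], axis=0)
    return out
```

Animating `phase` over time would make the ripple travel through the image, the sort of "languid liquid light" movement McBride describes, rather than a static warp.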
“We were using the idea of simulating earth, dust and a storm; all of these different natural elements coming together in a slightly natural but unnatural way. We’re trying to walk a fine line of it looking like something you’re familiar with but not behaving the way you would expect it to.” Conveying size and scale was important. “It’s always a challenge in visual effects whenever we do massive-size effects and simulations, especially when our only scale reference is a couple of our characters. Speed, detail and movement of things become a big part of it. But at the same time, shot to shot, you want the energy and excitement to be there. That’s the balance we play with all of the time – real-world physics versus what is the narrative of the shot.” Shooting the ‘land as monster’ sequence was complicated by the remote location and by not being allowed to bring foreign matter such as soil and leaves into [the second location] Sequoia National Park. “We planned way ahead and talked to the maintenance people,” states SFX Supervisor Mark Hawker. “Whatever leaves they had for us, we went through to make sure that there were no sticks.”
OPPOSITE TOP: Director Ava DuVernay, left, with Storm Reid discussing a shot. OPPOSITE BOTTOM: Cinematographer Tobias A. Schliessler, ASC (center left) and Ava DuVernay (center right) commenced principal photography in November 2016 and concluded in February 2017, with primary locations in California and New Zealand. TOP: Rich McBride on set discussing a bluescreen visual effects shot. BOTTOM: Rich McBride out on location where he had to be flexible to accommodate the spontaneous creativity of Ava DuVernay.
At one point 20 fans ranging from heavy-duty 100-mph units to portable electrics were transported by small forklifts and backpacks. “The way that Ava wanted to shoot it was that the wind is chasing them. We had to bring the fans on when the kids are running and try to chase them with the wind.” Rain was another critical element. “A river gets sucked up into this maelstrom. We took the 100-mph fans and put two-and-a-half-inch hose feeds in front of [the children] so that the water was being pushed horizontally.” A trio of celestial beings known as Mrs. Whatsit (Reese Witherspoon), Mrs. Which (Oprah Winfrey) and Mrs. Who (Mindy Kaling) needed to be powerful but also relatable. “That’s always a challenge of the fantasy aspect,” observes McBride. “Ava’s aesthetic is that she likes to keep things subtle, so a lot of times these characters feel like they are human. The only difference is that we’ve got Oprah Winfrey’s character, who has trouble with her scale. Whenever Mrs. Which materializes she can’t always resolve herself at the right height. The first scene that you see her she’s 18 feet tall, and then later on when they go to the first planet she’s about 35 feet tall. There are some little things in there that keep [the three characters] a little bit off. Not too much fantasy, but enough for you to feel like they are not human.”

TOP: SFX Supervisor Mark Hawker had to adapt to the location, what was allowed and what he could do, so a different approach had to be utilized each time. BOTTOM: Middle school students Meg Murry (Storm Reid) and Calvin O’Keefe (Levi Miller) travel to worlds that test their friendship.

Extensive testing was required for the balancing chamber where the children have to walk across crystals that act like teeter-totters. “Ava was specific about the way it moved,” notes Hawker. “She wanted it to be smooth as they’re walking on crystals, but also needed for them to shake violently. It was like two different setups that we had to try to combine. We talked about doing motion control, but it ended up being an effects technician on each teeter-totter watching and doing it electronically.” Element shoots were conducted, such as breaking 12-inch-diameter tree trunks hydraulically. “The kids get chased towards the edge of this ravine and can’t proceed any further. Then a tree falls next to them and creates a bridge over this ravine. They start to go over the ravine, ‘land as monster’ starts pulling the tree down, it breaks, and the kids ride the tree down into the canyon. That was several rigs that we had to build.” World building was required for the utopian setting of Uriel, the dark planet of Camazotz controlled by an evil entity referred to as IT, and the healing environment of Ixchel. “For Uriel, which was one of our biggest environment builds, we want it to feel otherworldly yet grounded and familiar,” states McBride. “That’s why we chose to shoot in New Zealand, because it definitely has a landscape that feels
38 • VFXVOICE.COM SPRING 2018
TOP: Nature has a major role to play in the world building of A Wrinkle in Time.
“During the post-production phase I was comfortable showing Ava work that was unfinished but was also in pieces. I’d say, ‘Don’t look at this but at the composition of the background. How do you feel about the movement of this particular aspect here?’ … I felt like I could put a lot more work in front of her than I would normally do with a director.” —Rich McBride, Visual Effects Supervisor
TOP: The three celestial beings, Mrs. Whatsit (Reese Witherspoon), Mrs. Who (Mindy Kaling) and Mrs. Which (Oprah Winfrey). BOTTOM: Mr. Murry (Chris Pine) is held prisoner by an expanding dark entity known as IT. OPPOSITE TOP: Calvin O’Keefe (Levi Miller), Charles Wallace Murry (Deric McCabe) and Meg Murry (Storm Reid) find themselves in the suburbs of Camazotz. OPPOSITE MIDDLE: Zach Galifianakis is The Happy Medium. OPPOSITE BOTTOM: Meg Murry (Storm Reid) explains how traveling in a straight line is not the quickest way to travel through space and time.
like you’re somewhere else yet it is still on our Earth. We have since pushed the look of it going into post-production so that it feels more otherworldly and a bit more fantasy. It’s got more color and a bit more flower coverage. We’ve littered the landscape with giant poppy vegetation. There are whole ecosystems that live on these poppies, with a big lake at the top of the surrounding foothills. Ixchel ended up being a clean environment. It was meant to be a healing place. Ava wanted something simple so we created a clean ice plain. We have these beautiful ice structures which are something that we haven’t seen before. Camazotz was a mixed bag because it is a surreal place that is always changing. Camazotz is a place not to be trusted. It has many faces.” Practical elements were needed for a gentle creature that lives on Ixchel. “When they do the leaps, Meg Murry [Storm Reid] is getting sick and there’s this big furry creature that helps to make her well again,” remarks Hawker. “Special effects built a rig to hold her in position, and it was covered with fur because for Rich to do fur right up against Storm’s skin is an expensive thing. Two of my special effects technicians puppeteered the rig, which was put on a bungee because it moves around, is breathing and had to do gentle moves. I’m sure the gentle creature is going to be beautiful when visual effects are finished with it.” Air cylinders were dug into the ground for the scene in which the children first travel through space. “The kids would walk or stand on the air cylinders, and Rich blended them into the environment to create the impression that the surroundings are wrinkling.” The color palette was manipulated during the DI process to heighten the sense of surrealism. “Uriel is quite colorful,” states McBride. “If you’re shooting over multiple days you get different weather, especially in New Zealand where it changes
“Rich definitely wanted as much input from us as he could. They want special effects to start the event and have visual effects take over so you always have something of realism in the shot to try to sell it to the audience.” —Mark Hawker, SFX Supervisor
from day to day. We’re using shadowing from the clouds and mountains to help blend shots together in the color grade.” In the book, Camazotz was always dark and ominous. “In ours, we’re not playing it the same way, so Camazotz is going to vary, but there are some scenes that are pulled right out of the book, such as the suburbs with all of the kids and families.” The suburban setting has the inhabitants of Camazotz appearing robotic in their mannerisms and behaviors. “It’s more the theme of control and conformity; that was the point of that scene. We had the kids performing that way, but we definitely did a lot of re-timing and manipulation of the plate to get that all aligned.” A number of characters appear in the different scenes, which presented a challenge during the editing process. “There is a lot of coverage to deal with, and choreography too,” notes McBride. “It’s tricky if we have to pull somebody out of the background of a shot or change the position of where they are in a scene.” Digital doubles were used sparingly, and only a couple of cases required face replacements. “We spent a lot of time thinking about the kids’ safety and comfort. There’s a scene where the kids are flying over Uriel with our creature. We wanted to put them into rigs that they were going to be having fun with.” The special effects team created and maneuvered a six-axis motion-base rig for the back of the creature – a big foam platform that the children treated like a ride at Disneyland. “We had robotic arms from Robomoco programmed at long enough durations so that they could do multiple takes, and even had flexibility with the cameras. It created more work for us on the back end as far as cleanup of the rigging, but it made the kids comfortable and enabled Ava to have the freedom to shoot the scene the way she wanted to.” Not limiting the options for DuVernay was an overriding principle. “It’s created more work than I anticipated,” reveals McBride.
“In my initial conversations with Ava, I said, ‘I would love to be able to plan this out.’ She would look at me and say, ‘Rich, I don’t know. I might change my mind on the day.’ It made me re-think how I could set this up in the way to give her the most flexibility, but still make sure that we were going to be covering ourselves and being able to accomplish what she wanted.” Hawker enjoyed collaborating with the visual effects team. “I’ve been on shows where the visual effects supervisor is like, ‘We’ll do that.’ Rich definitely wanted as much input from us as he could get. They want special effects to start the event and have visual effects take over so you always have something of realism in the shot to try to sell it to the audience.” MPC, ILM, Digital Domain, Luma Pictures, Rodeo FX and One of Us are responsible for producing 1,500 to 1,600 visual effects shots. For McBride, one moment stands out. “The kids flying over Uriel with the creature will be a lot of fun to see on the big screen.” ‘Land as monster’ stands out for Hawker. “I had 25 effects guys just for that sequence. That was challenging. I’m looking
“In my initial conversations with Ava, I said, ‘I would love to be able to plan this out.’ She would look at me and say, ‘Rich, I don’t know. I might change my mind on the day.’ It made me re-think how I could set this up in the way to give her the most flexibility, but still make sure that we were going to be covering ourselves and being able to accomplish what she wanted.” —Rich McBride, Visual Effects Supervisor
TOP LEFT: Mindy Kaling is Mrs. Who and Storm Reid is Meg Murry in A Wrinkle in Time. TOP RIGHT: World-renowned physicist Mr. Murry (Chris Pine) before he vanishes. BOTTOM: The three celestial beings, Mrs. Who (Mindy Kaling), Mrs. Which (Oprah Winfrey) and Mrs. Whatsit (Reese Witherspoon).
forward to seeing how our practical effects work with Rich’s visual effects and how it all ties together.” A Wrinkle in Time has been a unique project for the filmmakers. “Ava has such a great handle on character and emotion,” observes McBride. “Normally, it would be about the technical aspects or composition or lighting. But in this case, I’m like, ‘How does that shot make you feel? How is that shot meant to make you feel?’ Those are the kinds of things that were often coming up in conversations. I call this the emotional visual effects show!”
ANIMATION
AARDMAN GOES BACK TO STOP-MOTION BASICS WITH EARLY MAN By IAN FAILES
Few would dispute that Aardman Animations is one of the most respected stop-motion animation studios in the world, with a long legacy of charming and successful projects including Wallace and Gromit and Chicken Run. Aardman’s latest feature film, Early Man, directed by Nick Park, continues the stop-motion legacy with a story set in prehistoric times that follows a caveman named Dug (voiced by Eddie Redmayne), his pet Hognob and a new friend Goona (Maisie Williams) as they take on the evil Lord Nooth (Tom Hiddleston). VFX Voice went behind the scenes of the meticulous stop-motion production with Animation Supervisors Merlin Crossingham and Will Becher, who outline the practical side of Aardman’s frame-by-frame filmmaking.
All images copyright © 2018 Studiocanal S.A.S and The British Film Institute. Courtesy of Aardman Animations. TOP: Early design sculpts of the Early Man characters. Notes Animation Supervisor Will Becher: “Early Man came from the brilliant and quirky mind of Nick Park. He first started working on the idea with a caveman back in 2007.”
FROM SCRIPT TO PUPPETS
Like all Aardman films, the creation of convincing characters was crucial to realizing the story of Early Man. Starting with the script and moving through early sketches by Park, artists at the studio sculpted rough three-dimensional versions of the characters in clay. “Clay was a very important ingredient in the film,” states Becher. “Nick felt its unique qualities were perfectly suited to prehistory, cavemen and that feeling of crudeness.”
“Then once the first sculpt was working,” adds Becher, “the model-making department set about building a miniature, a fully posable puppet with an internal metal skeleton. This is the prototype; by the end of the production there will be multiple copies of each character. There were 18 identical Dug puppets by the end of the shoot.” The puppets were built with a stainless steel skeleton known as an armature inside of them. Practically, the armature works like a human skeleton, with matching ball-and-socket joints at places such as the elbows and knees. The team fleshes the armatures out with a resin or other core over which a silicone skin is dressed; sometimes a foam latex body is cast. “These methods require very accurate molds to be made from the master sculpts so that details are maintained and multiple puppets can be built,” notes Crossingham. The faces on all the characters in Early Man are made from modeling clay. Clay, in particular, is one of Park’s trademarks, and is also a material that can be easily sculpted by hand. To help with the many and varied mouth shapes required during animation, a set of pre-made replacement mouths was used for dialogue. “The mouths are still made from modeling clay,” says Crossingham, “but having shapes pre-made saves lots of time in an already time-consuming process. Another advantage of using modeling clay is that the animators can adapt and adjust the mouth shapes to any subtleties of voice or character performance. The eyes of our characters are solid resin and have a tiny hole in the pupil that allows the animator to move them with a pin.”
FINDING THE CHARACTER
During the puppet prototype building stage and into final production, Aardman embarked on an animation development stage of the film to help ‘find’ the characters. “This is much like a rehearsal,” says Crossingham. “The voice artist is also pivotal in contributing to the character as we use their vocal performance to inspire the physical action on screen.” “Nick works closely with the actors to find the character’s
TOP LEFT: Director Nick Park reviews design models. TOP MIDDLE: Mouth designs and mouth sculpts for the character Dug. TOP RIGHT: Junior animator Emma Diaz works on an early animation test for Dug.
“We like to use the best tools for the job, and while Early Man is a stop-motion film we do use a bevy of digital techniques that are now standard in feature films. We make use of green and bluescreens to separate elements, digital set extensions, matte paintings for skies and CGI particle effects, and character doubles for crowds and occasional background action. I like to think of harnessing the bleeding edge of creative technology with modeling clay.” —Merlin Crossingham, Animation Supervisor
voice,” adds Becher. “I work with the lead animators who start to animate the prototype puppet to recorded dialogue from Nick’s session with the actor. At this stage we look for key expressions and traits which help establish who the character is and how they might deliver a strong and believable performance on screen. Remember, no one has ever seen this character on screen before, so by a process of exploration we have to find it and hone it down.” This process involves changing and developing the way the character looks and moves, and even how he or she speaks. It’s why clay works so well, note Becher and Crossingham: the medium is flexible and changeable, and allows those changes to evolve organically. With reference animation under their belts, the animation supervisors formally introduce new animators to the characters. Says Becher: “We go through in detail how the characters look, how they stand and walk – referring to a character bible – and what techniques can be used to create great dialogue and expressions. Up to 35 animators will be working on the film at its peak and they all need to know how to work with all the film’s cast of characters.” Another key aspect of finding the characters involves the use of live-action video, or LAVs. Here, Park will typically act out every shot in the film while being filmed, to give the animators a solid understanding of the performance required. “This doesn’t detract from their unique skills; they will need to embellish and take ownership of the performance in the shot in order to make it really work,” states Becher. “We treat the animation team like individual actors; they build on the voice, but the visual performance on screen is down entirely to them.”
TOP LEFT: Senior set dresser Paul Bryant working on the valley set. The backing is a bluescreen which was later extended with more hills and trees. TOP RIGHT: Animator Claire Rolls uses a rig to hold Dug and Hognob in place while animating. The rig was later removed in post. BOTTOM LEFT: Director Nick Park and crew on set during the making of Early Man. The set floor was always a hive of activity, according to Animation Supervisor Merlin Crossingham. “We had a crew of around 200 people in the studio – many more were involved over the whole film’s production – and any bottleneck or delay has big implications. Luckily this was not our first feature and our production team made it work brilliantly.” BOTTOM RIGHT: Animator Steve Cox animates the tribe for part of a hunting sequence. OPPOSITE TOP: Supervising set dresser Andy Brown on the badlands set. OPPOSITE MIDDLE: From left: Eddie Redmayne with his character puppet Dug, Maisie Williams with Goona and director Nick Park with Hognob. OPPOSITE BOTTOM: Animator Jo Fenton works with the Dug and Hognob puppets on the valley set.
THE ART OF STOP-MOTION
Actual stop-motion animation – puppets shot frame by frame on built sets – has remained largely unchanged since the beginning of cinema itself. However, Aardman took advantage of digital SLR cameras and tools for reviewing frame-by-frame work, and shot multiple sets at once – about 40 at the peak of production. “Each set has a team that prepares it for animation,” says Crossingham. “When it comes to animate, there will be one camera and one animator on the set or unit. If there are 10 characters in the shot then that one animator animates all 10 characters. The animators really have to focus.” So what does stop-motion actually involve? It’s a slow and meticulous process. Animators shoot a single frame, move the puppet and then shoot another frame. They repeat the process over and over, sometimes producing less than a second of animation in a day. At 24 frames per second, the footage played back – a series of static images – is read by our brains as continuous movement on the screen. During production, animators make use of specialized rigs to hold up the puppets, keep them still, make them appear to be in mid-air, and so on. In the days before digital visual effects, these rigs would typically be hidden from the camera, but now they can be front and center in the scene and simply painted out. The team also used green or bluescreen sets to enable the compositing of their characters into different backgrounds or for digital set extensions. Visual effects has certainly widened the scope of many Aardman productions, including Early Man. “We like to use the best tools for the job, and while Early Man is a stop-motion film we do use a bevy of digital techniques that are now standard in feature films,” comments Crossingham. “We make use of green and bluescreens to separate elements, digital set extensions, matte paintings for skies, and CGI particle effects and character doubles for crowds and occasional background action. 
I like to think of harnessing the bleeding edge of creative technology with modeling clay.”
OLD AND THE NEW
For a film set in prehistoric times, Early Man relied on an interesting mix of old-school techniques, such as the stop-
Actual stop-motion animation using puppets that are shot on built sets has remained largely unchanged since the beginning of cinema itself. However, Aardman took advantage of digital SLR cameras and tools for reviewing frame-by-frame work. And they shot multiple sets at once – about 40 at the peak of production. “Each set has a team that prepares it for animation,” says Crossingham. “When it comes to animate, there will be one camera and one animator on the set or unit. If there are 10 characters in the shot then that one animator animates all 10 characters. The animators really have to focus.” So what does stop-motion actually involve? It’s a slow and meticulous process. Animators shoot a single frame, move the puppet and then shoot another frame. They repeat the process over and over, sometimes only producing less than a second of animation in a day. With 24 frames in one second, the footage played back – which is a series of static images – appears to be linked together and taken in by our brains as movement on the screen. During production, animators make use of specialized rigs to hold up the puppets, keep them still, make them appear to be in mid-air, and so on. In the days before digital visual effects, these rigs would typically be hidden from the camera, but now they can be front and center in the scene and simply painted out. The team also used green or bluescreen sets to enable the compositing of their characters into different backgrounds or for doing digital set extensions. Visual effects has certainly widened the scope of many Aardman productions, including Early Man. “We like to use the best tools for the job, and while Early Man is a stop-motion film we do use a bevy of digital techniques that are now standard in feature films,” comments Crossingham. “We make use of green and bluescreens to separate elements, digital set extensions, matte paintings for skies, and CGI particle effects and character doubles for crowds and occasional background action. 
I like to think of harnessing the bleeding edge of creative technology with modeling clay.” OLD AND THE NEW
For a film set in prehistoric times, Early Man relied on an interesting mix of old-school techniques, such as the stop-
SPRING 2018 VFXVOICE.COM • 47
3/5/18 4:17 PM
ANIMATION
A Massage for Hiddleston Early Man’s villain is Lord Nooth, voiced by Tom Hiddleston, a man who lives a luxurious ‘Bronze Age’ life and who is intent on claiming the land of others. In one hilarious sequence, Hognob finds himself giving Nooth an unexpected massage. “During the voice recording,” recalls Animation Supervisor Will Becher, “director Nick Park went to lengths to get a genuine performance from Tom Hiddleston. In fact I think he gave him a massage as he was reading the lines. Meanwhile, I worked closely with the animator on that sequence, Steve Cox, to try to find the maximum comedy and performance. We recorded a number of live-action videos in which we tried different looks and expressions and timings.” Once they had what they thought was the best video reference for timing, Cox set about animating the shot. It would take seven and a half weeks to complete the animation. One of the major hurdles proved to be the soap bubbles, which were made of glass beads. “I would visit Steve on set once every week or two to see how it was going and to make sure he wasn’t going crazy,” says Becher. “It was useful to keep that objective overview as when you are so close to something and working on half a second a day, you can lose track of the plan. He did an amazing job on the shot and it remains one of my favorite scenes in the film. When he finally completed the shot and Nick approved it, we had a mini wrap party.”
TOP LEFT: Dug during the hunting sequence. “Some of the most impressive performances come from tiny changes in shape around the character’s eyes, so we retained as much clay as possible on the face to allow the animators scope,” explains Animation Supervisor Will Becher. “The eyes have a tiny pin hole in the middle which, invisible to the audience, allows for tiny eye movements using a pin.” TOP RIGHT: Lord Nooth, voiced by Tom Hiddleston. BOTTOM LEFT: Maisie Williams voices the character Goona in the film. BOTTOM RIGHT: From left: Bobnar (voiced by Timothy Spall) with Dug (Eddie Redmayne).
48 • VFXVOICE.COM SPRING 2018
PG 44-49 EARLY MAN.indd 49
motion itself, while also requiring Aardman to venture into new areas. One of these was fur, particularly for Hognob and the clothing worn by Dug. “In the past we’ve avoided this material as it is so hard to control and mixes badly with clay,” says Becher. Another was building a stadium capable of holding around 60,000 people. Aardman actually turned to virtual reality to previsualize the right angles and find framings for their stop-motion characters to interact with here. But even this paled in comparison to the biggest challenge of having to animate one story requirement: a mammoth. Becher states that it was the single most complicated puppet ever built at Aardman. “It involved months of development across a number of different departments. The final working mammoth – we only built one – was so heavy it required scaffolding to hold it in place.” Luckily, Aardman is well-suited to these challenges, and although they seem, well, mammoth in size, Early Man remained a quirky character-driven film, something audiences
very much expect from the mind of Nick Park. “Nick’s style is firmly seated in the world of thumby and funny characters,” notes Crossingham, “and it is those characters that lead the story and define the world of Early Man.”
“I would visit animator Steve Cox on set once every week or two to see how it was going and to make sure he wasn’t going crazy. It was useful to keep that objective overview as when you are so close to something and working on half a second a day, you can lose track of the plan.” —Will Becher, Animation Supervisor
TOP: The characters find themselves in plenty of prehistoric predicaments during Early Man. Here, scenes of stop-motion animation were augmented with extra lava and smoke simulations. MIDDLE: Dug ventures outside of his peaceful valley to Bronze Age Town run by Lord Nooth. BOTTOM RIGHT: Hognob provides Lord Nooth with a relaxing back massage after infiltrating his home with Dug.
SPRING 2018 VFXVOICE.COM • 49
3/5/18 4:18 PM
ANIMATION
A Massage for Hiddleston Early Man’s villain is Lord Nooth, voiced by Tom Hiddleston, a man who lives a luxurious ‘Bronze Age’ life and who is intent on claiming the land of others. In one hilarious sequence, Hognob finds himself giving Nooth an unexpected massage. “During the voice recording,” recalls Animation Supervisor Will Becher, “director Nick Park went to lengths to get a genuine performance from Tom Hiddleston. In fact I think he gave him a massage as he was reading the lines. Meanwhile, I worked closely with the animator on that sequence, Steve Cox, to try to find the maximum comedy and performance. We recorded a number of live-action videos in which we tried different looks and expressions and timings.” Once they had what they thought was the best video reference for timing, Cox set about animating the shot. It would take seven and a half weeks to complete the animation. One of the major hurdles proved to be the soap bubbles, which were made of glass beads. “I would visit Steve on set once every week or two to see how it was going and to make sure he wasn’t going crazy,” says Becher. “It was useful to keep that objective overview as when you are so close to something and working on half a second a day, you can lose track of the plan. He did an amazing job on the shot and it remains one of my favorite scenes in the film. When he finally completed the shot and Nick approved it, we had a mini wrap party.”
TOP LEFT: Dug during the hunting sequence. “Some of the most impressive performances come from tiny changes in shape around the character’s eyes, so we retained as much clay as possible on the face to allow the animators scope,” explains Animation Supervisor Will Becher. “The eyes have a tiny pin hole in the middle which, invisible to the audience, allows for tiny eye movements using a pin.” TOP RIGHT: Lord Nooth, voiced by Tom Hiddleston. BOTTOM LEFT: Maisie Williams voices the character Goona in the film. BOTTOM RIGHT: From left: Bobnar (voiced by Timothy Spall) with Dug (Eddie Redmayne).
48 • VFXVOICE.COM SPRING 2018
PG 44-49 EARLY MAN.indd 49
motion itself, while also requiring Aardman to venture into new areas. One of these was fur, particularly for Hognob and the clothing worn by Dug. “In the past we’ve avoided this material as it is so hard to control and mixes badly with clay,” says Becher. Another was building a stadium capable of holding around 60,000 people. Aardman actually turned to virtual reality to previsualize the right angles and find framings for their stop-motion characters to interact with here. But even this paled in comparison to the biggest challenge of having to animate one story requirement: a mammoth. Becher states that it was the single most complicated puppet ever built at Aardman. “It involved months of development across a number of different departments. The final working mammoth – we only built one – was so heavy it required scaffolding to hold it in place.” Luckily, Aardman is well-suited to these challenges, and although they seem, well, mammoth in size, Early Man remained a quirky character-driven film, something audiences
very much expect from the mind of Nick Park. “Nick’s style is firmly seated in the world of thumby and funny characters,” notes Crossingham, “and it is those characters that lead the story and define the world of Early Man.”
“I would visit animator Steve Cox on set once every week or two to see how it was going and to make sure he wasn’t going crazy. It was useful to keep that objective overview as when you are so close to something and working on half a second a day, you can lose track of the plan.” —Will Becher, Animation Supervisor
TOP: The characters find themselves in plenty of prehistoric predicaments during Early Man. Here, scenes of stop-motion animation were augmented with extra lava and smoke simulations. MIDDLE: Dug ventures outside of his peaceful valley to Bronze Age Town run by Lord Nooth. BOTTOM RIGHT: Hognob provides Lord Nooth with a relaxing back massage after infiltrating his home with Dug.
SPRING 2018 VFXVOICE.COM • 49
3/5/18 4:18 PM
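As a back-of-the-envelope illustration (not a tool Aardman uses), the shooting-pace arithmetic above – 24 frames per second of screen time, with an animator sometimes completing only half a second in a day – can be sketched in a few lines of Python; the function name and numbers here are illustrative only:

```python
# Illustrative sketch of the stop-motion arithmetic quoted in the article:
# footage runs at 24 frames per second, and an animator working at
# "half a second a day" captures just 12 frames in a working day.

FPS = 24  # frames captured per second of screen time

def shooting_days(shot_seconds: float, seconds_per_day: float = 0.5) -> float:
    """Working days needed to animate a shot at a given daily pace."""
    frames_needed = shot_seconds * FPS
    frames_per_day = seconds_per_day * FPS
    return frames_needed / frames_per_day

# A 10-second shot at half a second per day takes 20 working days.
print(shooting_days(10))  # -> 20.0

# At that same pace, a shot of roughly 19 seconds works out to about
# 37.5 days – consistent with a seven-and-a-half-week effort at a
# five-day week, the timescale quoted for the massage sequence.
print(shooting_days(18.75))  # -> 37.5
```

At this pace a whole feature only becomes feasible because, as Crossingham notes, some 40 sets were shooting simultaneously at the peak of production.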
VES AWARDS

VISUAL EFFECTS ARTISTRY IN THE SPOTLIGHT

On February 13, 2018, the Visual Effects Society held the 16th Annual VES Awards, the prestigious yearly celebration that recognizes outstanding visual effects artistry and innovation in film, animation, television, commercials, video games and special venues. Comedian Patton Oswalt served as host to the more than 1,000 guests gathered at the Beverly Hilton Hotel, Los Angeles, to celebrate VFX talent in 24 awards categories.

War for the Planet of the Apes was named photoreal feature film winner, earning four awards. Coco was named top animated film, also earning four awards. Game of Thrones was named best photoreal episode and garnered five awards – the most wins of the night. Samsung; Do What You Can't; Ostrich won top honors in the commercial field, scoring three awards. These top four contenders collectively garnered 16 of the 24 awards for outstanding visual effects.

President of Marvel Studios Kevin Feige presented the VES Lifetime Achievement Award to acclaimed producer-writer-director Jon Favreau. Academy Award-winning producer Jon Landau presented the Georges Méliès Award to Academy Award-winning visual effects master Joe Letteri, VES. Awards presenters included fan favorite Mark Hamill, Coco director Lee Unkrich, War for the Planet of the Apes director Matt Reeves, Academy Award nominee Diane Warren, Jaime Camil, Dan Stevens, Elizabeth Henstridge, Sydelle Noel, Katy Mixon and Gabriel "Fluffy" Iglesias.

Captions list all members of each Award-winning team, even if some members were not present. For more show photos and a complete list of nominees and winners of the 2018 VES Awards, visit visualeffectssociety.com. All photos by Danny Moloshok and Phil McCarten.

1. Anticipation builds as the 16th Annual VES Awards is about to begin.
2. The view of the International Ballroom at the Beverly Hilton Hotel, Los Angeles, at the 16th Annual VES Awards.
3. Eric Roth, Executive Director of the Visual Effects Society, welcomes the crowd.
4. Patton Oswalt hosts the VES Awards Show.
5. Joe Letteri, VES receives the Georges Méliès Award while presenter, producer Jon Landau, looks on.
6. Jon Favreau accepts the VES Lifetime Achievement Award.
7. The VES Award for Outstanding Visual Effects in a Photoreal Feature went to War for the Planet of the Apes and the team of Joe Letteri, VES, Ryan Stafford, Daniel Barrett, Dan Lemmon and Joel Whist.
8. The VES Award for Outstanding Visual Effects in a Photoreal Episode went to Game of Thrones; Beyond the Wall and the team of David Ramos, Joe Bauer, Steve Kullback, Chris Baird and Sam Conway. Presenter Mark Hamill, right, looks on.
9. The VES Award for Outstanding Visual Effects in an Animated Feature went to Coco and the team of Lee Unkrich, Darla K. Anderson, David Ryu and Michael K. O'Brien.
10. The VES Award for Outstanding Visual Effects in a Commercial went to Samsung; Do What You Can't; Ostrich and the team of Diarmid Harrison-Murray, Tomek Zietkiewicz, Amir Bazazi and Martino Madeddu.
11. The VES Award for Outstanding Animated Character in an Animated Feature went to Coco; Héctor and the team of Emron Grover, Jonathan Hoffman, Michael Honsel and Guilherme Sauerbronn Jacinto.
12. The VES Award for Outstanding Animated Character in a Photoreal Feature went to War for the Planet of the Apes; Caesar and the team of Dennis Yoo, Ludovic Chailloleau, Douglas McHale and Tim Forbes.
13. The VES Award for Outstanding Animated Character in a Commercial went to Samsung; Do What You Can't; Ostrich and the team of David Bryan (at podium), Maximilian Mallmann, Tim Van Hussen and Brendan Fagan.
14. The VES Award for Outstanding Animated Character in an Episode or Real-Time Project went to Game of Thrones; The Spoils of War; Drogon Loot Train Attack and the team of Murray Stevenson, Jason Snyman, Jenn Taylor and Florian Friedmann.
15. The VES Award for Outstanding Effects Simulations in a Photoreal Feature went to War for the Planet of the Apes and the team of David Caeiro Cebrián, Johnathan Nixon, Chet Leavai and Gary Boyle.
16. The VES Award for Outstanding Effects Simulations in an Animated Feature went to Coco and the team of Kristopher Campbell, Stephen Gustafson, Dave Hale and Keith Klohn.
17. The VES Award for Outstanding Effects Simulations in an Episode, Commercial or Real-Time Project went to Game of Thrones; The Dragon and the Wolf; Wall Destruction and the team of Thomas Hullin, Dominik Kirouac, Sylvain Nouveau and Nathan Arbuckle.
18. The VES Award for Outstanding Model in a Photoreal or Animated Project went to Blade Runner 2049; LAPD Headquarters and the team of Alex Funke, Steven Saunders, Joaquin Loyzaga and Chris Menges.
19. The VES Award for Outstanding Created Environment in a Photoreal Feature went to Blade Runner 2049; Los Angeles and the team of Chris McLaughlin, Rhys Salcombe, Seungjin Woo and Francesco Dell'Anna.
20. The VES Award for Outstanding Visual Effects in a Real-Time Project went to Assassin's Creed Origins and the team of Raphael Lacoste, Patrick Limoges, Jean-Sebastien Guay and Ulrich Haar.
21. The VES Award for Outstanding Created Environment in an Animated Feature went to Coco; City of the Dead and the team of Michael Frederickson, Jamie Hecker, Jonathan Pytko and Dave Strick.
22. The VES Award for Outstanding Created Environment in an Episode, Commercial or Real-Time Project went to Game of Thrones; Beyond the Wall; Frozen Lake and the team of Daniel Villalba, Antonio Lado, José Luis Barreiro and Isaac de la Pompa. Presenter Dan Stevens is at left.
23. The VES Award for Outstanding Visual Effects in a Student Project went to Hybrids and the team of Florian Brauch, Romain Thirion, Matthieu Pujol and Kim Tailhades.
24. The VES Award for Outstanding Supporting Visual Effects in a Photoreal Feature went to Dunkirk and the team of Andrew Jackson, Mike Chambers, Andrew Lockley, Alison Wortman and Scott Fisher.
25. The VES Award for Outstanding Supporting Visual Effects in a Photoreal Episode went to Black Sails; XXIX and the team of Erik Henry, Terron Pratt, Yafei Wu, David Wahlberg and Paul Dimmer.
26. The VES Award for Outstanding Compositing in a Photoreal Feature went to War for the Planet of the Apes and the team of Christoph Salzmann, Robin Hollander, Ben Warner and Beck Veitch.
27. The VES Award for Outstanding Compositing in a Photoreal Episode went to Game of Thrones; The Spoils of War; Loot Train Attack and the team of Dom Hellier, Thijs Noij, Edwin Holdsworth and Giacomo Matteucci.
28. The VES Award for Outstanding Compositing in a Photoreal Commercial went to Samsung; Do What You Can't; Ostrich and the team of Michael Gregory, Andrew Roberts, Gustavo Bellon and Rashabh Ramesh Butani.
29. The VES Award for Outstanding Virtual Cinematography in a Photoreal Project went to Guardians of the Galaxy Vol. 2; Groot Dance/Opening Fight and the team of James Baker, Steven Lo, Alvise Avati and Robert Stipp.
30. The VES Award for Outstanding Visual Effects in a Special Venue Project went to Avatar: Flight of Passage and the team of Richard Baneham, Amy Jupiter, David Lester and Thrain Shadbolt.
31. Presenter Gabriel "Fluffy" Iglesias, left, and host Patton Oswalt grab selfies on the red carpet.
32. Presenters Dan Stevens and Sydelle Noel enjoy themselves on the red carpet.
33. VES First Vice Chair Jeffrey A. Okun, VES, left, and VES Executive Director Eric Roth, far right, flank producer Jon Landau, second from left, and Georges Méliès Award recipient Joe Letteri, VES.
34. Host Patton Oswalt and presenter Mark Hamill ham it up on the red carpet.
35. VES Lifetime Achievement Award winner Jon Favreau, left, has a good time backstage with presenter Gabriel "Fluffy" Iglesias, middle, and Marvel Studios president Kevin Feige, who presented Favreau with his Award.
36. VES First Vice Chair Jeffrey A. Okun, VES, left, shares a moment with War for the Planet of the Apes director Matt Reeves, middle, and Eric Roth, Executive Director of the VES.
37. Presenter Diane Warren is interviewed on the red carpet.
38. VFX legend Richard Edlund, VES, ASC at the VES Awards.
39. Lee Unkrich, director of Coco, and Darla K. Anderson, producer of Coco, enjoy the red carpet.
40. Astronaut and VFX enthusiast Colonel Terry Virts was in attendance.
41. Awards presenter, actor Jaime Camil of Coco and Jane the Virgin fame.
42. Elizabeth Henstridge of Agents of S.H.I.E.L.D. fame was a presenter.
43. Awards presenter Katy Mixon of American Housewife fame.
VES AWARD WINNERS
WAR FOR THE PLANET OF THE APES
The VES Award for Outstanding Visual Effects in a Photoreal Feature went to War for the Planet of the Apes, which won four VES Awards, including Outstanding Animated Character in a Photoreal Feature (Caesar), Outstanding Effects Simulations in a Photoreal Feature and Outstanding Compositing in a Photoreal Feature. (Photos courtesy of Twentieth Century Fox. All rights reserved.)
VES AWARD WINNERS
COCO
Coco won the VES Award for Outstanding Visual Effects in an Animated Feature, and took four VES Awards in total, including Outstanding Animated Character in an Animated Feature (Héctor), Outstanding Created Environment in an Animated Feature (City of the Dead) and Outstanding Effects Simulations in an Animated Feature. (Photos courtesy of Disney-Pixar. All rights reserved.)
GAME OF THRONES
Game of Thrones won the VES Award for Outstanding Visual Effects in a Photoreal Episode (Game of Thrones; Beyond the Wall). The series won five VES Awards, including Outstanding Animated Character in an Episode or Real-Time Project (Game of Thrones; The Spoils of War; Drogon Loot Train Attack), Outstanding Created Environment in an Episode, Commercial or Real-Time Project (Game of Thrones; Beyond the Wall; Frozen Lake), Outstanding Effects Simulations in an Episode, Commercial or Real-Time Project (Game of Thrones; The Dragon and the Wolf; Wall Destruction) and Outstanding Compositing in a Photoreal Episode (Game of Thrones; The Spoils of War; Loot Train Attack). (Photos courtesy of HBO. All rights reserved.)
INDUSTRY ROUNDTABLE
2018 – YEAR OF THE RAPIDLY EXPANDING VFX MOVIEMAKING VOCABULARY By TREVOR HOGG
“As Virtual Production has become less obtrusive to the process, it has been used more and more to drive productions.” —Lou Pecora
“Growth is being accelerated because Silicon Valley is playing a significant role in Hollywood and made some big bets on where they believe audiences, creators and the overall business is headed in the coming years.” —Dane Smith
No longer simply the means for spectacle, visual effects have become a hybrid of technology and artistry to the point that live-action and digital animation are indistinguishable from one another. To gain insight into emerging patterns, directions and applications in visual effects filmmaking and production, VFX Voice consulted a virtual panel of executives and supervisors on the creative and technical trends shaping the industry in 2018. Following are their comments.
LOU PECORA, VISUAL EFFECTS SUPERVISOR, ZOIC STUDIOS
Two technologies will continue to grow in the coming year and years: Virtual Production and Performance Capture. Both have been in use for a few years, but have been somewhat clunky and intrusive to the process. Only recently have the hardware and software become streamlined enough that the process is smooth, organic and, most importantly, doesn’t slow production down. As Virtual Production has become less obtrusive to the process, it has been used more and more to drive productions. As more DPs get exposed to and embrace this technology, we will see some very interesting and clever uses of it. To be able to choreograph a scene and then dynamically change camera angles, lenses and timings really frees up the filmmakers to focus first on the performances and then on the coverage of those performances. More companies are offering this service, allowing filmmakers to basically ‘live-vis’ a scene. They stage the action, and everything is captured and tracked so that the scene can be replayed in Unity. Then the director and DP can walk around the virtual set with virtual cameras and get the coverage of the scene they want. In the hands of an experienced filmmaker this can be a powerful tool. In some hands, it can create an expensive mess, so it will be interesting to see what happens.
DANE ALLAN SMITH, HEAD OF GLOBAL BUSINESS DEVELOPMENT, THE THIRD FLOOR
The landscape in our industry is rapidly developing to include new audiences. The area of overlap between gaming, mixed reality and conventional media is growing into its own viable, sustainable medium. That growth is being accelerated because Silicon Valley is playing a significant role in Hollywood and has made some big bets on where they believe audiences, creators and the overall business is headed in the coming years. As a visualization company, we have seen an early reluctance by the major studios to embrace virtual production, augmented reality and the integration of the game engine melt away as these technologies become more effective at increasing production value and reducing costs. Our clients have crossed the threshold and are very familiar with tools that were first forged in the gaming industry and are now a familiar sight on set. The next wave of content producers is tech-savvy and less averse to disrupting the status quo with applied science than content producers in the past. They have access to analytics
that offer unprecedented insight into audience wants and needs. This new resource will fuel dramatic change. The mixed-reality toolset is migrating from the production pipeline – building on, drawing on and evolving from it – and will become essential to the way ‘next generation’ audiences experience media. We see this happening in the location-based entertainment space now, where audiences are used to early adoption of state-of-the-art technology. As the technology matures, a new audience will emerge. We are designing our pipeline to serve that audience by empowering the expanding and diverse creative community with tools that amplify their voices.
TRENT CLAUS, VISUAL EFFECTS SUPERVISOR, LOLA VFX
In the next year and into the future, the march toward photo-real digital characters – human and otherwise – will continue. I was really impressed with the work MPC did on The Jungle Book, and with what Framestore has been doing with Rocket in the Guardians of the Galaxy films. The artists at WETA have been doing a phenomenal job on the Planet of the Apes movies, and I’m excited to see what comes next. An increasing emphasis on, and value given to, “invisible effects” has been great to see, and it will only continue to grow, especially with the rising volume of television/streaming VFX work being done. The desire to blend digital effects with practical effects will continue, and it will lead to some great collaborations between artists and companies of different backgrounds. On the technical side of things, I’m very excited about the technological prospects of getting rid of chroma-keying. It would dramatically affect VFX processes on set for the better and, one would hope, allow us more time in post to focus on integrating elements rather than isolating them. I’ve been very pleased to see VFX departments getting involved with productions earlier on, and that will continue to happen more frequently. Having the VFX teams involved during development can save tremendous amounts of time and money during production and post, and ultimately lead to better-looking effects, a better work environment for the artists and a better experience for the audience.
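As background for the chroma-keying Claus hopes to see retired, the core of a keyer can be sketched in a few lines: a matte derived from each pixel’s distance to a reference key color. This is an illustrative toy, not a production keyer; the key color and thresholds are arbitrary assumptions.

```python
# Toy chroma-key matte: alpha from a pixel's RGB distance to the key color.
# The key color and the near/far thresholds are illustrative assumptions,
# not production settings.

def chroma_key_alpha(pixel, key=(0, 255, 0), near=60.0, far=140.0):
    """Return alpha in [0, 1]: 0 = fully keyed out, 1 = fully kept."""
    dist = sum((p - k) ** 2 for p, k in zip(pixel, key)) ** 0.5
    if dist <= near:   # close to the key color: transparent
        return 0.0
    if dist >= far:    # far from the key color: opaque
        return 1.0
    return (dist - near) / (far - near)  # soft edge in between

# Pure green keys out, red stays opaque, near-green spill is also keyed.
matte = [chroma_key_alpha(px) for px in [(0, 255, 0), (255, 0, 0), (40, 230, 35)]]
```

Real keyers work in other color spaces and handle spill suppression and edge detail; the appeal of the camera-side alternatives Claus alludes to is skipping this isolation step entirely.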
“In the next year and into the future, the march toward photo-real digital characters – human and otherwise – will continue.” —Trent Claus
“Spanning all technological advancements, the perennial requirement for the sharpest artistic eyes creating and nurturing VFX images will only compound.” —Philip Greenlow
PHILIP GREENLOW, EXECUTIVE PRODUCER, MPC FILM
‘Virtual Production’ has been an increasingly used term throughout our operations this year. It’s likely the application of this new form of digital filmmaking will intensify in 2018. Combining complex CG content with cameras and real-time rendering technology is an exciting and powerful tool, one which will become integral to the ever-escalating ambition of cinema to portray the fantastic and bring to audiences images and stories never before seen on screen. We should expect to see the envelope of photo-real rendering pushed further. The next generation of believable, immersive digital worlds and fully-realized digital characters will continue to raise the bar. Photo and motion-capture techniques, as well as hair, muscle and shading tools, will continue to improve
in both efficiency and accuracy. Add the potential of machine-learning techniques and the horizon broadens for faster as well as stronger results. Spanning all technological advancements, the perennial requirement for the sharpest artistic eyes creating and nurturing VFX images will only compound. The number of VFX shots the industry produces as a whole is climbing, and audiences are smart and discerning. The illusions we created yesterday will be found out tomorrow, so it goes without saying that the detail and realism produced by the tools and techniques we harness will be held to ever-closer scrutiny by the growing number of authors who put their names to the images.
“My eyes are specifically set on the latest advances in real-time tracking/compositing that allow us to lock the actors into digital sets or set extensions during the shoot.” —Volker Engel
“We will see much greater use of virtual production as the tools continue to improve. The ability for a director to visualize a mostly or even fully CG scene in real time as he or she walks around the set will also have a positive impact on VFX artists and their work.” —Erika Burton
VOLKER ENGEL, VFX SUPERVISOR/CO-PRODUCER, INDEPENDENCE DAY: RESURGENCE
Looking at War for the Planet of the Apes and Furious 7, it becomes crystal-clear that digital characters in live-action feature films are on the rise. But one of the biggest breakthroughs, from a pure storytelling point of view, is the possibilities that de-aging well-known actors brings to the table. In 2017 we saw a younger Anthony Hopkins, Kurt Russell, Johnny Depp and Sean Young. In 2005, our company, Uncharted Territory, was approached by director Harald Zwart [The Karate Kid remake], who came to us with an incredible screenplay about a protagonist who meets his younger self and has to first bond and then work with this character throughout the story of the film. It was clear from the get-go that a young look-alike actor would not do. It clearly had to be the same person. Unfortunately, we had to turn him down in 2005 because the technology was still in its infancy. Now I look forward to this and other screenplays with similar storylines and technical challenges being turned into thrilling feature films. VR and AR will for sure be a big topic in 2018. My eyes are specifically set on the latest advances in real-time tracking/compositing that allow us to lock the actors into digital sets or set extensions during the shoot. We used the Ncam system extensively on Independence Day: Resurgence. It is a multi-sensor hybrid technology that creates a point cloud of the environment and instantly locks our pre-created digital environment to the camera image the director sees on his monitor. Director Roland Emmerich calls it his “favorite tool.” My favorite moment of the shoot was when Roland discovered he could include pre-animated objects or characters – for example, a half-dozen jet fighters vertically lifting off inside a hangar. The camera operator was able to pan with the moving jets, and the actors finally knew what they were looking at besides a big blue screen.
ERIKA BURTON, EVP GLOBAL FEATURES VFX, METHOD STUDIOS
A key technical trend that will continue to drive processes for VFX shops is global integration. At Method and other facilities, different outposts used to handle different shows or specialties, but over the past year we’ve identified one set of best practices to implement across the board, thereby standardizing production operations at every Method location worldwide. We can adjust on the fly to redistribute capacity and share capabilities as one fully integrated global operation. Cloud-based tools have matured in the past year and had a big impact in making this possible. On the creative side, world building will become more important with the continued rise of “shared” cinematic universes, including assets and characters. We’ve experienced this as longtime Marvel vendors, but it’s happening across the board now as well. The software and tools are so advanced now that VFX artists can create massive alternate universes that are photoreal. World building will also be key as VR continues to take hold. Consumers want to step inside these fantastic worlds they’ve seen on screen. It’s becoming more important, especially with VFX-driven features, to think about how that IP can also exist in VR/AR mediums. This not only affects environments, but also character design and even assets like weaponry and costumes. In 2018, we also will see much greater use of virtual production as the tools continue to improve. The ability for a director to visualize a mostly or even fully CG scene in real time as he or she walks around the set will also have a positive impact on VFX artists and their work, as more looks can be ironed out further in advance – meaning less time iterating and more time achieving the filmmaker’s creative vision.
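The on-the-fly redistribution Burton describes can be pictured as a simple scheduling problem. A minimal sketch, with invented site names and numbers, greedily assigning the largest shots to the site with the most free capacity:

```python
# Illustrative sketch of redistributing shot work across global sites by free
# capacity. Site names, shot names and frame counts are hypothetical; real
# production scheduling weighs far more factors than this greedy pass.

def redistribute(shots, capacity):
    """Assign each (name, frames) shot, largest first, to the site with the
    most remaining free capacity."""
    free = dict(capacity)
    assignment = {}
    for name, frames in sorted(shots, key=lambda s: -s[1]):
        site = max(free, key=free.get)   # site with most headroom right now
        assignment[name] = site
        free[site] -= frames
    return assignment

shots = [("sh010", 120), ("sh020", 300), ("sh030", 80)]
capacity = {"LA": 400, "Vancouver": 250, "Pune": 150}
plan = redistribute(shots, capacity)
```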
“Audiences’ notion of real vs. artificial on the big screen will be blurred as VFX continues to test existing boundaries of what stories can be told.” —Florian Gellinger
“The exponential rise of machine learning/ neural networks over the last couple of years will increasingly have an impact on VFX/animation productions.” —Carsten Kolve
FLORIAN GELLINGER, VISUAL EFFECTS SUPERVISOR AND CO-FOUNDER, RISE VISUAL EFFECTS STUDIOS
Looking back on the releases of the last few years, I believe that studios and filmmakers have become less fearful of incorporating entirely digital actors into their productions. Notable films such as Avatar and War for the Planet of the Apes have shown that these VFX advancements open up a greater creative range, allowing new story and character directions to be pursued. Apart from more stylized examples, as demonstrated in the upcoming Alita: Battle Angel and Ready Player One, the versatility of full-CG character construction can also produce photorealistic portrayals, as seen in Gravity’s space sequences. With this evolving development, audiences’ notion of real vs. artificial on the big screen will be blurred as VFX continues to test existing boundaries of what stories can be told.
CARSTEN KOLVE, DIGITAL SUPERVISOR, IMAGE ENGINE
Three areas of development will make a significant impact on VFX/animation production in the new year.
Scene Graph-Based Workflows: While in the past a lot of focus has been placed on standardizing file formats for specific use cases – animated mesh caches, volumes, point clouds – we will see more widespread adoption of workflows that deal with the complexities of a scene graph. While these workflows themselves are not new – tools like Katana or Gaffer have been giving TDs and artists these capabilities for many years – the release of USD and its support by both the standard DCC tools and facility-sponsored open-source projects will make a tangible difference in how, and at what level, data is exchanged between applications and vendors.
Machine Learning: The exponential rise of machine learning/neural networks over the last couple of years will increasingly have an impact on VFX/animation productions. While simple data-driven solutions have been used in VFX production for a while – for example, to control geometry deformations in rigs – more applications for this new, more powerful wave of technology are being found at a steady pace and making their way into production workflows. De-noising, increasingly accurate alpha-matting, rotoscoping, context-aware painting, simplified versions of complex shading effects, example-based facial animation and secondary deformations, and 3D feature, object and pose-tracking are all areas where machine learning has already demonstrated the potential to speed up the ‘time to first presentable iteration.’ More of these techniques are making their way into the everyday toolset. It will be interesting to see the impact this will have on work that might today still be a candidate for outsourcing.
“There have been significant advancements in technology over the last couple years that can make the virtual filmmaking process intuitive for filmmakers, and a large part of that is the utilization of game engines, such as Unreal.” —Casey Pyke
“The new trend that people are playing with is machine learning and deep neural networks. … Given its incredible potential, everyone seems to be playing around with it. It’s only a matter of time before it gets widely used in production.” —Mathieu Leclaire
Big Data & Production Efficiency: As budget and time constraints become tighter, the need to be as efficient as possible, while at least maintaining the same quality standard, becomes an imperative. VFX and animation production generate a huge amount of data from a variety of sources: production-tracking software, bidding, time-keeping and accounting systems, render farms, IT infrastructure and asset-management systems. These are rich sources of information that reveal a huge amount about how you spent your available human and machine resources.
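What mining that production data can look like is easy to sketch: aggregating event records into per-shot render totals and update-to-final turnaround times. The field names, timestamps and numbers below are invented for illustration, not drawn from any real tracking system.

```python
# Toy production-data mining: per-shot core-hours and the turnaround from
# first asset update to final render, from simple event records.
# All field names and values are hypothetical.
from collections import defaultdict

events = [
    {"shot": "sh010", "kind": "update", "t": 0.0},
    {"shot": "sh010", "kind": "render", "t": 6.5, "core_hours": 120.0},
    {"shot": "sh010", "kind": "final",  "t": 30.0},
    {"shot": "sh020", "kind": "update", "t": 2.0},
    {"shot": "sh020", "kind": "render", "t": 10.0, "core_hours": 480.0},
    {"shot": "sh020", "kind": "final",  "t": 50.0},
]

core_hours = defaultdict(float)      # total render cost per shot
first_update, final_time = {}, {}
for e in events:
    if e["kind"] == "render":
        core_hours[e["shot"]] += e["core_hours"]
    elif e["kind"] == "update":
        first_update.setdefault(e["shot"], e["t"])   # keep earliest update
    elif e["kind"] == "final":
        final_time[e["shot"]] = e["t"]

# Hours from first update to final image, per shot.
turnaround = {s: final_time[s] - first_update[s] for s in final_time}
```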
Just as ‘traditional’ businesses have already embraced using this data to gain operational insights, so will the creative industries, in order to optimize common processes. Using business intelligence and data-mining tools will become more commonplace to measure the performance of anything from ‘machine utilization,’ ‘rendering speed’ and ‘the time it takes for an update to make it through into a final image’ to ‘usable data generated during overtime.’ These historical insights will support more rational decision-making, validate past changes and investments, and help predict possible problems earlier.
CASEY PYKE, SUPERVISOR, HALON ENTERTAINMENT
Virtual production and the use of game engines in film are both technical and creative trends that will become even more widely used in 2018 effects films. There have been significant advancements in technology over the last couple of years that make the virtual filmmaking process intuitive for filmmakers, and a large part of that is the utilization of game engines such as Unreal. In our field – visualization – planning out sequences in previs can directly transfer over to a virtual scout or even a shoot
with a virtual camera. Using previs animation and preliminary motion capture exported into a game engine, we can set up rather complex master scenes for the filmmakers to shoot. The rendering capabilities of the game engine make crowds, lighting, atmospherics and depth of field all adjustable and shootable in real-time. We used Unreal on Logan, and deployed it on War for the Planet of the Apes for the entire visualization pipeline through to finals. The animation and assets made in that early stage can be used in live-action production for Simulcam setups or as virtual assets in virtual production. HALON’s use of Unreal for previs and postvis puts these technical developments to work at every stage of a film’s production, giving filmmakers a more intuitive and accurate idea of how their film is going to look and the ability to better realize their vision.

MATHIEU LECLAIRE, DIRECTOR OF R&D, HYBRIDE
The new trends that people are playing with are machine learning and deep neural networks. I’ve seen them being explored to blend animations, to de-noise and accelerate renders, to accelerate long and complicated simulations – like water and fluid simulations – and for rotoscoping and matte generation. I don’t know if it’s been used successfully in production much yet, but given its incredible potential, everyone seems to be playing around with it. It’s only a matter of time before it gets widely used in production.

The arrival of the USD [Universal Scene Description] format is getting a lot of interest. Pipelines are getting more complex, and there are a lot more shots shared between FX facilities, which makes this format interesting. MaterialX also shows potential for exchanging materials between renderers, and between facilities that use different renderers.

The lines are being blurred between layout and final renders. Software like Clarisse makes it much easier to quickly build very complex shots with a ton of assets and geometry in a very user-friendly way. Facilities are building libraries of assets for quick and easy reuse, and rely more and more on custom libraries of animation vignettes to quickly lay out crowds instead of turning to complicated AI-based systems. We even build huge libraries of pre-simulated FX elements that we can quickly lay out when building a shot. These can often be used directly in a final render, or they can serve as reference to guide the FX teams’ final simulations.

Since scenes are increasingly huge and complex, people are turning more toward procedural and simulation tools. That’s one of the main reasons a tool like Houdini is gaining popularity: it already contains many tools, and it is so flexible that it can help automate content creation that would otherwise be very costly to create by hand. OpenVDB is one such tool that opened up so many possibilities that simply weren’t possible before.
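Part of USD’s appeal for shot exchange is that its ASCII form is plain, human-readable text. The tiny layer below is a hand-written illustration – the scene is made up, and a real pipeline would author it through the pxr.Usd Python API rather than by string formatting:

```python
import tempfile
from pathlib import Path

# A minimal, hand-written USD ASCII (.usda) layer. Any USD-aware DCC or
# renderer can open it, which is what makes the format renderer-agnostic.
USDA_LAYER = """\
#usda 1.0
(
    defaultPrim = "Creature"
)

def Xform "Creature"
{
    def Sphere "Head"
    {
        double radius = 2.0
    }
}
"""

layer_path = Path(tempfile.gettempdir()) / "creature.usda"
layer_path.write_text(USDA_LAYER)
print(f"wrote {layer_path} ({layer_path.stat().st_size} bytes)")
```

Because the layer is just text, it can be versioned, diffed and handed between facilities regardless of which DCC or renderer sits on either end.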
Deep images are still a good way to render and merge complex assets, even ones coming from various renderers. The cloud and GPUs are more easily accessible these days and can help distribute otherwise very costly computational
“When we get a good response on a vignette it’s much easier to build the pipeline – only once! – knowing we’ve cracked the big design problems without affecting scores of artists.” —Jake Morrison
“Time must be shared for VFX artists to gestate a creative answer to creative questions, and VFX filmmakers will find it increasingly necessary to find techniques and practices that allow the creative space to deliver volume on tighter schedules at a sustained quality.” —Jonathan Fawkner
INDUSTRY ROUNDTABLE
“There have been significant advancements in technology over the last couple years that can make the virtual filmmaking process intuitive for filmmakers, and a large part of that is the utilization of game engines, such as Unreal.” —Casey Pyke
“The new trends that people are playing with are machine learning and deep neural networks. … Given its incredible potential, everyone seems to be playing around with it. It’s only a matter of time before it gets widely used in production.” —Mathieu Leclaire
jobs. Color spaces are easier to manage thanks to better and easier-to-use standards.

JAKE MORRISON, VISUAL EFFECTS SUPERVISOR, MARVEL STUDIOS
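Leclaire’s point about deep images merging assets from different renderers comes down to a simple idea: each deep pixel stores multiple depth-sorted samples, so merging is just interleaving samples by depth and then flattening front-to-back with the ‘over’ operator. A toy sketch with made-up sample data (not any renderer’s actual deep format):

```python
# Each deep pixel is a list of (depth, premultiplied_rgb, alpha) samples.

def merge_deep(*pixels):
    """Merge deep pixels from any number of sources, nearest sample first."""
    samples = [s for px in pixels for s in px]
    return sorted(samples, key=lambda s: s[0])

def flatten(deep_pixel):
    """Composite depth-sorted samples front-to-back with 'over'."""
    r = g = b = a = 0.0
    for _, (sr, sg, sb), sa in deep_pixel:
        # accumulate premultiplied color behind what is already there
        r += (1.0 - a) * sr
        g += (1.0 - a) * sg
        b += (1.0 - a) * sb
        a += (1.0 - a) * sa
    return (r, g, b, a)

# Hypothetical samples for one pixel, from two different renders:
creature = [(2.0, (0.5, 0.0, 0.0), 0.5)]   # semi-transparent red at z=2
smoke = [(1.0, (0.0, 0.0, 0.2), 0.2),      # thin blue wisp in front
         (3.0, (0.0, 0.2, 0.0), 0.2)]      # and a green one behind
merged = merge_deep(creature, smoke)
print(flatten(merged))
```

The depth ordering is what lets elements from different sources interleave correctly without re-rendering either one.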
“Invisible effects are everywhere in the industry; they just don’t get the spotlight. However, with all invisible effects, VFX can fix small continuity errors, set issues, dress issues and even the makeup.” —Malte Sarnes
2018 needs to become the year of VFX Rapid Prototyping! As filmmakers become more accustomed to thinking about Visual Effects as an environment where they can explore design solutions rather than just executing a design that has been created by another department, we need to find quicker and more effective ways to ‘sketch’ in motion, in 3D. To illustrate the point: Creating a five-second cool-looking ‘character vignette’ that is technically bare-bones – essentially held together with digital duct-tape and glue – may well save a VFX facility the incredible pain of building and rebuilding a fully fledged industrial-scale pipeline over and over again as they try to keep up with the filmmakers’ creative exploration. The test shot/vignette doesn’t need to be long, doesn’t even need to look photoreal, but it should be crammed full of attitude with some basic FX if the character design supports it. Think of it as Concept Art in motion. When we get a good response on a vignette it’s much easier to build the pipeline – only once! – knowing we’ve cracked the big design problems without affecting scores of artists.

JONATHAN FAWKNER, CREATIVE DIRECTOR, FRAMESTORE
As audiences binge on ever-magnified spectacles of the fantastic and on ever-developing platforms, the pressure for new ideas is the greatest it has ever been. There is a clear trend to defer idea creation in the belief that the best creativity evolves over time. The danger is that the visual effects process that empowers this deferment is seen simply as technical and not creative in its own right. Time must be shared for VFX artists to gestate a creative answer to creative questions, and VFX filmmakers will find it increasingly necessary to find techniques and practices that allow the creative space to deliver volume on tighter schedules at a sustained quality.

MALTE SARNES, VISUAL EFFECTS SUPERVISOR, RISING SUN PICTURES
Expect invisible effects to make further gains. Invisible effects are everywhere in the industry; they just don’t get the spotlight. With invisible effects, VFX can fix small continuity errors, set issues, dress issues and even makeup. I’m also happy practical effects are back in fashion. Plenty of CG will ultimately be part of the final film, but the renewed reliance on practical explosions and stunts is clear. Wonder Woman and Dunkirk have that sense of grit and grime designed to make them look more hand-crafted than some of the big effects-driven blockbusters.
COMPANY PROFILE
Not too many studios can boast two decades of existence in the sometimes volatile visual effects industry. But Double Negative, founded in 1998, reaches that milestone this year, having grown from a 30-person startup to a global studio with thousands of employees. Ultimately, DNEG has emerged as one of the powerhouse visual effects studios in feature filmmaking. It is known for anchoring major releases, as well as continuing to work on a range of lower-budget films, and has now branched out into television and animated features. Along the way, the studio’s work has been honored with three Academy Awards (for Inception, Interstellar and Ex Machina). For its 20th anniversary, VFX Voice spoke to some of DNEG’s key staff for a look back at the rise of the visual effects studio.

TWO DECADES OF DNEG
DOUBLE NEGATIVE: DOUBLE DECADES OF DOUBLE POSITIVES By IAN FAILES
While it is now a massive visual effects studio, DNEG started small. It was formed largely by a group of artists who had been working together at The Moving Picture Company in London for several years. Polygram backed the group to start their own outfit, with their first project being the Vin Diesel sci-fi film Pitch Black. Pitch Black would prove to be a significant challenge for the newly formed DNEG team (who did not even have a name when they started on the film – they actually referred to themselves as ‘Newco’, as in ‘new company’, for the first few months). The studio was without a permanent base in London and had yet to purchase much-needed workstations. Also, Pitch Black was filming in Queensland, Australia, on the other side of the world. “Of course,” relates DNEG Co-founder and Visual Effects Supervisor Paul Franklin, “once we got there we had all sorts of technical problems getting our first SGI workstations set up, made all the worse by the fact that our tech team was 10,000 miles away. The Internet wasn’t up to much back then – remember ISDN lines running at a blistering 28Kbps? – so communication with the home base was pretty difficult.” Franklin suggests that “much of the ‘core DNA’ of Double Negative was laid down on Pitch Black,” which was VFX supervised by Co-founder Peter Chiang, “and still remains to this day.” Very quickly, in fact, the studio began forming key relationships with filmmakers, producers and film studios. DNEG was also well placed to benefit from a dramatic rise in film and VFX production occurring in the UK, especially with the advent of the Harry Potter films.

RISE OF UK VFX; RISE OF DNEG
As film production in the UK ramped up, so too did competition among visual effects studios, many of which happened to be located in the Soho area of London. DNEG Co-founder Alex Hope says the focus in those earlier years was to not only help grow DNEG as a business and as a creative enterprise, but also the local industry as a whole. “One of our primary aims was to get the visual effects
industry in the UK on the map,” recalls Hope. “Through that period, the Harry Potter films started being made, and we first became involved on Prisoner of Azkaban, released in 2004. Those films, and the commitment of Warner Bros., David Heyman [producer of Harry Potter films] and everyone associated with those films to want them to be done in the UK gave confidence to British visual effects companies, and gave confidence to Double Negative – that we could invest in training and R&D and capital expenditure and build our business.” Another cornerstone relationship that DNEG cultivated was with Christopher Nolan, first working with the director on Batman Begins and then several others (see sidebar). DNEG also quickly became one of the major contributors to other large franchises such as The Hunger Games series, Bond films, the DC Extended Universe and the Marvel Cinematic Universe. For that to happen, DNEG had to grow – quickly and creatively – not only in London but around the world. The studio now has locations in London, Vancouver, Mumbai, Los Angeles, Chennai and Montréal (DNEG had an office in Singapore but closed it in 2016). Another event also had the effect of expanding the studio: its merger with Prime Focus World in 2014. This global expansion certainly reflects the state of the industry – many other studios have set up in multiple countries, attracted by lower costs of production, tax credits, incentives and the availability of a near 24-hour production schedule.
OPPOSITE TOP: A shot is set up during the filming of Pitch Black. In its first film outing, DNEG collaborated closely with practical effects vendor John Cox’s Creature Workshop in delivering alien CG creatures. (Image courtesy of John Cox’s Creature Workshop)

OPPOSITE BOTTOM: Gringotts Dragon from Harry Potter and the Deathly Hallows: Part 2, a DNEG creation. By the time of this final film, the VFX studio had become one of the major vendors for the franchise. (Image copyright © 2011 Warner Bros. Pictures. All rights reserved.)

TOP: The folding Parisian landscape as seen in Inception wowed audiences and was one of many startling visual effects incorporated into Christopher Nolan’s film. (Image copyright © 2010 Warner Bros. Pictures. All rights reserved.)

BOTTOM: Although it did not achieve wide success, John Carter proved to be a major demonstration of DNEG’s animated-creatures pipeline. (Image copyright © 2012 Walt Disney Pictures. All rights reserved.)
DNEG’S CONSISTENT CREW
For a studio in operation for 20 years, a surprising number of crew who were there at the beginning or in the earliest days of production are still at the studio, and have risen to become experienced supervisors, producers and part of management. Visual Effects Supervisor Pete Bebb, for example, was DNEG’s
Creative Peaks

A look at some of DNEG’s key creative moments over the years.

Pitch Black (2000): The project that kicked it all off and showcased DNEG’s early ambitions for high-quality VFX.

Enemy at the Gates (2001): A major compositing project that pushed for a photorealistic depiction of the epic battle for Stalingrad in the Second World War.
TOP LEFT: Ex Machina was recognized with a Best Visual Effects Oscar®, perhaps surprising many after being up against a host of much larger effects films. (Image copyright © 2015 A24. All rights reserved.)

TOP RIGHT: Among DNEG’s several contributions to Blade Runner 2049 was the large hologram version of the character Joi. (Image copyright © 2017 Warner Bros. Pictures. All rights reserved.)

BOTTOM: DNEG’s Oscar® winners, in foreground (from left): Paul Franklin, Mark Williams Ardington, Andrew Whitehurst, Paul Norris, Andrew Lockley and Pete Bebb, joined by Alex Hope and Matt Holben.

OPPOSITE BOTTOM: DNEG Chief Scientist Oliver James (center), holding the BAFTA, with the Interstellar R&D team and collaborator Professor Kip Thorne, holding the Oscar®.
first runner, and later won an Oscar for Inception. Many other experienced supervisors started their careers at the studio. Co-founder Matt Holben says DNEG is firm on the importance of investing in staff: “Anybody can get investment and go and buy infrastructure and machinery and whatever, but what makes a company magic is the group of people that you have working with you – your team, who you surround yourself with.” That’s a view echoed by Peter Chiang, who has been at DNEG from the very beginning and marvels at the level of talent fostered at the studio and coming in from around the world. “We encourage individuals to pursue their goals and take on new challenges, all with a guiding hand from a team of very experienced individuals. We understand new talent may want to flex their wings and move on and try other companies, but while they are at DNEG we want them to enjoy their experience and get the most out of working on the projects.”
Harry Potter and the Prisoner of Azkaban (2004): The studio’s first entry into the Harry Potter franchise. One of the most elaborate sequences in the film was the Night Bus journey, a DNEG creation.

Batman Begins (2005): DNEG’s first collaboration with Christopher Nolan, and one in which it established a new color-management pipeline to meet the director’s expectations.

Children of Men (2006): Long takes with seamless visual effects are characteristic of this Alfonso Cuarón outing, especially DNEG’s work in helping to orchestrate the famous ‘oner’ inside the car.

Bourne films, United 93, Captain Phillips, Green Zone: These films, representing DNEG’s collaboration with director Paul Greengrass, often feature scenes with seamless visual effects that go unnoticed by audiences.
The Dark Knight (2008) and The Dark Knight Rises (2012): Solidifying DNEG’s strong association with Nolan, these two films saw the studio come on board more as a creative partner.

Inception (2010): The VFX Oscar-winning film featured a diverse set of challenges for DNEG, from flipping Paris buildings and major wire removals for floating actors to crumbling building façades for the limbo world.

John Carter (2012): A major leap forward in DNEG’s animation pipeline, involving on-set performance capture and delivering believable animated performances.

Interstellar (2014): DNEG collaborated with Professor Kip Thorne of Caltech on research and development into the visualization of black holes, taking the idea of physically plausible visual effects to new levels and earning a VFX Oscar.

Ex Machina (2015): Another Oscar winner for visual effects, Ex Machina was a much lower-budgeted film, but still benefited greatly from DNEG’s approach to storytelling.

Blade Runner 2049 (2017): The studio delivered grand views of a future Los Angeles, as well as intricately orchestrated scenes of the holographic assistant Joi, in one of the most hotly anticipated VFX films of last year.
NEW HORIZONS
DNEG is still primarily known for feature film visual effects. Its major recent projects include Dunkirk, Blade Runner 2049, Wonder Woman, Baby Driver, Pacific Rim Uprising and
74 • VFXVOICE.COM SPRING 2018
PG 72-77 DOUBLE NEGATIVE.indd 74-75
SPRING 2018 VFXVOICE.COM • 75
3/5/18 4:23 PM
COMPANY PROFILE
Creative Peaks A look at some of DNEG’s key creative moments over the years. Pitch Black (2000): The project that kicked it all off and showcased DNEG’s early plans to work on high quality VFX. Enemy At the Gates (2001): A major compositing project that pushed for a photorealistic depiction of the epic battle for Stalingrad in the Second World War.
TOP LEFT: Ex Machina was recognized with a Best Visual Effects Oscar®, perhaps surprising many after being up against a host of much larger effects films. (Image copyright © 2012 A24. All rights reserved.) TOP RIGHT: Among DNEG’s several contributions to Blade Runner 2049 was the large hologram version of the character Joi. (Image copyright © 2017 Warner Bros. Pictures. All rights reserved.) BOTTOM: DNEG’s Oscar® winners, in foreground (from left): Paul Franklin, Mark Williams Ardington, Andrew Whitehurst, Paul Norris, Andrew Lockley and Pete Bebb, joined by Alex Hope and Matt Holben. OPPOSITE BOTTOM: DNEG Chief Scientist Oliver James (center), holding the BAFTA, with the Interstellar R&D team and collaborator Professor Kip Thorne, holding the Oscar®.
first runner, and later won an Oscar for Inception. Many other experienced supervisors started their careers at the studio. Cofounder Matt Holben says DNEG is firm on the importance of investing in staff: “Anybody can get investment and go and buy infrastructure and machinery and whatever, but what makes a company magic is the group of people that you have working with you – your team, who you surround yourself with.” That’s a view echoed by Peter Chiang, who has been at DNEG from the very beginning and marvels at the level of talent fostered at the studio and coming in from around the world. “We encourage individuals to pursue their goals and take on new challenges, all with a guiding hand from a team of very experienced individuals. We understand new talent may want to flex their wings and move on and try other companies, but while they are at DNEG we want to them to enjoy their experience and get the most out of working on the projects.”
Harry Potter and the Prisoner of Azkaban (2004): The studio’s first entry into the Harry Potter franchise. One of the most elaborate sequences in the film was the Knight Bus journey, a DNEG creation. Batman Begins (2005): DNEG’s first collaboration with Christopher Nolan, and one in which it established a new color-management pipeline to meet the director’s expectations. Children of Men (2006): Long takes with seamless visual effects are characteristic of this Alfonso Cuarón outing, especially DNEG’s work in helping to orchestrate the famous ‘oner’ inside the car. The Bourne films, United 93, Captain Phillips, Green Zone: These films, representing DNEG’s collaboration with director Paul Greengrass, often feature scenes with seamless visual effects that go unnoticed by audiences.
The Dark Knight (2008) and The Dark Knight Rises (2012): Solidifying DNEG’s strong association with Nolan, these two films saw the studio come on board more as a creative partner. Inception (2010): The VFX Oscar-winning film featured a diverse set of challenges for DNEG, ranging from folding Paris cityscapes to major wire removals for floating actors and crumbling building façades for the limbo world. John Carter (2012): A major leap forward in DNEG’s animation pipeline, involving on-set performance capture and delivering believable animated performances. Interstellar (2014): DNEG collaborated with Professor Kip Thorne of Caltech on research and development into the visualization of black holes for Interstellar, taking the idea of physically plausible visual effects to new levels, and earning a VFX Oscar. Ex Machina (2015): Another Oscar winner for visual effects, Ex Machina was a much lower-budgeted film, but still benefited greatly from DNEG’s approach to storytelling. Blade Runner 2049 (2017): The studio delivered grand views of a future Los Angeles, as well as intricately orchestrated scenes of the holographic assistant Joi in one of the most hotly anticipated VFX films of last year.
NEW HORIZONS
DNEG is still primarily known for feature film visual effects. Its major recent projects include Dunkirk, Blade Runner 2049, Wonder Woman, Baby Driver, Pacific Rim Uprising and
74 • VFXVOICE.COM SPRING 2018
COMPANY PROFILE
A Formidable Team
CLOCKWISE FROM TOP: Visual Effects Supervisor Andrew Whitehurst: “One of my very fondest memories, and something which is unquestionably a perk of the job, was sitting down and sketching with Guillermo del Toro on Hellboy II. I was helping to create the Stone Giant and Guillermo had precise ideas about how he wanted the dust and debris to move, so we sat down and drew it together in my sketchpad. Moments when you get to directly create with someone of that caliber are to be treasured. And yes, I still have the drawings.” Visual Effects Supervisor Charlie Noble: “From aerial work in the highlands of Scotland to the poorest back streets of Rabat, Morocco. From US Navy warships in San Diego and Virginia to U2 spy planes in Sacramento. From Berlin to Moscow, Madrid, Tangiers, New Orleans, New York, Boston, Budapest, Sofia, Murcia, Malta and Las Vegas, I’ve seen more hotel rooms than my family would have liked, but have also seen some amazing places. I am so grateful that my work at DNEG has led me to these places and to some of the wonderful people that I’ve met along the way.” Visual Effects Supervisor John Moffatt: “The first show I worked on was Pitch Black. I earned the credit of ‘Digital Weasel’, and have many fond memories of being in late night reviews with Peter Chiang and Matt Plummer. I love creating amazing images with lovely people. That’s what we do at DNEG.” Visual Effects Supervisor Pete Bebb: “One of my stand-out experiences at DNEG was when I was working in the studio on a night shift, filming out Pitch Black. While trying to load a 1,000 ft. mag in a tiny space in the dark room I cut my finger on the film – like a paper cut but worse – and bled over the first few hundred feet of the roll. So I quite literally bled for that film. Hopefully you don’t notice it on the playback!” Global Head of Animation Robyn Luckham: “My stand-out moments at DNEG have always been with the crew. Watching people start from scratch and grow. 
The bond we create as animators makes the work stronger and makes my job a lot easier. On In the Heart of the Sea we all became ‘Masters of Whales’. I would call my team my ‘shipmates’ and spent the whole show speaking like a pirate. I even received a ‘captain’s hat’ from the crew that I would wear in dailies! Shows are amazing experiences at DNEG, and a huge part of that is the crew we get to work with.” Co-founders Matt Holben (left) and Alex Hope.
Annihilation. Plus, there’s a slew of films to come, including Avengers: Infinity War, Ant-Man and the Wasp, Deadpool 2, First Man, M:I 6 - Mission Impossible, Bohemian Rhapsody, Godzilla: King of the Monsters, Fantastic Beasts: The Crimes of Grindelwald, The Kid Who Would Be King and Venom. Like many other VFX companies, DNEG has also branched out into new areas. These include animated features – via a partnership with Locksmith Animation – and television. DNEG TV, in particular, has been one of the busiest new parts of the studio since launching in 2013, boasting credits on shows such as Altered Carbon, Inhumans, The Young Pope, Agent Carter and Black Mirror. DNEG might have grown in size and in the scale of its output, but the common theme among its crew is the level of creativity afforded to them every day, even in the sometimes cutthroat world of visual effects. “Supervising is just as enjoyable for me today as it was back then,” says Chiang, who most recently oversaw Pacific Rim Uprising. “The projects are bigger and so we have grown the company, and the result is that we can do more creative things because we have fantastic, talented crew that work hard and know what they are doing. The company has invested in a more flexible pipeline that links the whole company to maximize this creative process allowing everyone to contribute in creating great shots.”
Perhaps one of DNEG’s most successful collaborations has been with director Christopher Nolan. Starting with Batman Begins and then continuing with the Dark Knight films, Inception, Interstellar and Dunkirk, DNEG has been the director’s go-to VFX house, even though he is often known for eschewing digital effects. The reality is that DNEG’s team is just one of many effects contributors – alongside special effects, practical effects and miniatures – on Nolan’s films. “Chris has the sharpest eye of any filmmaker I have ever met,” says DNEG Visual Effects Supervisor Paul Franklin, a frequent Nolan collaborator. “The level of scrutiny he brings to bear on the work is absolutely punishing, and he makes it his business to learn as much as he can about your job, so you have to bring your A-game all the time. Chris is famous for putting as much reality on film as he possibly can. If he can get it in camera then
he will, but this also means that when he looks to the VFX team to create images for the movie he expects them to be held to the same standard; it has to look real, no matter what it is.” Franklin even teamed with Batman Begins overall VFX Supervisor Janek Sirrs to show Nolan digital versions of buildings and locations side-by-side with the real thing – on film – to help convince the director of the merits of VFX. “I think this helped to reassure Chris that we understood what he was looking for in the original photography and we went from there.” Adds Franklin: “Perhaps the most important thing I learned from my work on Chris’s films is that you should never do something for its own sake, regardless of what it is: just because you have a Steadicam on the truck doesn’t mean you have to use it if the shot works just as well on regular sticks. The same goes for VFX – just because you can do something spectacular in the computer doesn’t automatically earn it a place in the movie.”
CLOCKWISE: A scene from The Dark Knight. Although the film was a major blockbuster release, DNEG generally approached the visual effects as if they were largely invisible effects shots. (Image copyright © 2008 Warner Bros. Pictures. All rights reserved.)
Interstellar saw Nolan rely heavily on full-sized sets, miniatures from New Deal Studios, and computer simulations and visual effects by DNEG. In particular, the studio participated in the publication of new scientific papers involving the visualization of black holes. (Image copyright © 2014 Warner Bros. Pictures and Paramount Pictures Corp. All rights reserved.)
Paul Franklin (front) and fellow DNEG members on location during the production of The Dark Knight. (Image copyright © 2008 Warner Bros. Pictures. All rights reserved.)
Nolan’s latest film, Dunkirk, with visual effects supervised by Andrew Jackson, makes wide use of live-action, large-scale effects, miniatures, CG and seamless compositing from DNEG. (Image copyright © 2017 Warner Bros. Pictures. All rights reserved.)
VFX VAULT
OVER 30 YEARS, WILLOW HAS MORPHED INTO AN EFFECTS CLASSIC By IAN FAILES
All images copyright © 1988 Lucasfilm Limited. All rights reserved. TOP: Director Ron Howard on the set of Willow with actor Val Kilmer who played Madmartigan in the film. BOTTOM: Willow star Warwick Davis as Willow Ufgood, a kind but reluctant dwarf who becomes central to the story.
Thirty years ago this May, director Ron Howard and producer George Lucas teamed up to bring Willow to the big screen. The story of a dwarf who finds himself protecting a baby from an evil queen provided a classic showcase of the visual effects might of Industrial Light & Magic (ILM). Featured in the 1988 film were numerous animatronic characters, matte paintings, miniatures, miniaturization effects via oversize sets and bluescreen compositing, stop-motion animation and even rear projection. But the VFX Oscar-nominated Willow, which was supervised by Dennis Muren, VES and Michael J. McAlister, also represented a shift towards the digital realm of visual effects. This was thanks largely to ILM’s innovative 2D transformation system, MORF, used for a critical magical scene in which a series of animals transforms – one into the other, and finally into an old sorceress – in single, unbroken shots. The technique would ultimately lead to many more morphing effects in subsequent films. The film’s fantastical imagery involved a significant design effort, much of it still done traditionally with pen and paper. On its 30th anniversary, VFX Voice spoke to ILM’s Visual Effects Art Director on Willow, Dave Carson – a veteran of The Empire Strikes Back, Return of the Jedi, Titanic and later Casper, Forrest Gump and Jurassic Park – about his perspective on those design stages, the shift to digital, and the mammoth effects effort put into the film. VFX Voice: What are your memories of coming onto the film, initially? Dave Carson: At that time, I was head of the art department at ILM. My earliest recollection is sitting in the art department meetings with George and Ron, as they talked about the early drafts of the script, to give us an idea of the kind of imagery they would like to see us do. The script was still pretty fluid. They had a strong outline, but they were still open to changes as we contributed art and as they continued to have their story meetings. 
George had also brought in Moebius [pen name of French artist Jean Giraud] to do some concept art based on this early outline. I remember those pieces showing up, and they were really fun to look at. He would do these beautiful ink drawings and then he would Xerox them and color the Xeroxes. He never colored the originals. I had also just brought in a couple of new artists. One of them was a storyboard artist, Dave Lowery. There was another young man, Richard Vander Wende, who came on as a concept artist. He did these beautiful Ralph McQuarrie-esque kind of paintings. VFX Voice: What was involved in your role as visual effects art director? Carson: Apart from the initial design work, I would go to
dailies every day and Dennis would sometimes ask my opinion on how things looked, how shots looked. For some reason, early on, he decided that I should be the expert on whether the blue glow of the wand in the film was consistent from shot to shot. I don’t know how I got saddled with that chore, but that was one of the things that I was always asked about almost every daily: if the degree of blue glow on the wand was appropriate from shot to shot. VFX Voice: One of the things about Willow is that the visual effects were largely still done in the practical and optical days, but then it also saw the advent of morphing. What was the feeling there at ILM about how the effects techniques would be combined? Carson: George definitely was pushing us to do the most that we were capable of, using the technology at the time. Phil Tippett [VES] was still using traditional stop motion for the Eborsisk monster, and optical was being pushed to its limits. And the morphing thing was a huge challenge as well. The script was pushing us to do things that we maybe hadn’t done before. But, on the other hand, we had Dennis Muren [VES] at the helm, and Dennis is just the best when it comes to figuring out how to do effects. VFX Voice: What do you remember about how the morphing shots came about? Carson: I remember we had a meeting early on. It was myself, Dennis and George H. Joblove and Doug S. Kay, who were from the brand new computer graphics department. They were concerned about how they were going to do the scenes where the animals change from one to another. They had assumed that what they would do is build a computer model of each of the animals and then have those models, or the polygons that made up those models, shift from one animal to the next, which was a kind of a known technology. But what concerned them was that they weren’t sure they could make computer models of animals that looked sufficiently real. They couldn’t do hair at the time, either. 
The challenge wasn’t getting them to change from one to another, the challenge was getting them to look real enough. We were kicking things around and I said, “Well, instead of doing it as a model, what if you just did it as images?” I thought this would work because the computer knows where all the pixels are. They know the pixels that are making up each image. So the idea was, what if you changed from one image to the next, rather than the computer graphic representation of the animals? Doug and George were intrigued by that concept and they said, “Yeah, maybe that would work.” And they went back and they enlisted the help of Doug Smythe, who was in the computer graphics department. I believe they found somebody who had written a SIGGRAPH paper that had suggested kind
TOP: Director Ron Howard and producer George Lucas among actors and crew from Willow. BOTTOM: Animation Supervisor Phil Tippett, VES manipulates a stop-motion model of the Eborsisk dragon.
Making MORF
TOP: These frames show the morph between animals to final actress, a revolutionary effect that ILM achieved with its MORF system. BOTTOM: ILM camera operator Marty Rosenberg and camera assistant Patrick McArdle set up a tiger element on a bluescreen stage for part of the transformation sequence.
of a similar possible approach. They came back with this program that was our MORF system, which basically takes a grid and lays it over imagery and then just shifts the pixels around, so that all of the transformation really happens in a 2D space with no computer models. That was a big breakthrough, technologically. VFX Voice: In the film there are some sorcery and lightning scenes that utilized a lot of effects animation. How was that achieved back then? Carson: That was all hand-animated with articulated mattes on cels. I remember doing some paintings to work out the look. I took a cel background and showed the animation team how some effect might look. Wes Takahashi supervised that work. The guys in the animation department had become pretty proficient at doing all this effects stuff. Of course, now it’s all done with particle systems, but at the time it was very tedious work.
Willow’s most well-known visual effect – a magical transition sequence in which several animals such as a goat, ostrich, peacock, tortoise and a tiger blend between each other and then into actress Patricia Hayes – was also a landmark one in computer graphics and digital visual effects history. It introduced the film-going world to the concept of ‘morphing’, or then, ‘morfing’, since ILM’s toolset was called the MORF system. MORF enabled smooth transitions (‘metamorphosis’) between key features of the different animals and actress in a single shot. Its developer at ILM, Doug Smythe, drew on research done at NYIT by Tom Brigham to design the software. The two would ultimately go on to be awarded a Technical Achievement Award from the Academy of Motion Picture Arts and Sciences for MORF. Today, such a sequence would likely involve the animation of completely computer-generated animals. But at the time, crafting creatures and also dealing with fur and human hair was near impossible to achieve photorealistically. So a decision was made to mostly film articulated puppets and the actress separately on bluescreen and handle the transitions as 2D blends. MORF worked by overlaying two separate grids on two different images – say a tortoise (the source) and a tiger (the destination). “You would go to various key frames throughout your sequence and then drag the grid points around,” explains Smythe, noting that matching key features was important. 
“Then we had an alternate view where we kept the same image on the top, but then the bottom image became a timeline-type thing where you could pick any of the control dots at the corners of the mesh and see the timeline of how that would move and adjust it with a few key frame points to adjust the timing curve of how that particular point would change from the source color to the destination color.” MORF also allowed digital artists to have certain parts of the image change earlier than others, enabling body parts to shrink, grow or even disappear during the morph. It originally ran on Sun workstations connected to Pixar Image Computers. The tool became an important part of ILM’s visual effects arsenal, later finding use for the gruesome death scene of a Nazi sympathizer in Indiana Jones and the Last Crusade, several transformations in Terminator 2: Judgment Day, and transitions in Star Trek VI: The Undiscovered Country.
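The grid-warp-and-dissolve scheme Smythe describes can be sketched in a few lines. The snippet below is a loose, simplified illustration – not ILM’s actual MORF code, whose mesh interpolation and per-point timing curves were more sophisticated. Here, matched control points are interpolated toward their positions at blend time t, both images are warped toward those positions using inverse-distance weighting (an assumption standing in for the real mesh solver), and the two warps are cross-dissolved:

```python
import numpy as np

def morph_frame(src, dst, src_pts, dst_pts, t, eps=1e-6):
    """One frame of a control-point morph at blend parameter t in [0, 1].

    src, dst: grayscale images of shape (H, W); src_pts, dst_pts: (N, 2)
    arrays of matched control points as (row, col). Both images are warped
    toward the interpolated point positions, then cross-dissolved.
    """
    h, w = src.shape
    mid_pts = (1 - t) * src_pts + t * dst_pts        # point positions at time t
    rows, cols = np.mgrid[0:h, 0:w]
    grid = np.stack([rows, cols], axis=-1).astype(float)  # (H, W, 2) pixel coords

    def warp(img, from_pts):
        # Inverse warp: for each output pixel, estimate where to sample in
        # `img` by inverse-distance-weighting the per-point displacements
        # back from the interpolated positions to the original ones.
        disp = from_pts - mid_pts                                  # (N, 2)
        d = np.linalg.norm(grid[:, :, None, :] - mid_pts[None, None], axis=-1)
        wgt = 1.0 / (d + eps) ** 2
        wgt /= wgt.sum(axis=2, keepdims=True)                      # (H, W, N)
        sample = grid + (wgt[..., None] * disp[None, None]).sum(axis=2)
        r = np.clip(sample[..., 0], 0, h - 1).astype(int)
        c = np.clip(sample[..., 1], 0, w - 1).astype(int)
        return img[r, c]   # nearest-neighbour sampling keeps the sketch short

    return (1 - t) * warp(src, src_pts) + t * warp(dst, dst_pts)
```

The per-point timing curves Smythe mentions would replace the single scalar t with one blend value per control point, which is what let a body part shrink or vanish ahead of the rest of the image.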
TOP: An ILM crew member adjusts an animatronic creature to be filmed. Various elements were captured this way to be combined in the MORF system. MIDDLE: ILM camera operator Marty Rosenberg and camera assistant Patrick McArdle set up a tiger element on a bluescreen stage for part of the transformation sequence. BOTTOM: George Joblove (left) and Jonathan Luskin from ILM with elements that made up the 2D images that would be ‘morphed’ between different animals.
VFX VAULT
Making MORF
TOP: These frames show the morph between animals to final actress, a revolutionary effect that ILM achieved with its MORF system. BOTTOM: ILM camera operator Marty Rosenberg and camera assistant Patrick McArdle set up a tiger element on a bluescreen stage for part of the transformation sequence.
“The dragon, Eborsisk, was named after [well-known TV/film critics] Roger Ebert and Gene Siskel; Ron [Howard] wanted to know if it could have two heads, and Phil [Tippett] said, ‘Yeah. Why not?’”
—Dave Carson, ILM’s Visual Effects Art Director on Willow

VFX Voice: There are some fun miniaturization effects for the small Brownies characters. How was that done?

Carson: The biggest challenge was that the Brownies had to walk. So they had to work out how those Brownies, being their height, keep up with people who are much taller, and what their walking speed was. It was all shot bluescreen. It was a lot of work. I just remember them shooting day after day on the main stage at ILM. Mike McAlister worked out most of the scale and the scale-size props and the pieces that he would need for bluescreen.

VFX Voice: Are there any specific things you’d like to share about the visual effects production process on the film?

Carson: I remember the gradual simplification of the film. The original script was more challenging. For instance, there
TOP: Willow (Warwick Davis) conjures up the necessary spell with his wand to start the magical transformations. OPPOSITE TOP: The tiger element as it appears in the film before finally morphing into the sorceress. OPPOSITE BOTTOM: The morphing ends on actress Patricia Hayes playing the sorceress.
was a scene where everybody was on their way to the castle, or wherever they were going. They found a door in a mountain, and they went through this series of caves and had some escapades, and eventually encountered this dragon and had a big fight. It was a great sequence, but as time went by they lost the cave and moved the dragon to the moat of the castle. It was all simplified to a point that it lost a lot of its drama. I know Phil Tippett, VES was frustrated that his dragon, in the end, had to live in just a little moat. And the dragon had two heads because, well, I remember Ron saying, “Would it be weird if the dragon had two heads?” And the reason, of course, was that they had decided to name the bad guys after film critics. General Kael was the main bad guy and was named after Pauline Kael, a critic for The New Yorker. And the dragon, Eborsisk, was named after Roger Ebert and Gene Siskel [well-known TV/film critics]. Anyway, Ron wanted to know if it could have two heads, and Phil said, ‘Yeah. Why not?’
GAMES
The video game industry grows larger and more varied by the year, with new devices and innovative experiences bringing more and more people into gaming. Interactive entertainment is enjoying an exciting period in its continuous evolution, with console, computer and mobile games all thriving and virtual reality starting to catch on with a wider audience. Many of the year’s biggest game releases won’t be announced until the Electronic Entertainment Expo (E3) in June, but the first half of the year has some enormous games queued up – and trends that started bubbling up in 2017 are expected to be even more significant in the coming months. Here’s a look at what to expect from the games industry early this year, along with insight from veteran studio executives.

NINTENDO SWITCH SOARS
THE SWITCH, PLAYSTATION VR, 4K RESOLUTION FUEL GROWTH
By ANDREW HAYWARD
TOP: Dave Lang, Iron Galaxy BOTTOM LEFT: Adam Orth, First Contact Entertainment (Photo: Kim Fox) BOTTOM RIGHT: Allen Murray, Private Division
Nintendo had a huge hit with 2006’s Wii console, but then a massive misfire with 2012’s Wii U. Given that, expectations around 2017’s Switch console seemed tempered, but the convertible console – which can be played as a handheld or docked to run on your TV – caught fire. As of the end of 2017, the Switch had sold more than 4.8 million units in the U.S. alone, making it the fastest-selling game console of all time in the country (beating the original Wii). The worldwide tally has passed 10 million units, and critically acclaimed games like Super Mario Odyssey and The Legend of Zelda: Breath of the Wild sold millions of copies apiece. Nintendo has not yet revealed much of its 2018 slate, but the company will surely try to match up to the successes of Mario and Zelda. Meanwhile, third-party publishers jumped ship from the Wii U early on, but now they’re scrambling to get their games on the Switch. Indie developers and smaller publishers have been rapidly porting their games to the console, and we should see an outpouring of support from larger publishers as the year continues. “I think a lot of Switch titles will get announced at E3 this year from publishers who don’t traditionally work with Nintendo,” says Dave Lang, Founder and Chief Product Officer of Iron Galaxy Studios. His team worked with publisher Bethesda Softworks to bring role-playing smash The Elder Scrolls V: Skyrim to Switch late last year.

ENORMOUS SEQUELS ARRIVE
Rockstar Games’ last big title, 2013’s Grand Theft Auto V, has shipped more than 85 million copies to date – and there’s a huge amount of anticipation for the publisher’s next game, this spring’s Red Dead Redemption 2. The studio’s PlayStation 4 and Xbox One sequel to the 2010 original will provide a large, open world to explore as players control a Wild West outlaw. “Red Dead Redemption is probably my favorite game from last generation, so I can’t wait for the new one,” affirms Lang. Other anticipated sequels due out in the first half of the year include Ubisoft’s Far Cry 5 (March 27; PC, PS4, XB1), another open-world action experience, and Sony’s rebooted God of War
(spring; PS4). This new entry embraces Norse mythology as iconic warrior Kratos – the titular God himself – shepherds his son through a world of monsters. It has a revamped look and vibe, along with refreshed gameplay, and the shift should give the top-selling franchise new life. Shadow of the Colossus (February 6; PS4) is another major release for the first half of the year, but it’s not a sequel – it’s a reimagining of the beloved 2005 epic adventure, outfitting the familiar quest with a dazzling new art style and other modern enhancements. “The original is my favorite game of all time, right up there with Civilization,” says Allen Murray, Executive Producer and VP of Production at 2K Games’ new Private Division label. “It’s one of the few games that I replay and go back to every year. So I am excited to see how Bluepoint has recreated this world I know so well from the PS2 era.”

EVEN MORE BATTLE ROYALE
The biggest gaming phenomenon of 2017 was arguably PlayerUnknown’s Battlegrounds, a scrappy PC shooter that released in an unfinished beta version in March and went on to sell 24 million copies by the end of the year. PUBG, as fans know it, drops 100 online players onto an island and tasks them with trying to survive for as long as they can by outsmarting their foes. Inspired by the classic Battle Royale media franchise, it has become an unexpected sensation. PUBG launched on Xbox One at the end of 2017 and will make its way to other platforms (including mobile) in 2018, but competitors are already taking a bite out of developer Bluehole’s hit. In September 2017, Epic Games launched Fortnite: Battle Royale, a free-to-play spinoff of its own recent shooter, and notched more than 30 million players by the end of the year. Another shooter, Paladins, recently announced an upcoming PUBG-like mode, and there are other variations on the theme on mobile devices, as well. According to Adam Orth, Creative Strategist at First Contact Entertainment and former Microsoft Creative Director, we should see a lot more of that in 2018. “We will see a massive influx of games trying to cash in on the zeitgeist,” he explains. “PUBG is the new Minecraft, and several high-profile developers and publishers have either already released their competitive product – Epic Games’ Fortnite, for example – or are preparing to release their own variation on the theme.”

NEW THINGS ARE COMING
Big sequels and trend-chasing knockoffs might sound a bit tired, but thankfully, the first half of 2018 also promises some exciting new game properties. Sony’s Detroit: Become Human (spring) depicts a futuristic world in which androids serve society, but are treated like second-class citizens – until they show human emotions. The PlayStation 4 game looks like a lavish choose-your-own-adventure experience, where you’ll pick your path through tense cinematics and affect the outcome of each scenario.
TOP TO BOTTOM: Nintendo Switch with left and right Joy-Con controllers (Photo: Hosokawa Shingo) Super Mario Odyssey for Switch (Nintendo) Red Dead Redemption 2 (Rockstar Games) Far Cry 5 (Ubisoft)
Sea of Thieves (March 20) is a raucous new experience from Microsoft and developer Rare, letting online players team up as pirates to command ships, battle against other squads, and create their own shenanigans along the way. Murray pegs the Xbox One and PC game as an example of “games as a service” experiences that keep players coming back for more over time. One of the spring’s most distinctive experiences is A Way Out (March 23), Electronic Arts’ prison-break game for PS4, Xbox One and PC. The cinematic adventure can only be played with a partner, whether it’s on your couch or over the internet, and you’ll have to work together to bust out of captivity and evade capture back out in the real world. Orth says that all three games “look like really innovative new titles.”

VR IS FINDING AN AUDIENCE
TOP TO BOTTOM: God of War (Sony Interactive Entertainment) Shadow of the Colossus (Bluepoint) PlayerUnknown’s Battlegrounds (Bluehole/PUBG) Xbox One X console and controller
Modern virtual reality has seen a slow burn of adoption over the last two years. The high-end Oculus Rift and HTC Vive headsets have mostly reached early-adopter enthusiasts to date, while smartphone-powered experiences like the Samsung Gear VR and Google Daydream are fun but typically have simpler apps and games. But it’s the PlayStation VR, a headset that falls between those devices, that’s starting to make waves. Launched in late 2016, the PlayStation 4-powered PlayStation VR headset has sold more than two million units to date. It’s cheaper than PC headsets, more powerful than what a smartphone can handle, and has huge games that you won’t find on other VR devices. Sony’s Gran Turismo Sport and The Last Guardian, Capcom’s Resident Evil 7, and Bethesda’s Doom VFR and The Elder Scrolls V: Skyrim VR have driven recent excitement around the headset. Orth, who is working on the PlayStation VR-exclusive shooter Firewall: Zero Hour, believes that the headset’s sales show “no sign of slowing down,” and that those aforementioned franchise hits are “giving credibility to the tech in the form of bankable hits that will sell headsets.” The Facebook-owned Oculus will also make a big new VR play in the first half of 2018 with the Oculus Go, a self-contained VR headset that starts at $200 and doesn’t require a smartphone, console or PC to use. Oculus also has a cordless version of its Rift expected out later in the year. “With hardware, video and computing processing all lowering in cost, VR will finally be an option for the masses as well as the early adopters of the technology,” explains Orth.

“With PS4 Pro and Xbox One X both out now, games will end up supporting 4K and/or HDR that wouldn’t have before, and PC gamers will probably benefit the most from this for sure. I think near-universal adoption of non-trivial implementations of 4K/HDR/etc. will be common by year’s end.”
—Dave Lang, Founder/Chief Product Officer, Iron Galaxy Studios

4K ADOPTION EXPANDS
Microsoft’s new Xbox One X is the most powerful console on the market today, driving native and upscaled 4K and HDR experiences, with Sony’s PlayStation 4 Pro doing much the same even with less horsepower in tow. With 4K televisions dropping rapidly in price, industry executives expect wider support for 4K resolution in games in 2018 – on both consoles and high-end PCs. “I think with PS4 Pro and Xbox One X both out now, games will end up supporting 4K and/or HDR that wouldn’t have before, and PC gamers will probably benefit the most from this for sure,” says Lang. “I think near-universal adoption of non-trivial implementations of 4K/HDR/etc. will be common by year’s end.” Murray affirms that 4K resolution support will become more widespread. “The fidelity arms race never really goes away,” he says, “and we’ll see that move to VR and augmented reality where resolution is currently behind but quickly catching up. Indies will also find it necessary to increase resolution to stay competitive, even as they move away from high-fidelity photorealism towards more stylized art direction.” Furthermore, Murray suggests that more affordable motion-capture technology will open the floodgates for smaller studios to improve their games’ presentation and keep up with the industry’s giants. “The cost of motion-capture technology is decreasing, making it achievable for smaller studios – so we will see more and more use and innovation in mocap outside of the large AAA games,” he says.
TOP TO BOTTOM: Detroit: Become Human (Sony Interactive Entertainment) Sea of Thieves (Microsoft) A Way Out (Electronic Arts) PlayStation VR headset
SPRING 2018 VFXVOICE.COM • 87
3/5/18 4:24 PM
GAMES
Sea of Thieves (March 20) is a raucous new experience from Microsoft and developer Rare, letting online players team up as pirates to command ships, battle against other squads, and create their own shenanigans along the way. Murray pegs the Xbox One and PC game as an example of “games as a service” experiences that keep players coming back for more over time. One of the spring’s most unique experiences is A Way Out (March 23), Electronic Arts’ prison-break game for PS4, Xbox One and PC. The cinematic adventure can only be played with a partner, whether it’s on your couch or over the internet, and you’ll have to work together to bust out of captivity and evade capture back out in the real world. Orth says that all three games “look like really innovative new titles.” VR IS FINDING AN AUDIENCE
TOP TO BOTTOM: God of War (Sony Interactive Entertainment) Shadow of the Colossus (Bluepoint) PlayerUnknown’s Battlegrounds (Bluehole/PUBG) Xbox One X console and controller
86 • VFXVOICE.COM SPRING 2018
PG 84-87 GAMES.indd 87
Modern virtual reality has seen a slow burn of adoption over the last two years. The high-end Oculus Rift and HTC Vive headsets have mostly reached early-adopter enthusiasts to date, while smartphone-powered experiences like the Samsung Gear VR and Google Daydream are fun but typically have simpler apps and games. But it’s the PlayStation VR, a headset that falls between those devices, that’s starting to make waves. Launched in late 2016, the PlayStation 4-powered PlayStation VR headset has now sold more than two million units to date. It’s cheaper than PC headsets, more powerful than what a smartphone can handle, and has huge games that you won’t find on other VR devices. Sony’s Gran Turismo Sport and The Last Guardian, Capcom’s Resident Evil 7, and Bethesda’s Doom VFR and The Elder Scrolls V: Skyrim VR have driven recent excitement around the headset. Orth, who is working on the PlayStation VR-exclusive shooter Firewall: Zero Hour, believes that the headset’s sales show “no sign of slowing down,” and that those aforementioned franchise hits are “giving credibility to the tech in the form of bankable hits that will sell headsets.” The Facebook-owned Oculus will also make a big new VR play in the first half of 2018 with the Oculus Go, a self-
“With PS4 Pro and Xbox One X both out now, games will end up supporting 4K and/or HDR that wouldn’t have before, and PC gamers will probably benefit the most from this for sure. I think near-universal adoption of non-trivial implementations of 4K/HDR/etc. will be common by year’s end.” —Dave Lang, Founder/Chief Product Officer, Iron Galaxy Studios contained VR headset that starts at $200 and doesn’t require a smartphone, console or PC to use. Oculus also has a cordless version of its Rift expected out later in the year. “With hardware, video and computing processing all lowering in cost, VR will finally be an option for the masses as well as the early adopters of the technology,” explains Orth. 4K ADOPTION EXPANDS
Microsoft’s new Xbox One X is the most powerful console on the market today, driving native and upscaled 4K and HDR experiences, with Sony’s PlayStation 4 Pro doing much the same even with less horsepower in tow. With 4K televisions dropping rapidly in price, industry executives expect wider support for 4K resolution in games in 2018 – on both consoles and high-end PCs. “I think with PS4 Pro and Xbox One X both out now, games will end up supporting 4K and/or HDR that wouldn’t have before, and PC gamers will probably benefit the most from this for sure,” says Lang. “I think near-universal adoption of nontrivial implementations of 4K/HDR/etc. will be common by year’s end.” Murray affirms that 4K resolution support will become more widespread. “The fidelity arms race never really goes away,” he says, “and we’ll see that move to VR and augmented reality where resolution is currently behind but quickly catching up. Indies will also find it necessary to increase resolution to stay competitive, even as they move away from high-fidelity photorealism towards more stylized art direction.” Furthermore, Murray suggests that more affordable motioncapture technology will open the floodgates for smaller studios to improve their games’ presentation and keep up with the industry’s giants. “The cost of motion-capture technology is decreasing, making it achievable for smaller studios – so we will see more and more use and innovation in mocap outside of the large AAA games,” he suggests.
TOP TO BOTTOM: Detroit: Become Human (Sony Interactive Entertainment); Sea of Thieves (Microsoft); A Way Out (Electronic Arts); PlayStation VR headset.
SPRING 2018 VFXVOICE.COM • 87
[ VES SECTION SPOTLIGHT: NEW ZEALAND ]
Building on Strong Bonds of Camaraderie and Community

Much of the Visual Effects Society’s growing international presence is due to its network of Sections, whose members lend their expertise and entrepreneurship to benefit visual effects practitioners in their region while advancing the Society and industry worldwide. Founded in 2011, the New Zealand Section is thriving with more than 160 members, its vibrancy due to the area’s close-knit visual effects community and impassioned leadership at the helm.

The Wellington-based New Zealand Section resides in a country of stunning natural beauty and dynamic Māori culture, in a city that defines itself as a family town. The Board of Managers believes that its success is largely due to its differences from many of its metropolitan counterparts. “We’re not a traditional Section – we’re a family of members,” says Emma Clifton Perry, former Section Chair. “We have a lot of VFX practitioners who migrate here for the work, as well as other longstanding community members who have been here since before The Lord of the Rings. We always try to have events and children’s film screenings that are accessible to families who see the VES as the all-encompassing hub of their professional lives and social networking. How many Sections have a kid-friendly holiday party? We do and we love that. Nurturing these bonds among our fellow professionals is an important factor in our planning so that we continue to build camaraderie and rich community.”

In a country of fewer than five million people and several thousand visual effects practitioners, the New Zealand Section counts almost 10% of the workforce among its members. Especially in Wellington, the VES is the primary VFX industry organization.
When the Section was first established, the majority of its members were from the Weta Group of Companies, a national point of pride and a major employer in the visual effects community, but that has shifted in recent years as other smaller facilities and VR companies have emerged in the VFX community and joined the Society. The Section’s membership is primarily composed of VFX practitioners working in feature film and animation, some with a background in commercials and television, and many whose career trajectories began in the gaming community. The Board of Managers has made it a priority to encourage more educators and
multidisciplinary experts to engage as members and leaders. “We are fortunate to have the active involvement of diverse professionals who see things from another perspective and add to our visual effects proficiency,” states Clifton Perry. “Lance Lones (Chief Scientist at L2VR Ltd.) and Douglas Easterly (Associate Professor, Head of the School of Design at Victoria University of Wellington) are prime examples of leaders who have greatly enriched our programming by contributing their time, talent and resources.”

Easterly helped create a series at Victoria University, entitled “An Audience With,” which showcased innovative thinkers – open not only to VES members, but also to the local student population. This is emblematic of the Section’s commitment to developing the next generation of VFX artists. “We see it as our responsibility and our opportunity to share our passion and insights with rising professionals to sustain and grow our industry,” explains Clifton Perry. “A lot of our members are industry leaders, and students having access to their expertise and mentorship is priceless.”

At its core, the New Zealand Section has created a diverse roster of programs and special events designed to serve its members by fostering professional development, education, networking – and recognition of the vast spectrum of visual effects talent in the region – while adding its unique flair in sync with the culture of the community. The Section is grateful to receive significant support from Weta Digital and Weta Group as a sponsor, including lending their facilities for film screenings and events. Several years ago the Section launched Teq Talks, a tequila-infused version of innovation jams where technology, entertainment and design converge at a local Mexican restaurant.
These seven-minute talks have featured an eclectic array of creative thinkers, including a professional ballerina from the New Zealand Ballet, a mo-cap performer, creators of iPad apps and new startups, VFX artists and experts in astronomy and astrophotography. The events were open to the public as a means to share the wealth of knowledge and cement relationships in the community. The popular series has intrigued other Sections, and New Zealand is eager to share its lessons learned throughout the global VES community. There is a Sister City relationship between Wellington and San Francisco, both innovative and socially progressive multicultural cities with great love for their natural environment and built heritage. “Just as the cities are exploring a closer relationship, we are working to foster that kind of partnership as well, particularly
“We’re not a traditional Section – we’re a family of members. We have a lot of VFX practitioners who migrate here for the work, as well as other longstanding community members who have been here since before The Lord of the Rings.” —Emma Clifton Perry, former Section Chair
“We are fortunate to have the active involvement of diverse professionals who see things from another perspective and add to our visual effects proficiency … and who have greatly enriched our programming by contributing their time, talent and resources.” —Emma Clifton Perry, former Section Chair
By NAOMI GOLDMAN
TOP: The Boat Sheds at Wellington’s iconic Oriental Parade. (Photo by James Ogle, New Zealand Section Board member) BOTTOM: New Zealand Section Board members Jason Galeon, Georg Duemlein and local VES members at Teq Talks.
TOP LEFT: Director Christian Nicholson and New Zealand Section Board member Jason Galeon at the VES preview screening of This Giant Papier Maché Boulder Is Actually Really Heavy. TOP RIGHT: New Zealand Section Board member Fabiano Petroni and NASA astronaut Dr. Yvonne Cagle at the VES NZ NASA presentation with Q&A. MIDDLE: New Zealand Section Board member Jason Galeon and local VES members at the 2017 mixer. BOTTOM: Former Chair, current New Zealand Section Board member and member of the Board of Directors Emma Clifton Perry and local VES members at the 2017 mixer.
[ VES SECTION SPOTLIGHT: NEW ZEALAND ]
“We see it as our responsibility and our opportunity to share our passion and insights with rising professionals to sustain and grow our industry. A lot of our members are industry leaders, and students having access to their expertise and mentorship is priceless.” —Emma Clifton Perry, former Section Chair
“We always try to have events and children’s film screenings that are accessible to families who see the VES as the all-encompassing hub of their professional lives and social networking. How many Sections have a kid-friendly holiday party? We do and we love that. Nurturing these bonds among our fellow professionals is an important factor in our planning so that we continue to build camaraderie and rich community.” —Emma Clifton Perry, former Section Chair
with the San Francisco Bay Area Section,” says Clifton Perry. “We are already close to the Australia Section and are branching out to share what we have learned with our colleagues worldwide.”

The New Zealand Section exemplifies a welcoming atmosphere. It benefits from directors, producers and industry practitioners who come through the area to shoot or scout locations, and brings them in for intimate Q&As whenever possible. Speakers have included Luc Besson (Lucy, Valerian and the City of a Thousand Planets) and Dean DeBlois (How to Train Your Dragon franchise), who shared personal stories about the evolution of their careers, their approach to filmmaking and their passion projects. Rob Cavaleri, CG Supervisor at Blue Sky Studios, talked about the artistic and technical challenges his team faced on The Peanuts Movie in bringing the classic pen lines of Charles Schulz to the big screen. And Director of Weta Digital and VFX master Joe Letteri, VES, has given talks about using technology to create unforgettable worlds and CG characters that speak volumes about our humanity. A standout for the group was Letteri sharing recollections about the “early days” on Jurassic Park.

And last year, the group enjoyed an out-of-this-world experience with two NASA staffers who journeyed to New Zealand for the Space and Science Festival in Wellington. The Section held a meet & greet with Astronaut Dr. Yvonne Cagle and Mars Curiosity research scientist Dr. Jen Blank, who shared their experiences training for space missions, their views on technological and scientific advancements, and their approaches to out-of-the-box creative thinking and problem solving.

Moving forward, the New Zealand Section is focused on broadening its outreach – both geographically, to areas outside Wellington as an ‘entire country Section,’ and professionally, to VFX talent at emerging companies. It also would like to bring back the popular Teq Talks series and build on its roster of educational
programs for its members and burgeoning VFX artists. “I’m enormously proud to be a part of this energized organization,” comments Clifton Perry. “From our Section leadership under our first Chair, Matt Aitken, to our hard-working Board of Managers, to a thriving membership, we are very lucky to be a part of this community. Joe Weidenbach and I have the privilege of serving on the overarching VES Board of Directors, representing our membership and giving voice to ensure equal opportunities for members worldwide. We have can-do, proactive people with real passion for the VES – it shows and has enabled our Section to shine.”

TOP LEFT: Former Chair and current New Zealand Section Board member Emma Clifton Perry and former Board member Charles Armstrong greeting guests at the Winter Mixer at the Roxy Cinemas. MIDDLE: VES members at the Winter Mixer at the Roxy Cinemas. BOTTOM: Oriental Parade, Wellington. (Photo by James Ogle, New Zealand Section Board member)
TOP: New Zealand Section Board member Jason Galeon and guest speaker at Teq Talks at Boca Loca. MIDDLE: VES members at Teq Talks at Boca Loca. BOTTOM: New Zealand Section Board member Georg Duemlein, right, and attendee at Teq Talks at Boca Loca.
[ VES NEWS ]
VES New York Celebrates Acclaimed VFX Supervisor
By NAOMI GOLDMAN

On March 2, the VES New York Section presented the 2018 VES Empire Award to acclaimed Visual Effects Supervisor Lesley Robson-Foster at its fourth annual VES New York Awards Celebration. Revered by broadcasters and directors and admired by the VFX industry, Robson-Foster has pushed the envelope in some of the most prestigious television series of recent decades. She has shared her talents in New York and Los Angeles on television series and feature films, including Six Feet Under, Sex and the City, The Wire, Ugly Betty, the HBO mini-series Mildred Pierce, The Great Gatsby, Boardwalk Empire, The Knick, Mosaic and Logan Lucky. Currently, she is working with Amy Sherman-Palladino on Amazon Prime Video’s The Marvelous Mrs. Maisel and on the upcoming movie based on the Pulitzer Prize-winning book The Goldfinch.

The Section established the Empire Award in 2015 to recognize a New York-based visual effects professional who has made enormous contributions to the field of visual arts and filmed entertainment, and whose work embodies the spirit of New York. The regional celebration of New York’s VFX community and VES Award winners around the globe was an instant success and has since become a must-attend annual affair.
TOP: Left to right: Randy Balsmeyer, Visual Effects Supervisor and Title Designer; Eric Roth, VES Executive Director; Lesley Robson-Foster, VFX Supervisor and 2018 New York Empire Award honoree; Karl Coyner, Co-Chair, VES New York Section, Producer at Shade VFX; Leslie Chung, Co-Chair, VES New York Section, Lead Compositor at Crafty Apes; and Andrew Bly, Event Chair, Board member of VES Board of Directors, Co-founder of The Molecule. BOTTOM: Randy Balsmeyer, Visual Effects Supervisor and Title Designer (and inaugural recipient of the Empire Award) presents the 2018 VES New York Empire Award to acclaimed VFX Supervisor, Lesley Robson-Foster.
VES BOARD OF DIRECTORS OFFICERS 2018; MIKE CHAMBERS RE-ELECTED BOARD CHAIR

The VES Board of Directors officers for 2018, who comprise the VES Board Executive Committee, were elected at the January 2018 Board meeting. The officers include Mike Chambers, who was re-elected as Board Chair. “We are fortunate to have such esteemed leadership represented on the Executive Committee,” said Eric Roth, VES Executive Director. “Collectively, these talented professionals bring passion, diverse experience and enormous commitment to our organization.”

The 2018 Officers of the VES Board of Directors are:
Chair: Mike Chambers
1st Vice Chair: Jeffrey A. Okun, VES
2nd Vice Chair: Kim Lavery, VES
Secretary: Rita Cahill
Treasurer: Dan Schrecker

Board Chair Mike Chambers has been a member of the VES for 20 years, serving multiple terms on the Board of Directors. This is his fourth term as Chair, and he previously served on the Executive Committee as both Vice Chair and Secretary of the organization. He is also a member of the Producers Guild of America and The Academy of Motion Picture Arts & Sciences. A freelance visual effects producer and independent VFX consultant specializing in large-scale feature film productions, Chambers is currently working on an unnamed project for Warner
Bros. and last worked on Dunkirk, his third collaboration with esteemed director Christopher Nolan. He has contributed to the visual effects efforts on many Academy and BAFTA award-winning films. Chambers has won VES Awards for Best Visual Effects, for The Day After Tomorrow and Inception, and Best Supporting Visual Effects for Dunkirk. He was also nominated for his work on I Am Legend.
TOP: Nearly 200 gathered to enjoy the 4th Annual VES New York Awards Celebration on March 2, 2018 at the Green Building in Brooklyn. (Photos courtesy of Julius Constantine Motal)
VISUAL EFFECTS SOCIETY WELCOMES TWO NEW SECTIONS

The VES is proud to welcome its two newest Sections, in India and France. These Sections join the VES’ dynamic worldwide community, which has Sections in Australia, San Francisco’s Bay Area, Germany, London, Los Angeles, Montreal, New York, New Zealand, Toronto, Vancouver and Washington state. Many among the VES leadership regard the creation of the Sections as one of the organization’s biggest achievements to date, if not the biggest. The aim is always for members to feel that, no matter their location, they are part of a unified community, a VFX cloud clubhouse.

“We are thrilled to welcome our 12th and 13th Sections in India and Paris, France,” said Mike Chambers, VES Board Chair. “Our determination to outreach to all corners of the globe and to all of the disciplines across the VFX spectrum has yielded us a very rich, talented membership, and that commitment to diversity will continue to be a driving force of the organization.”

Both Sections have a unique and timely opportunity to build a rich network of VFX artists and innovators, and both have just elected their inaugural Boards of Managers.
[ V-ART ]
Alex Nice

V-Art showcases the talents of worldwide VFX professionals as they create original illustrations, creatures, models and worlds. If you would like to contribute to V-Art, send images and captions to publisher@vfxvoice.com.

Alex Nice (www.alexnice.com) is a concept illustrator, art director and matte painter who has worked on films such as The Hunger Games: Mockingjay – Part 2, Hugo, Sin City: A Dame to Kill For and The Jungle Book. More recently, he has lent his services to art direction for AR and VR projects, and to a number of upcoming films. Here, Nice shares several personal artwork projects that focus on visual storytelling.
TOP RIGHT: I’ve spent a good portion of my career as a matte painter for film and I’ve done plenty of city paintings, but most of them have been aerials for some reason. For this one, I wanted to do a futuristic cityscape viewed closer to ground level. I learned a lot doing this piece.
TOP LEFT: I was really inspired by the look of the environments in the latest Mad Max film, so I did this quick color study concept piece. Here I have marauders attempting to take a massive rusted desert stronghold.
MIDDLE LEFT: This second concept piece was done for a real-time VR experience I am currently working on in my spare time just for fun. I thought it would be cool to interactively mine an asteroid out in the dark, isolated Kuiper Belt at the edge of our solar system. I’m currently in the middle of developing this story further.
MIDDLE: I made these character designs using photos I took of human organs at the “Body Worlds” human anatomy exhibit. For these guys, I photo-bashed images of real human organs such as livers, tongues, lungs, etc., to come up with some interesting character designs. It was a fun challenge transforming my photos of gross parts of human anatomy into this ragtag group of misfits that I call “The Uglys”.
MIDDLE RIGHT: For this piece, I mixed 2D techniques and 3D array tools to create the dark structures in the distance. I wanted to challenge myself and see if I could create an ominous scene without going dark.
BOTTOM: “Oh Noooooooo!!” ... a quick, ridiculous illustration I did earlier this year of the moon colliding with our beloved Earth. I find the idea of an astronaut watching a disaster like this from space to be kind of so sad, it’s funny.
TOP RIGHT: VES member Alex Nice
BOTTOM: “Detection of mini black holes at the LHC could indicate parallel universes in extra dimensions.” ... You idiots! What have you done?!
[ FINAL FRAME ]
2001: A Space Odyssey – The Jupiter Machine
It has now been 50 years since the release of Stanley Kubrick’s sci-fi classic 2001: A Space Odyssey, a film that represented several landmark achievements in visual effects. Among them were the methods that Special Photographic Effects Supervisor Douglas Trumbull, VES – one of four credited on the 1968 film – devised to realize shots of the planet Jupiter. For that, Trumbull conceived ‘The Jupiter Machine,’ a take on the slit-scan technique also used in the film for the famous stargate sequence. The Jupiter incarnation involved a device built around a revolving ball with a thin, white slit, onto which two projectors – also revolving around the ball – projected flat Jupiter artwork representing the north and south hemispheres of the planet. The various mechanics and gears of the Machine, combined with lengthy exposure times, produced a realistic final result: a revolving gaseous planet.
TOP: A final result, including Jupiter’s distinctive red spot. BOTTOM: The machine in operation. Importantly, the exposures were acquired in a completely darkened room in which only the projected slits of the painted artwork were visible to the camera.
Photos courtesy of Douglas Trumbull, VES