VFX Voice Spring 2021


GALAXY’S EDGE VR

ON-SET SUPERVISORS • WANDAVISION • NO TIME TO DIE • FOR ALL MANKIND • PROFILES: GUILLAUME ROCHERON & JENNIFER BELL



[ EXECUTIVE NOTE ]

Welcome to the Spring issue of VFX Voice! This issue marks four years since we launched VFX Voice, and we cannot thank you enough for your enthusiastic support as a part of our global community. We’re proud to keep shining a light on outstanding visual effects artistry and innovation worldwide and lift up the creative talent who never cease to inspire us all.

In this issue, our cover story goes inside the making of Star Wars: Tales from the Galaxy’s Edge VR, an exciting new way to embrace and experience the Star Wars saga. We sit down with Oscar-winning Visual Effects Supervisor Guillaume Rocheron and Universal’s head VFX executive Jennifer Bell. We delve into hot industry trends in AI, Machine & Deep Learning and the latest in virtual production workflows, and share firsthand experiences from on-set supervisors, as well as new creative tools driving artists’ passion projects.

We ride the TV/streaming wave, shining a light on Marvel miniseries WandaVision, Apple+ space tale For All Mankind, HBO’s horror-drama Lovecraft Country and National Geographic’s astronaut chronicle The Right Stuff. Read on for big-screen profiles on the latest James Bond adventure No Time to Die, the video game adaptation of Monster Hunter and the cross-cultural influences of Disney’s animated epic Raya. And we continue to bring you tech & tools you can use, including expert tips on creating a waterfall simulation, and guidance from the award-winning VES Handbook of Visual Effects.

It’s all here. VFX Voice is proud to be the definitive authority on all things VFX. And we continue to bring you exclusive stories between issues that are only available online at VFXVoice.com. You can also get VFX Voice and VES updates by following us on Twitter at @VFXSociety.

Lisa Cooke, Chair, VES Board of Directors

Eric Roth, VES Executive Director



[ CONTENTS ]

FEATURES

VFX TRENDS: ON-SET SUPERVISORS – Four visual effects supervisors share their on-set experiences.
VFX TRENDS: AI, MACHINE & DEEP LEARNING – Streamlining the creative and technical processes.
TV/STREAMING: FOR ALL MANKIND – Season 2 of the Apple+ alt-space-race saga rewires history.
TV/STREAMING: WANDAVISION – Marvel Universe characters step into a classic TV sitcom.
PROFILE: GUILLAUME ROCHERON – Oscar-winning supervisor’s lifelong pursuit of the perfect image.
FILM: NO TIME TO DIE – Effects carry 25th Bond thriller in Daniel Craig’s fiery exit.
PROFILE: JENNIFER BELL – Universal’s VFX leader has guided studio’s all-time classics.
VFX TRENDS: VIRTUAL PRODUCTION – Studios implement new virtual production workflows.
ANIMATION: RAYA – Southeast Asian culture inspires Disney fantasy adventure.
VFX TRENDS: THE CREATIVE IMPERATIVE – Content demand, more tools spur artists’ personal projects.
COVER: GALAXY’S EDGE VR – Star Wars VR opts for freedom to explore beyond script.
TECH & TOOLS: SIMULATING A WATERFALL – How an artist can craft a waterfall in both Houdini and Bifrost.
FILM: MONSTER HUNTER – Video game adaptation creates hostile world of massive creatures.
TV/STREAMING: THE RIGHT STUFF – Meeting the challenge of blending history and modern filmmaking.
TV/STREAMING: LOVECRAFT COUNTRY – Horrors of racism unleash supernatural beasts and historical evils.

DEPARTMENTS

EXECUTIVE NOTE
THE VES HANDBOOK
VES NEWS
FINAL FRAME: JAMES BOND

ON THE COVER: C-3PO in the Star Wars: Tales from the Galaxy’s Edge VR experience. (Image courtesy of Lucasfilm and ILMxLAB)



SPRING 2021 • VOL. 5, NO. 2

WINNER OF THREE FOLIO AWARDS FOR PUBLISHING EXCELLENCE.

VFXVOICE

Visit us online at vfxvoice.com

PUBLISHER: Jim McCullaugh (publisher@vfxvoice.com)
EDITOR: Ed Ochs (editor@vfxvoice.com)
CREATIVE: Alpanian Design Group (alan@alpanian.com)
ADVERTISING: publisher@vfxvoice.com
SUPERVISOR: Nancy Ward

CONTRIBUTING WRITERS: Ian Failes, Naomi Goldman, Trevor Hogg, Kevin H. Martin, Chris McGowan, Igor Zanic

ADVISORY COMMITTEE: David Bloom, Andrew Bly, Rob Bredow, Mike Chambers, Lisa Cooke, Neil Corbould, VES, Irena Cronin, Paul Debevec, VES, Debbie Denise, Karen Dufilho, Paul Franklin, David Johnson, VES, Jim Morris, VES, Dennis Muren, ASC, VES, Sam Nicholson, ASC, Lori H. Schwartz, Eric Roth

VISUAL EFFECTS SOCIETY
Eric Roth, Executive Director

VES BOARD OF DIRECTORS
OFFICERS: Lisa Cooke, Chair; Emma Clifton Perry, 1st Vice Chair; David Tanaka, 2nd Vice Chair; Jeffrey A. Okun, VES, Treasurer; Gavin Graham, Secretary
DIRECTORS: Jan Adamczyk, Neishaw Ali, Laurie Blavin, Kathryn Brillhart, Nicolas Casanova, Bob Coleman, Dayne Cowan, Kim Davidson, Camille Eden, Michael Fink, VES, Dennis Hoffman, Thomas Knop, Kim Lavery, VES, Brooke Lyndon-Stanford, Josselin Mahot, Tim McGovern, Karen Murphy, Janet Muswell Hamilton, VES, Maggie Oh, Susan O’Neal, Jim Rygiel, Lisa Sepp-Wilson, Bill Villarreal, Joe Weidenbach, Susan Zwerman, VES
ALTERNATES: Colin Campbell, Himanshu Gandhi, Bryan Grill, Arnon Manor, David Valentin, Philipp Wolf

Visual Effects Society, 5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411
Phone: (818) 981-7861 • visualeffectssociety.com

VES STAFF: Nancy Ward, Program & Development Director; Ben Schneider, Director of Membership Services; Jeff Casper, Manager of Media & Graphics; Colleen Kelly, Office Manager; Debbie McBeth, Global Coordinator; Jennifer Cabrera, Administrative Assistant; P.J. Schumacher, Controller; Naomi Goldman, Public Relations

Follow us on social media.

Tom Atkin, Founder
Allen Battino, VES Logo Design

VFX Voice is published quarterly by the Visual Effects Society. Subscriptions: U.S. $50; Canada/Mexico $60; all other countries $70 a year. See vfxvoice.com. Advertising: Rate card upon request from publisher@vfxvoice.com or VFXVoiceAds@gmail.com. Comments: Write us at comments@vfxvoice.com. Postmaster: Send address changes to VES, 5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411. Copyright © 2021 The Visual Effects Society. Printed in the U.S.A.



VFX TRENDS

RISE OF THE ON-SET VFX SUPERVISOR By IAN FAILES

TOP: Adam Rowland holds a chrome ball and color chart for reference during filming of a VFX shot for An American Pickle. (Image courtesy of Adam Rowland)
OPPOSITE TOP: Scenes featuring Pogo, a CG character crafted by Weta Digital, in The Umbrella Academy Season 2 were filmed with a stand-in and the necessary chrome ball and color chart reference, here held by Visual Effects Supervisor Jesse Kawzenuk. (Image courtesy of Jesse Kawzenuk)
OPPOSITE BOTTOM: Shooting aerial and ground-level views of Mother flying, via wirework, was a key component of Raised by Wolves’ on-set visual effects challenges, overseen by Sean Stranks. (Image copyright © 2020 HBO)

With the growing number of visual effects shots and the complexity inherent in visual effects production, there has been a rise in the number of ‘specialized’ visual effects supervisors who work specifically on set. Typically operating hand-in-hand with a VFX data wrangler and a production visual effects supervisor – and sometimes taking on either or both of those mantles themselves – these on-set VFX supervisors must always be on top of busy and ever-changing shoots, carefully balancing the wrangling of crucial VFX data with allowing for a smooth overall production. Here, four visual effects supervisors with significant recent on-set experience share their stories from shoots on projects including Dickinson, The Umbrella Academy, An American Pickle and Raised by Wolves.

REFLECTING ON THE ROLE OF AN ON-SET SUPE

The Molecule’s Charlotta Forssman has worked as a visual effects supervisor on many television series, most recently Seasons 1 and 2 of Dickinson. Collaboration, she says, is one of the most important facets of her role on set.



“What I really love about my job is the collaboration I have with all of these different department heads. On Dickinson, it was really fun to divide and conquer with the production designer in terms of what they’re going to build as a world and where we were going to take over.” Another key part of Forssman’s on-set VFX supervision role is the technical aspects related to data capture, including photographic reference, LiDAR, lenses, takes and a wealth of other things. “You need to become best friends with the camera crew so that they give you all the measurements you need,” suggests Forssman, “and of course grab what you can yourself. From the set, I want the post producers, the editors and the studio heads – whoever it may be – to know that they have me on the ground as their eyes.”

The VFX duties on Dickinson include realizing the period aspect of the show – it is set in the 19th century, so it requires era-appropriate buildings and environments to be built as practical and CG sets. For aspects like this, Forssman advises that on-set VFX supervisors should consider utilizing tools that are readily available to them. “There’s an app called OldNYC, which is great if you’re shooting period stuff in New York City. It’s a map app, where, say if you’re standing on a particular street corner in New York, then you can bring up what old buildings used to be at that corner. I’ve impressed a few people on scouts with that app.”

CHIMPANZEES, SNOW AND TALKING FISH HEADS: UMBRELLA ACADEMY

Having worked as a data wrangler with The Umbrella Academy Production Visual Effects Supervisor Everett Burrell on Season 1 of the show, Jesse Kawzenuk was elevated to Associate Visual Effects Supervisor on Season 2. The work on Season 2 was varied and complex. “We would get hard notes for every VFX shot which would include camera measurements – height, tilt, focus and all the camera settings,” shares Kawzenuk. “For bigger CG-type shots, we bust out our chrome ball/gray ball, also known as the ‘CBGB.’ This along with an HDRI gave us our lighting reference, which helps our artists months later bring Pogo or AJ Carmichael to life. Everett was ultimately the mastermind behind everything, and we were there to make sure he and the folks in post got everything they needed.”

In terms of the most complex on-set shots for Season 2, Kawzenuk identifies scenes involving the intelligent chimpanzee Pogo. “Our Pogo days were always big! Brandon Haber, our top-tier data wrangler, and I were constantly tweaking our practice to be more efficient. Re-configuring the database, setting up the witness cameras to sync with the main unit camera, building a rig to hold our Pogo model, with the ball and color chart – dividing and conquering to get what we needed.”

Another complex aspect of the shoot related to unforeseen snow that fell at the location for the season’s final episode, but then melted just as quickly. “A decision to use the practical snow quickly changed to adding VFX snow,” details Kawzenuk. “Luckily the special effects team was incredible and we were able to pull together whatever SPFX snow and real snow we had to contain the action within the frame for the day. Snow was melting before our eyes; we were using wheelbarrows to shovel snow into the frame. Everett went to work, and the artists that spent countless hours on these shots made it seamless – it turned out great!”
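For readers curious about the mechanics: the chrome ball/gray ball plus HDRI workflow Kawzenuk describes typically means shooting a handful of bracketed exposures on set and merging them into a single radiance map for the lighting artists. A minimal sketch using OpenCV – the file names and shutter times are illustrative placeholders, not anything from the production:

```python
import cv2
import numpy as np

# Bracketed exposures of the set, darkest to brightest (placeholder names)
files = ["set_bracket_0.jpg", "set_bracket_1.jpg", "set_bracket_2.jpg"]
images = [cv2.imread(f) for f in files]

# Shutter time of each bracket, in seconds (example values)
times = np.array([1 / 1000, 1 / 60, 1 / 4], dtype=np.float32)

# Estimate the camera response curve, then merge into a radiance map
response = cv2.createCalibrateDebevec().process(images, times)
hdr = cv2.createMergeDebevec().process(images, times, response)

# A float .hdr file usable as an image-based lighting environment
cv2.imwrite("lighting_reference.hdr", hdr)
```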

SEEING DOUBLE FOR AN AMERICAN PICKLE

OPPOSITE TOP LEFT: The Molecule Visual Effects Supervisor Charlotta Forssman on the set of Dickinson. (Image courtesy of Charlotta Forssman)
OPPOSITE TOP RIGHT: Forssman on set for the shooting of a VFX shot in Ballers. (Image courtesy of Charlotta Forssman)
OPPOSITE BOTTOM: Jesse Kawzenuk worked closely with Visual Effects Supervisor Everett Burrell on The Umbrella Academy; Burrell is seen here holding a stand-in for AJ Carmichael. (Photo: Christos Kalohoridis. Copyright © 2020 Netflix)
TOP: Classic on-set VFX duties on The Umbrella Academy included managing bluescreen shoots and other effects plates. (Photo: Christos Kalohoridis. Copyright © 2020 Netflix)

Adam Rowland was the Overall Visual Effects Supervisor on An American Pickle, a role that involved a significant presence on set to make the twinning shots of the film (where actor Seth Rogen played two characters) possible. One of the toughest aspects of the on-set work for An American Pickle proved to be element shoots for face replacement shots, such as one that sees the two characters played by Rogen escape on an electric scooter. Here, Rowland decided that the best way to acquire an accurate element was to combine the opposing movements of both the scooter and camera and build them into a single moving camera, also allowing Rogen to be kept stationary and for full control of the lighting.

“We spent the morning setting up the camera and dolly, a greenscreen behind an array of 3D tracking markers, and changeable light-cover,” explains Rowland. “We also set multiple eyelines for Seth to hit in a specific timing so that his head would pivot and turn in an identical manner to the double whose head we were replacing.

“We worked with a stand-in,” Rowland continues, “finessing the eyelines, rehearsing the camera move and making incremental alterations throughout the morning. By the time Seth stepped on set in the afternoon, we were ready to take him through what was required of his performance, and managed to get the whole thing shot in a few takes. It’s probably worth mentioning that this shot never made it into the final film! It was temped up for a test screening, and looked very promising, but ultimately the whole scene was omitted when the ending of the film was changed.”

DRONES, FLYING ANDROIDS AND CREATURE FX ON RAISED BY WOLVES

TOP: Forssman takes reference photography during the making of Ballers. (Image courtesy of Charlotta Forssman)
BOTTOM: Greenscreen plate and final shot by Nviz for the Seth Rogen ‘twinning’ in An American Pickle. (Image courtesy of Adam Rowland)

For Raised by Wolves, on-set Visual Effects Supervisor Sean Stranks, who collaborated with Production VFX Supervisor Raymond McIntyre Jr., spent considerable time on location in South Africa. He says aerial shooting became a vital part of the on-set work, since the main character, Mother, often headed skyward. “There was a day where we needed to shoot some drone plates for a shot of Mother flying, and we wanted to do some big orbits around to find the best way to shoot her. So I put a big white tracking ball about the size of a grapefruit on the top of my drone. Then the drone guys would follow me as if my drone was Mother flying through the air. As long as they could see my tracking drone, they had something to shoot towards – that was an absolutely fun day.”

Raised by Wolves also features a number of creature shots; these were predominantly realized with a stand-in suit performer to aid in actor integration and eyelines. Stranks had to ensure what was acquired on set would work for the final CG creature shots. “We had a great performer who was very nimble; he was able to move around but he wasn’t as large as the creatures actually were. But what it did was get us some great camera reference. We shot him doing all this stuff, and then we shot clean plates within that same thing. Every shot is different when it comes to that kind of VFX. If you can have somebody come in and do those actions with the actors, it’s fantastic because it’s always tough to just pull it out of thin air otherwise.”



On-set Essentials for On-set Supervisors

Every visual effects supervisor tends to have their own preference about what should go in their on-set kit bag. Amongst the common bag inclusions are, of course, DSLRs, iPhones and iPads for photography, data collection and even rudimentary LiDAR; an accompanying plethora of apps; Ricoh Theta and Insta360 cameras for fast HDRIs; and a range of gray balls/light probes and color charts. Here’s what else our four VFX supervisors suggest are must-haves when working on set.

Charlotta Forssman: “Having a GoPro has saved me so much sometimes when it comes to reflection passes. I had a recent show that I went on and we were shooting these driving plates, and when we shot the greenscreen we saw the rear-view mirror. It would have been a whole thing to turn the cameras around and shoot it. So I was able to just stick a GoPro on the side of the mirror and shoot really nice reflection plates.

“And then there’s clothing: don’t get cold. I remember one time I was on set very early on, and it was one of the coldest days in New York City, and I was freezing, and the DP was like, ‘Visual effects supervisors are always a little underdressed,’ and I will never forget that. And I am always prepared for snow – extra socks and all that stuff so that I’m never the VFX supervisor who is underdressed.”

Jesse Kawzenuk: “The most valuable item in my kit is my belt, a simple first AC pouch that holds my iPad, tape, pens, small slate for photography, disto – everything. Some wranglers try to fit it all in their pockets or go minimalist and it just doesn’t work. Back when I was a VFX PA, a supervisor asked me for a piece of tape and I responded, ‘One second, it’s back on the cart,’ and by the time I had returned with the tape, he had already found it elsewhere. He looked at me and said, ‘Don’t let me catch you without tape ever again.’

“To this day, if VFX is on the board, the belt goes on. Sets can often resemble a battleground; it moves fast and no one has time to wait. You want to be prepared for anything at any time. VFX requires you to be thorough on the day because that particular set/prop/actor may not be there tomorrow.”

Adam Rowland: “Ordinarily, throughout the shoot we compile a FileMaker database of all the VFX shots and camera information. I have also used an app called ZoeLog. It’s very simple and is basically an online system for recording camera sheets. At any point anyone with access can view the camera database on their phone or iPad and see all of the information for a particular slate. It can also all be downloaded as a PDF or an XLSX and transferred straight into FileMaker.

“It removes the need to be constantly liaising with the camera team for data, and frees you and your wrangler up for more useful work. In these times when we’re attempting to reduce contact between crew, it’s a really useful tool. However, it does rely on the camera team agreeing to use it, and in my experience, they can be pretty old-school and a little reluctant to relinquish their notepads! (Meanwhile, I don’t think it is possible to overstate the importance of thermal underwear and a decent pair of boots.)”

Sean Stranks: “I’m really into drones, and so what I did a lot of the time with some of the larger sets on Raised by Wolves is I would just shoot from my drone photogrammetry of a much wider area in case we couldn’t do a proper LiDAR. I would orbit the drone around and capture as much photography as possible. The camera on my drone is very good so the images are quite high-res. You can then literally PhotoScan the stuff together and just build a scene and go, ‘Okay, I understand where everything is now.’

“We had a lot of small sets that would come up and disappear in a couple of days. Sometimes they would move things into place, shoot it, and then it would be torn down in a couple of days because they need the area for something else. So we would always try to PhotoScan those as well as we could.”
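The camera-sheet workflow Rowland describes – one record per slate, exportable into the VFX database – boils down to a small structured log. A minimal sketch of the idea in Python; the field names and file path are hypothetical illustrations, not ZoeLog’s or any production’s actual schema:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class SlateRecord:
    """One camera-sheet entry for a VFX slate, captured on the day."""
    slate: str
    lens_mm: float      # focal length
    stop: str           # T-stop, e.g. "T2.8"
    height_m: float     # lens height above the floor
    tilt_deg: float     # camera tilt
    focus_m: float      # focus distance
    notes: str = ""

record = SlateRecord(slate="107A-2", lens_mm=32.0, stop="T2.8",
                     height_m=1.6, tilt_deg=-3.0, focus_m=4.5,
                     notes="chrome ball + color chart shot after take 3")

# Append to a newline-delimited log that post can ingest later
with open("camera_sheets.jsonl", "a") as f:
    f.write(json.dumps(asdict(record)) + "\n")
```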

A filming location for An American Pickle. Many on-set VFX supervisors will attest to the importance of staying warm during long shoot days. (Image courtesy of Adam Rowland)



VFX TRENDS

AI, MACHINE AND DEEP LEARNING: FILLING TODAY’S NEED FOR SPEED AND ITERATION By TREVOR HOGG

TOP: A hybrid animation approach developed by Framestore and Weightshift combines ragdoll simulation, keyframe animation and machine learning, thereby allowing precise control and enabling animators to focus on storytelling. (Image courtesy of Framestore)
OPPOSITE TOP: The development partnership between Framestore and Weightshift helped to craft photoreal creatures, such as the mouse in His Dark Materials. (Image courtesy of HBO and Framestore)
OPPOSITE BOTTOM: One of the miniature sets created for the LAIKA Studios production of Missing Link. (Photo courtesy of LAIKA Studios and Intel)

In his 1950 paper Computing Machinery and Intelligence, English mathematician Alan Turing devised a series of questions to test the ability of machines to imitate human intelligence without being detected. The Turing Test was the inspiration for the opening sequence of the science fiction classic Blade Runner, in which a suspected replicant kills his human interrogator. The academic theory and plot conceit are becoming a reality as Google, Apple, Facebook and Amazon collect data on a massive scale to further their development of artificial intelligence, which is seen as the next technological revolution – one that will impact everything from daily life and medicine to manufacturing and even the creation of imagery. With each innovation, there is excitement about making what was once impossible achievable, along with concerns about the societal impact.

The visual effects industry views artificial intelligence, machine learning and deep learning as the means to streamline the creative and technical process to allow for quicker iterations for clients. Hired during the COVID-19 pandemic by Weta Digital to be their CTO, Joe Marks was previously the Vice President and Research Fellow at Disney Research and Executive Director of the Center for Machine Learning at Carnegie Mellon University. “AI is a toolkit that has been evolving since the late 1950s. A lot of the early work was on heuristic search, searching amongst different permutations and combinations. Initially, it was thought that playing chess would be hard and human-perception-related issues, like speech and language, were going to be easy. It turns out that was exactly wrong!

“For classical AI,” continues Marks, “it is sitting down with an expert and saying, ‘Explain to me your expertise and I’ll try to code that into the machine by hand.’ With machine learning it is, ‘Give me data samples and I’m going to learn patterns from it.’” Marks has a humanistic attitude towards the technology. “I find that AI and computers in general are the most amazing tools that we’ve built, and they amplify humans. That’s what gets me excited about generating new AI-enabled tools that enhance artists, storytellers and musicians.”




TOP: In order to make the casting of actors easier, Framestore has developed Face Designer, which can combine the facial features of different individuals in a more realistic manner. (Image courtesy of Framestore)
MIDDLE: Labeling data is standard practice for the visual effects industry, and is critical in improving the effectiveness of machine learning, such as in the case of creating digital characters. (Image courtesy of Framestore)
BOTTOM: There is still a long way to go before digital characters can replace or even resurrect their human counterparts, but the prospect raises ethical issues that need to be addressed. (Image courtesy of Framestore)
OPPOSITE TOP: Masquerade 2.0 is the latest version of the proprietary facial capture system developed by Digital Domain. (Image courtesy of Digital Domain)
OPPOSITE BOTTOM: Digital Domain is pushing the boundaries of photorealistic CG characters with DigiDoug. (Image courtesy of Digital Domain)

Framestore partnered with software developers Weightshift on a research project that is meant to significantly streamline the animation process by combining ragdoll simulation, keyframe animation and machine learning, allowing animators to focus on the artistry rather than dealing with repetitive and time-consuming tasks. Benefitting from these techniques were the photoreal creatures of Lady and the Tramp and His Dark Materials. “The current implementation of machine learning is more around, ‘How do we do things faster, better and more efficiently?’” notes Theo Jones, Visual Effects Supervisor at Framestore. “The visual effects industry, like other industries, is getting quite squeezed for budgets and schedules. In order to operate a high-end, large-scale Hollywood production, you need a large, robust pipeline and infrastructure to push that through, so there is a machine aspect to it. One of the things that brought me into the industry and has kept me here for so long and satisfied is the talent of the people I work with. We’re a long way off of machines replacing that talent. There are certainly more low-level tasks, the grunt work of getting through the shots, and work that machine learning is having quite an impact on.”

Frequent tasks such as greenscreen keys and rotoscoping can be mastered by machine learning, believes Johannes Saam, Senior Creative Technologist at Framestore. “Visual effects in general is in a unique position as we generate thousands of images a day that are already perfectly labeled because you usually know what you’re rendering. But for the longest time we didn’t have the computing power and the knowledge of machine learning, so that data was wasted. We can help the art direction with machine learning as we have algorithms and ways of supporting the style of certain people and creating versions.

“A tool that we have right now allows us to do AI casting,” continues Saam. “What that means is that we can take pictures of different famous or nonfamous people, feed them into a system, and mix and match them in a way that is smarter than putting them in Photoshop and blending them together. You can say, ‘Get me Trevor’s eyes and Johannes’ nose,’ and see what comes out of it. It’s almost like doing a police sketch. We can use that to hone in the casting. Even though those pictures aren’t yet animatable and fully rigged, and can’t replace extras yet, we can at least find our extras in a smarter way that is visual.”

LAIKA Studios entered into a partnership with Intel to harvest data collected from pre-production tasks that would train character-specific neural networks to scan frames and detect key points on a puppet’s face. “A lot of what we do in the Roto Paint Department is cleaning up the rigs and seams of the puppets,” states James Pina, Roto Paint Lead, Technical at LAIKA Studios. “What this allows us to do is to take away a lot of the repetitive part of the process and be more creative, and it gives us more time to do better work.”

Originally, it was thought that machine learning could provide general solutions for general problems. “We started feeding a bunch of images of puppets, but it wasn’t that useful as it wasn’t task-specific to what we were trying to do, which was to remove the seams on the puppets’ faces,” remarks Jeff Stringer, Director of Production Technology at LAIKA Studios. “It isn’t so much about quantity, but the quality of the data.”

There is scope and limitation to machine learning, says Stringer. “One of the things that we want to talk to Intel about next is ways to use sound files to drive a rig to get to the initial facial animation. It is important to give the machines the right tasks, like removing things from the frame. When you start to add things, you can get into some territory where you’re getting results that an artist wouldn’t do.”

Machine learning can still assist the creative process. “I would use AI to generate a set of random events and let the artists pick out what they want,” notes Narayan Sundararajan, Senior Principal Engineer & Director, Applied Machine Learning at Intel. “It is like augmenting the creative process versus replacing it, which is what you don’t want to do as that’s when you start creating the same thing over and over again.”

AI is seen not only as a technical tool but as an avenue for personalized storytelling, which is illustrated by Agence, an interactive experience created by Canadian Pietro Gagliano, Founder and Creative Director at Transitional Forms, and the National Film Board of Canada, where the user has the ability to observe and intervene with artificially intelligent creatures. “At the time we started this project,” Gagliano explains, “Unity had two factors that we were looking for. One was the ML-Agents toolkit so we could attach our reinforcement ideas to the game engine, and the other was Cinemachine, where we could build dynamic camera systems to judge what types of cuts could happen.”

The narrative was not meant to be restricted, Gagliano says. “The nature of Agence is to create as many dynamic systems as possible to get surprises. I believe there is a way to teach AI about the rhythm of storytelling and there is a way to treat storytelling as a technology that works for us humans. One of the things that we encountered quickly is that we lacked the data to support that initiative. We started into this idea of reinforcement learning because it’s synthetic data, so we were able to create simulations that created data again and again. That’s how we’re training the neural network to run these little creatures rather than the storytelling.”

TOP: The specific task that LAIKA Studios wanted to achieve with machine learning was the removal of seams on the faces of puppets. (Photo courtesy of LAIKA Studios and Intel)
MIDDLE: Unity was chosen by Pietro Gagliano to make Agence because of the ML-Agents toolkit that allows the attachment of reinforcement ideas to the game engine and Cinemachine. (Photo courtesy of National Film Board of Canada and Pietro Gagliano)
BOTTOM: The nature of Agence is to create as many dynamic systems as possible to get surprises. (Photo courtesy of National Film Board of Canada and Pietro Gagliano)

Autodesk is designing software that utilizes artificial intelligence to break down images into objects. “The AI innovations we’ve shipped in Flame since 2019 include a depth extraction tool to generate Z-depth for a scene automatically, and a face-normal map generator to recognize human faces and generate normal maps for them,” explains Will Harris, Flame Product Manager at Autodesk. “From there we built tools that produce alpha mattes to output various things, the first of which was a sky tool and later a face matte keyer for human heads. These alpha matte tools are specialized object-recognition keyers. With the face tool, we can produce high-contrast mattes to track anything from bags under eyes to lips or cheeks across a sequence to generate a matte. Most recently we added a salient keyer which is like a Swiss army knife for extracting mattes. This feature can be customized for any object – cars, traffic lights – and can even recognize detailed features of a hand. These tools can help artists get 80% of the way to fixing a shot, saving hours of tedious extraction and shot prep work along the way.”

AI and machine learning require intensive computation. “While local compute resources are critical, for some processes, having cloud compute resources brought to bear on specific machine-learning training challenges is necessary due to the processing requirements of generating these algorithms,” states Ben Fischler, Industry Strategy Manager at Autodesk. “AI and machine learning are already impacting rendering in areas like de-noising and image cleanup. We can expect more developments along these lines, particularly 2D/3D compositing processes for things like de-aging or frame interpolation for adding frames. There are techniques where you might need to add noise to frames – doing that during a render is demanding on turnaround times. However, if you could use AI to add noise to frames after the fact, like post noise, grain and texture, which are subtle effects that make images look photoreal, you could save a lot of time. There is also the possibility of using AI to render 12 frames per second and for figuring out and filling in the in-between frames; that frame interpolation concept has been demonstrated by NVIDIA.”
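The frame-interpolation idea Fischler mentions can be sketched in its simplest form with dense optical flow: estimate per-pixel motion between two rendered frames, then warp halfway along it to synthesize the in-between. A toy illustration using OpenCV’s Farneback flow – production interpolators such as NVIDIA’s are learned models, so this only shows the underlying concept:

```python
import cv2
import numpy as np

def midpoint_frame(frame_a, frame_b):
    """Synthesize a rough in-between frame by warping frame_a halfway
    along the optical flow from frame_a to frame_b."""
    gray_a = cv2.cvtColor(frame_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(frame_b, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(
        gray_a, gray_b, None, pyr_scale=0.5, levels=3, winsize=21,
        iterations=3, poly_n=5, poly_sigma=1.1, flags=0)

    h, w = gray_a.shape
    xs, ys = np.meshgrid(np.arange(w), np.arange(h))
    # Backward warp: sample frame_a half a flow vector "upstream."
    # (Approximating flow at the midpoint with flow at the same pixel
    # is fine for a demo, but causes artifacts at occlusions.)
    map_x = (xs - 0.5 * flow[..., 0]).astype(np.float32)
    map_y = (ys - 0.5 * flow[..., 1]).astype(np.float32)
    return cv2.remap(frame_a, map_x, map_y, cv2.INTER_LINEAR)
```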



“One of the biggest breakthroughs for us, in terms of speed, has been building gigantic, pre-trained models that can be guided by deep learning networks,” states Darren Hendler, Director, Digital Human Group for Digital Domain. “We train these on hundreds of thousands of images, so it can automatically refine data you want to improve. In practice, this might look like a job where you had to roto someone’s face. If your gigantic model has seen thousands of different people and objects and you give it a face to roto, it’s going to be able to learn to do exactly what you want much faster because of all the training that’s come before. This compounds over time, shrinking the time it takes to do a task until one day maybe it takes a couple of seconds.”

The next wave of machine learning will alter things for the better, Hendler believes. “What this will likely center on is augmenting or modifying a performance or something that already exists. Machine learning is exceptionally good at that. This could be turning a plate performance into a character, identifying objects in a scene, improving qualities of renders. However, it won’t be used for new creative performances that have no ties to the real world. Actors still need to drive those. Something else worth noting is that machine learning won’t be taking over the whole process, only parts of it.”

AI has not reached the point of being widely used. “Right now, AI and machine learning are mostly utilized for simpler projects and stunts, like face-swapping for a YouTube video,” remarks Nic Hatch, CEO at Ncam. “They are rarely the main drivers of traditional visual effects, although we see some exceptions from larger studios who have been designing custom tools. Overall, it’s easier to think of this as the beginning of a journey. Everything is still mainly research at the moment, but I anticipate we will start to see more AI-driven off-the-shelf products in the next three to five years, opening it up to studios of all sizes.”

As with the emergence and adoption of new technology, there are fears of the workforce being reduced. “I don’t see it eliminating jobs,” Hatch says. “The world has a huge appetite for media right now, so there are more artists and more work than ever before. I have seen no evidence to suggest this will lessen. As the use of AI expands, it may simply change the way certain jobs are done. Currently, artists can spend a great deal of time on simpler, more mundane tasks. That type of work can potentially be done by computers so humans can spend more time on what they’re good at – ideation and creativity. AI has the potential to change the industry in many ways, but perhaps one of the most important will be allowing creatives to focus on the parts of the job they love and thereby reigniting a passion for visual effects.”
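As a concrete (if heavily simplified) version of the face-roto scenario Hendler describes above, an off-the-shelf pre-trained segmentation network can already produce a starting matte for a person in a plate. A sketch using torchvision’s DeepLabV3 – a generic ‘person’ segmenter standing in for the far larger, fine-tuned models a studio would actually train:

```python
import torch
import torchvision
from torchvision import transforms
from PIL import Image

# Generic pre-trained model; a studio would fine-tune on its own
# perfectly labeled renders, as described in the article.
model = torchvision.models.segmentation.deeplabv3_resnet50(pretrained=True)
model.eval()

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

plate = Image.open("plate_0101.png").convert("RGB")  # placeholder path
batch = preprocess(plate).unsqueeze(0)

with torch.no_grad():
    logits = model(batch)["out"][0]        # (num_classes, H, W)

# Class 15 is 'person' in the Pascal VOC label set this model uses
matte = (logits.argmax(0) == 15).float()   # rough 0/1 roto matte per pixel
```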

TOP TO BOTTOM: A demonstration of the AI-powered facial segmentation capability of Flame, which in this case allows for the lighting to be altered. (Photos courtesy of Autodesk)



TV/STREAMING

LIGHTING NIGHT ON THE DARK SIDE OF THE MOON FOR ALT-HISTORY SERIES FOR ALL MANKIND By KEVIN H. MARTIN

Images courtesy of Apple.
TOP: U.S. moonbase Jamestown was developed beyond its original first-season depiction. The series postulates great technological advances beyond what actually took place in the post-Apollo years.
OPPOSITE TOP: The stark lighting and sparse palette for moon scenes mandated exacting standards for quality control with respect to color. Redd worked with a variety of vendors to ensure the look was locked in, allowing a smooth DI without surprises or a need to tweak the image further.
OPPOSITE BOTTOM: The lunar landing pad suggests a rough-and-ready equivalent to similar but much more advanced vistas in Kubrick’s 2001: A Space Odyssey, with the area located at a remove from the base proper.

With its premise of how history might have changed if Russia had beaten the U.S. in the space race to the Moon, For All Mankind, created by Star Trek scribe and Battlestar Galactica showrunner Ronald D. Moore, posits America taking the Avis rental car position of being number two: we try harder. An expansive and expensive commitment to manned spaceflight, far beyond the real-world Apollo program, develops, with more than a few hints that the Cold War will heat up the vacuum of space.

Season 2 of the Apple+ series offers a deeper dive into this alt-history, but still relies on behind-the-scenes talent carried over from the first season, including Moore, cinematographers Stephen McNutt and Ross Berryman, production designer Dan Bishop and Visual Effects Supervisor Jay Redd. For Redd, whose career started in the 1990s with Rhythm & Hues and continued with 18 more years of feature film work at Sony Pictures Imageworks, adjusting to TV production was something of a shock. “Co-supervising a single movie with Ken Ralston at Imageworks would be something on the order of a two-year process,” he states. “Here, it’s a nine-month cycle for 10 episodes per season, with each year requiring 1,600 to 1,900 shots.

“Since we’ve taken a leap of about a decade from Season 1, Moonbase Jamestown reflects considerable development from how it first looked,” Redd acknowledges. “We made a big deal in Season 1 about discovering ice on the Moon, so that suggests the potential for self-sufficiency rather than bringing everything up from Earth. And with the potential to create oxygen and hydrogen, that tied into a mention of using the Moon as a fueling station for trips to Mars, which kind of points up why, as a total space nerd from childhood, I really love working on this show. Plus I feel super-fortunate to work with like-minded people, ranging from VFX artists to the technical advisors, who help us with our desire to keep one foot – minimum! – in accurate tech.”

But Redd readily admits that balancing science and tech with an overriding need to deliver the requisite level of drama remains a prime concern. “That’s often an issue with TV and film sci-fi, so the struggle will always be a part of things going forward. And the thing is, envisioning this future isn’t just a straight-line process of saying, ‘after we get to the Moon, next up we have a moonbase.’ It’s also a matter of exploring how things may take a detour for one reason or another, which could either be the cause for more drama, or driven by dramatic dictates. The technological developments in this reality show a progression far beyond what happened in our world by that time. Should we incorporate, say, cellphones in this alternate version of 1983? The technology is advancing faster because more money was poured into space exploration than happened in reality, which means we get to show the various efforts that result in new rockets, space vehicles and communication methods. And it’s not just focusing on the U.S. and the military; there’s NASA and also the Russian space effort.”

TOP: Live action for the exterior moon sequences, which were much more expansive than Season 1’s work, was accomplished at a former Boeing warehouse in Long Beach, California.
MIDDLE: Visual Effects Supervisor Jay Redd drew on the experience of various vendors in realizing the many challenges on For All Mankind. A new look first seen in Season 2 is lunar night, which was achieved by Method Montreal, extending the no-source illumination achieved by the cinematographer during live action.
BOTTOM: A 100K light illuminated the moon set, conveying an appropriate single-source look for the surface. Most of these scenes required VFX set extensions to depict the lunar vastness. Set extensions and paintwork were used to transform stock NASA imagery for Earth scenes as well.

With the increased focus on lunar habitation came the need for production to build more sections of what Buzz Aldrin called ‘magnificent desolation’ in order to accommodate an increase in surface excursions. “We shot down at the old Boeing warehouses next to Long Beach Airport,” says Redd, “using this huge 100K light as a sun source. It’s similar to the First Man approach, but while they shot outdoors in a quarry, we wanted the control of a studio, which permits us to place cables wherever we want for the stuntwork as astronauts bound along in the low-g environment.”

Special Effects Supervisor Mark Byers was no stranger to such gags, with wire-rigs for low gravity a part of his work on the feature The Space Between Us and Netflix’s Space Force series. “Many years back, we’d have to use piano wire and super-thin cables because, if they showed up on film, painting them out was super-expensive,” he recalls. “Nowadays, the more easily they can see the rope or cable, the easier it is for VFX to remove it.”

For HBO’s From the Earth to the Moon, balloons were attached to the performers to aid their low-gravity movements on the Moon’s surface, but the limitations inherent with that process ruled out its use here. “The idea is a good one, but it just requires so much helium,” explains Byers. “Plus, the balloons take up so much space above the heads of each person that you can’t position them close together. So we worked with stunts to give that 1/6th gravity look, employing counterweights that let them keep up on their toes and take large strides. Setting up the track and cable rigging from above just requires knowing where they are going in frame. The huge vertical space was perfect for staging and shooting this kind of thing. There were other issues as well, like how the kick of a gun would impact you in low gravity, and whether you’d even see a muzzle flash. Once these aspects were addressed, we’d prepare enough of the gags so production could shoot multiple takes. This was mainly on indoor scenes, with VFX helping a lot when the action takes place outside of pressurized cabins.”

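As a back-of-the-envelope illustration of the counterweight approach Byers describes: on an idealized single-pulley rig, the counterweight cancels most of the performer’s weight, and simulating lunar gravity means leaving one-sixth of it. A simplified static model only – real rigs also contend with rope angle, friction and the dynamics of takeoff and landing:

```python
def lunar_counterweight(performer_kg: float) -> float:
    """Counterweight mass that leaves a performer with 1/6 of their
    normal static weight on an ideal single-pulley rig.

    Apparent weight = (m_performer - m_counterweight) * g.
    Setting that to m_performer * g / 6 gives m_cw = (5/6) * m_performer.
    """
    return performer_kg * 5.0 / 6.0

print(round(lunar_counterweight(80.0), 1))  # 80 kg performer -> 66.7 kg
```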
Production designer Dan Bishop collaborated with VFX when it came to blending between practical builds and post work. “The art department construction issues often revolved around – you guessed it! – a question of both budget and time,” admits Redd. “We’d use The Third Floor heavily for previs to figure out how handoffs could happen between his art department and VFX. The same was true for scenes where VFX had to bridge between shots of stunt performers and digi-doubles. This kind of open collaboration really paid off during those times when Ron Moore would come to us with a specific line of description from the script and ask what we could do to make that moment happen. If a director had been assigned at that point, we’d get together with him or her to do some quick boards, which was very useful when the characters enter and exit space vehicles, especially if they are turning. That way we can make sure the lighting matches before anything gets set up on stage.”

The issue of reflections on space helmet visors was also something of a push-me/pull-you. “A lot of the helmet visors – and there are both the kind with a gold shield that comes down and a transparent one – wound up being CG replacements,” reports Redd. “The amazing paint and roto work often added 3D extensions into visors on occasions when the practical reflection was used. Sometimes the paintout is just limited to the camera crane, since we’re all either dressed in black or behind a curtain. Almost all of our partner studios offer paintwork. We work with Method Studios, Union Visual Effects, Barnstorm VFX, Refuge, Crafty Apes and other small boutique houses. Since houses use different render engines, I have to be concerned about sharing assets, since the same rocket rendered in V-Ray at one facility may look different when rendered elsewhere in Arnold. So we try to minimize such sharing, and instead assign like sequences and types of work together.”

TOP: Spacesuited performers were fitted with overhead wire rigs that allowed them to bound along in a manner consistent with the 1/6th gravity environment.
MIDDLE AND BOTTOM: Special Effects Supervisor Mark Byers oversaw live-action dogfight work involving the use of a motion base to effect drastic aerial maneuvering. For the pilot’s ejection, a parallelogram rig was augmented by air movers to suggest his trajectory through the sky.




TOP: Accurate lighting reference for scenes set on the moon’s surface was integral when set extensions were called for, and also provided key data for reflection elements used for astronaut visor replacements.
MIDDLE: While inside pressurized environments like Moonbase Jamestown, traditional practical approaches to bullet hits could be used, but out on the surface, care had to be taken to maintain a scientifically accurate depiction of guns being fired in a vacuum, requiring coordination between SFX and VFX.
BOTTOM: A 100K light was used to simulate the harsh single-source light of the sun falling on the moon’s surface. While the lunar stage set expanded on what was built for Season 1, digital extensions still often were required.

The series still divides its time between the heavens above and the Earth below. “We provided various gags for Earth scenes,” says Byers, “which included flying a T-38 on a motion base for its dogfight maneuvers. There were some pretty drastic and severe moves to program, followed by an ejector-seat sequence. We put the performer up on a parallelogram as he ‘flew’ and used air movers for wind and turbulence. It was largely the kind of work we’ve done before, but for this series, production had their act together. I’ve been on shows where they ask for too much, because they aren’t considering – or just don’t know – the expense and effort to deliver a particular effect. We can wind up building elaborate rigs that they don’t end up having time to use. But on shows where they’ve done their homework and already answered their own questions, I can just go. Such a well-organized production allows me more time to get the right tools in place, which is the key to being efficient and staying on schedule. And since they’ve only got so much time to shoot, in some cases that made the difference, in terms of getting something that we might not otherwise have had time to get on film.”

VFX work on Earth scenes involved new live-action shooting as well as manipulation of stock footage. “We do a lot of paintouts for period [work] on our location shooting in Culver City, sometimes to make places look like Houston or Cape Canaveral,” Redd reveals. “There’s a lot of manipulation of archival imagery too, like swapping out different NASA logos on the Vehicle Assembly Building footage and altering building numbers for Johnson Space Center stock.” Redd wasn’t a stranger to dealing with rocket blast-offs from Florida, having recreated the launch of Apollo XI for Men in Black 3.

One new situation this season involved scenes set during the two-week-long night on the Moon’s surface. “We wanted to explore lunar night last year, and this season actually got to deal with that right at the start,” Redd states. “That gave me a great opportunity to work with DP Stephen McNutt in determining just how little light would be needed to convey a surface only illuminated by starlight. Essentially it was a no-source look, but with enough light that you could see things.”

The element that wasn’t viable for suspending the astronaut performers turned out to be very suitable for helping to light them in these scenes. “The solution came from giant helium-filled softboxes, which gave us a beautiful even light across the whole set,” enthuses Redd. “It gave us a tiny but very necessary diffused shadow, which let you discern shapes that wouldn’t register with flat lighting. From there, we struck a balance between what he needed to do on the day and how I would take that the rest of the way in post. I studied how much we could adjust the exposure upwards and downwards while we worked on the low end. If it was too bright, viewers would start wondering where this light is coming from, and if it is too dark, you’ve got nothing except the lights from spacesuits and lunar rover vehicles. I worked with Method Montreal to make sure when they extended the environment, it matched to what Stephen had done on set. Then, during final grading, we used windows and did some additional tweaking to make sure everything read.”

Redd’s views on colorspace amount to a philosophy. “Philosophically, I have very strong views on colorspace,” he admits. “It’s why I handhold our images all the way to the DI. I may get some flack over this, but I just don’t believe that colorists should be reworking our CG. Coloring should be thought of as an accent, like salt and pepper on top of a nice meal. If a colorist asks me for mattes, I will say no. I spend a ton of time with the artists and supervisors at all our partner studios to dial in a sequence’s look, and the compositor may have had 150 layers working in concert. You don’t ask a single violinist out of the whole orchestra to re-record his part and then dub that in. Instead, I may go back to production and ask if they want me to work with the vendor and tweak the work there. I’d rather spend the time in a DI suite doing things the right way. I have a good relationship with our colorist, so we get things nailed and locked before it can get into somebody else’s hands. The showrunners absolutely want me there to carry it across, maintaining image integrity while making it pretty.”

Production was halted last March owing to COVID, which, Redd notes, “left our two final shows – really big episodes – unshot. We were among the first productions to go back in at the Sony lot, but that was only after months of Zoom meetings to figure out how best to resume under COVID conditions, shooting for five very intense weeks in August and September. We were doing previs, techvis, plus a lot of pan-and-tile cameras, because with the new restrictions we couldn’t have many people on set. So often we were limited to just shooting the main cast in mission control on one pass. Then they’d leave and we’d bring in our extras. But when you look at it, there’s no sign of an effect, no giveaway that we did anything tricky. How did we bring it off? Incredible work from our ADs managing all this. And being told, ‘Three takes to get that, not eight.’ You just had to be focused, and I actually kind of liked how there was less chatter and milling about. And it was easy to get used to shorter 10-hour days. Now I don’t ever want to go back to working 14-hour days ever again!”

TOP: Live action was shot using a variety of methodologies that included handheld work to embellish the ‘you are there’ feeling of verisimilitude.
MIDDLE: Moon scene clean-up routinely required paintouts of the camera crane used to shoot live action.
BOTTOM: Action scenes for this alt-history series include elements that could have come from science-fiction paperback covers, such as the armed astronauts poised on the outside of a lunar lander as it maneuvers above a lunar facility.



TV/STREAMING

BLENDING TIMELESS SITCOM REALITY AND THE MARVEL UNIVERSE IN INNOVATIVE WANDAVISION By CHRIS McGOWAN

Images courtesy and copyright © 2021 Marvel Studios.
TOP: Marvel Cinematic Universe characters from Avengers: Age of Ultron, Wanda and Vision seek to fit in as a happy New Jersey suburban couple in this love letter to classic television sitcoms that is WandaVision. The era-specific sitcom atmospheres were painstakingly re-created with vintage lenses, lighting, color palettes and effects.
OPPOSITE TOP: Wanda Maximoff aka The Scarlet Witch (Elizabeth Olsen) and android Vision (Paul Bettany) celebrate marital bliss in Episode 1. The first two episodes were in black-and-white, and Episode 1 was filmed with a live studio audience.
OPPOSITE BOTTOM: Watching TV with their newborn twins in Episode 3.

WandaVision is arguably the most unusual Marvel Studios production to date. Not only does the series place Marvel Cinematic Universe (MCU) characters in the unfamiliar territory of TV sitcom land, it presents episodes with the precise look of particular decades in television history, ranging from the 1950s to the 2010s. The six-hour, nine-episode series references the sitcoms The Dick Van Dyke Show, Bewitched, The Brady Bunch, The Partridge Family, Roseanne, Malcolm in the Middle and Modern Family with camera lenses, aspect ratios, lighting, production design, color palettes and period FX. Modern VFX are present to the greatest degree in episodes that have more MCU-style scenes, some with full CG environments.

"It is true that we have a higher [VFX] shot count than Avengers: Endgame," comments Visual Effects Supervisor Tara DeMarco. "However, I would say the complexity of the work in Endgame is off the charts! Our show is significantly longer and the visual effects are integral to the story in later episodes."

To kick off the strangeness, the opening two episodes are in black-and-white (Episode 1 was also filmed in front of a studio audience) and conjure up the atmospheres of The Dick Van Dyke Show and Bewitched, except that the lead characters happen to be Wanda Maximoff – the reality-warping Scarlet Witch (Elizabeth Olsen) – and Vision (Paul Bettany), her hyper-intelligent android companion last seen perishing twice in the MCU. The sitcom banter and gags are familiar, but something is not quite right in suburbia in this series written by Jac Schaeffer – are we inside Wanda's mind or lost in some sort of parallel universe?

The show's innovative blending of sitcom reality with the Marvel universe attracted director Matt Shakman, who has deeply immersed himself in most TV genres. Formerly a child star in the sitcom Just the Ten of Us, Shakman, also the Artistic Director of the Geffen Playhouse theater in Los Angeles, has directed episodes across all genres, including Game of Thrones, It's Always Sunny in Philadelphia, Succession and The Boys.

"I am a huge fan of the MCU. I admire the storytelling and the risk-taking. From Iron Man to Guardians of the Galaxy to Thor: Ragnarok, Marvel Studios is constantly surprising," says Shakman. "When I heard the pitch about WandaVision, it blew my mind – the beauty of it, the ambition and the innovation. As a director, I work in comedy and drama and large-scale spectacle. With WandaVision, I get to do all those things at the same time." Shakman's duties ranged "from doing an episode in front of a live audience, which drew on my theater experience and my sitcom past, to orchestrating large MCU set pieces, which drew on skills I developed on shows like Game of Thrones and The Boys."


“It is true that we have a higher [VFX] shot count than Avengers: Endgame. However, I would say the complexity of the work in Endgame is off the charts! Our show is significantly longer and the visual effects are integral to the story in later episodes.” —Tara DeMarco, Visual Effects Supervisor

TOP: The series enters a new decade as shown by the changing colors mid-show, from the 1960s to the ’70s. MIDDLE: Wanda uses her magic powers to “escort” her guest out of the house. BOTTOM: Wanda and Vision celebrate their joyous relationship on a backyard swing set in the ’70s.

For Shakman, doing the homework for WandaVision included watching "episode after episode of the best TV comedies ever made."

Director of Photography Jess Hall adds, "Matt Shakman introduced me to specific shows he was interested in and then we refined these further. I created collections of still images from the original shows for each period that represented my intentions. Then Matt, [production designer] Mark Worthington and I looked at these regularly and talked about framing, composition, production design, color, etc. I also studied literature about the making of the more iconic series and looked at behind-the-scenes photographs to see what techniques they were using, particularly in relation to lighting." Hall used 47 different lenses across the production. "From day one, I began excavating the vaults at Panavision Woodland Hills and testing vintage lenses combined with period lighting techniques and instruments," he adds.

The production design also shifted through time. "Mark created the same house in every different era," Shakman remarks, "as if it were built in the '50s, and slowly renovated over time. We were fascinated by the idea of iterating everything through the ages – the house, the family car, the TV, the fireplace, the couch, the stairs. Even the products in the refrigerator are the same; they just change through the eras."

"A large part of each era's look is in camera," DeMarco explains. "We had highly stylized sets with era-specific palettes. We took all the goodness achieved in camera and then pushed them further in the DI. For example, we loved how the digital transfer from film to video in the '80s caused a red channel bloom in the picture, so we added the same style bloom to our shots in post."

Shakman adds that the filmmakers paid close attention to "making sure we got our period colors correct, and were also carefully charting a progression from episode to episode. We also used color as signifiers – red being an important one for both Wanda and Vision; its use was quite specific throughout the show."

"Each episode has its own hero palette and DI finishing style with variable grain, blooming and softness," says DeMarco. "We really made a push to be faithful to the era-correct VFX of each sitcom, which affected the styles of VFX for the very different looking episodes. Our MCU episodes have a more modern Marvel look to them. We purposefully gave them a cleaner, sharper look to contrast from the sitcoms."
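
For the technically curious, the '80s-era red channel bloom DeMarco mentions can be approximated by isolating the bright portion of the red channel, blurring it, and screening it back over itself. The sketch below is a hypothetical illustration in Python/NumPy – the threshold, radius and strength values are invented, not the show's actual DI recipe.

import numpy as np
from scipy.ndimage import gaussian_filter

def red_channel_bloom(img, threshold=0.7, radius=8.0, strength=0.6):
    """Approximate a film-to-video transfer bloom on the red channel.

    img: float RGB image in [0, 1]. Bright reds are isolated, blurred so
    they bleed outward, then screen-blended over the original channel.
    """
    red = img[..., 0]
    hot = np.clip(red - threshold, 0.0, None)   # keep only the hottest reds
    glow = gaussian_filter(hot, sigma=radius)   # let them halo outward
    out = img.copy()
    out[..., 0] = 1.0 - (1.0 - red) * (1.0 - strength * glow)  # screen blend
    return np.clip(out, 0.0, 1.0)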



For its Special Effects Supervisor, WandaVision turned to Dan Sudick, who served as Special Effects Supervisor on Avengers: Endgame and Avengers: Infinity War, has garnered 10 Academy Award nominations to date, and shared a VES Award for Outstanding Visual Effects in a Photoreal Feature for Avengers: Infinity War.

Recalls Shakman, "When I first met with Marvel's special effects maestro, Dan Sudick, I was a little bit sheepish. He's the best in the business at blowing things up and building giant spaceships, and here I was about to ask him to do a bunch of old-school wire and rod gags. But instead of him rolling his eyes, he was thrilled. Dan came up under the guys who did Bewitched and I Dream of Jeannie. He knew this stuff like the back of his hand and really loved revisiting the old techniques." And, Shakman notes, the charm of the practical effects "helped to differentiate the sitcom magic from MCU moments that sneak in."

Effects for the early episodes were mostly achieved in camera. "The floating objects [such as in Episodes 1 and 2] were achieved practically on set using puppeteers and sometimes complex wire rigs that competed for space in my overhead lighting rigs," Hall says. "We undercranked the camera and adjusted the shutter to achieve in-camera motion blur for Vision's accelerated actions. Whenever possible we shot locked-off plates for elements on the actual set and avoided greenscreen."

"VFX and SFX are BFF on a Marvel show," says DeMarco. "We worked together in the planning stages and discussed each gag at length. Practical effects would test the wire rigs and we would decide together the best methodology for combining multiple passes of wire-rigged plates, or a CG takeover of a mostly practical puppeteered effect. We have a healthy combination of 2D and CG in the early episodes. Wanda's kitchen was filled with both objects on wires and CG items."
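
The undercranking Hall describes is simple arithmetic: the exposure time per frame is the shutter angle divided by 360, over the capture frame rate, so shooting slower holds the shutter open longer per frame while the action, played back at 24 fps, appears correspondingly sped up. A small illustrative calculation follows – the numbers are generic, not the production's actual settings.

def shutter_time(fps, shutter_angle=180.0):
    """Exposure time per frame, in seconds, for a given rate and shutter angle."""
    return (shutter_angle / 360.0) / fps

normal = shutter_time(24.0)        # 180 degrees at 24 fps -> ~1/48 s of blur per frame
undercranked = shutter_time(12.0)  # at 12 fps the same angle holds ~1/24 s -> heavier blur
speed_up = 24.0 / 12.0             # played back at 24 fps, the action runs 2x faster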

TOP: Vision and Wanda’s relationship blooms in color. Great care was paid to each era’s color palette. BOTTOM: Vision admires the huge pile of hamburgers he has just grilled.


“When I heard the pitch about WandaVision, it blew my mind – the beauty of it, the ambition and the innovation. As a director, I work in comedy and drama and large-scale spectacle. With WandaVision, I get to do all those things at the same time. … [My duties range] from doing an episode in front of a live audience, which drew on my theater experience and my sitcom past, to orchestrating large MCU set pieces, which drew on skills I developed on shows like Game of Thrones and The Boys.” —Matt Shakman, Director

TOP: Vision, in his red android getup, and Wanda, with witchy fingers, are startled while in their dining room in Episode 3. MIDDLE: Vision rises up to confront Wanda about manipulating his memories in Episode 5. BOTTOM: In a black-and-white episode, Vision, in human form, and Wanda put on a magic show that is full of surprises. Visual effects and special effects worked hand-in-hand in the series, such as in the magic show.

Continues DeMarco, "This show had everything – 2D plates-based VFX, full CG environments, digi-double fight scenes, complex FX and a lead actor with a CG face. All the good stuff." Regarding Paul Bettany, she adds, "Paul wore a bald cap with special paint and special tracking markers for alignment. The paint gave us a sheen that we use in the final Vision look. The tracking markers helped us place the panels on his skin and the metal that wraps around the back of his head."

"I love working on a blend of practical effects and CG VFX," DeMarco acknowledges. "WandaVision has been very hard work but super fun."

The black-and-white episodes were shot in color, but finished in B&W. "The major challenge was translating Vision into B&W. His red makeup films quite dark on the sensor and he didn't look like the Vision we all know and love," says DeMarco. "We did a few days of camera tests with different shades of blue makeup and several DI tests to set a look for Vision that felt right for the 1950s/1960s. Actresses of the era would use blue lipstick instead of red, so we took a cue from them." Vision regains his normal hue in Episode 3, which has a heavy 1970s Brady Bunch influence.

"As far as the black-and-white finishing was concerned, I went to great lengths to create an authentic '50s look," says Hall. "The challenge was that this analog look had to ultimately exist within a 4K HDR platform, so with Josh Pines at Technicolor, I built a black-and-white LUT [Lookup Table] with specific features to control the HDR factor and the expanded dynamic range. To create the desired tonal range, the colors in the set and wardrobe design were critical. I tested these extensively and selectively used a color matrix in the final DI that mimicked the variable sensitivity of black-and-white emulsion to color."
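
Hall's "color matrix" amounts to weighting the RGB channels before collapsing them to a single monochrome value. The sketch below is a hypothetical illustration, not the show's actual LUT: the weights are invented to suggest an older, blue-biased emulsion (Rec. 709 luma would be roughly 0.21/0.72/0.07). A blue-heavy weighting renders reds dark – exactly the problem with Vision's red makeup that DeMarco describes.

import numpy as np

def bw_with_emulsion_weights(img, weights=(0.20, 0.30, 0.50)):
    """Collapse an RGB image to monochrome using custom channel weights."""
    w = np.asarray(weights, dtype=np.float64)
    w = w / w.sum()                          # normalize so mid-gray stays put
    mono = img[..., :3] @ w                  # weighted sum per pixel
    return np.repeat(mono[..., None], 3, axis=-1)  # back to 3-channel gray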



"One of the key techniques I relied on to control the visual integrity in color of each episode," Hall explains, "was to build color palettes for each period based on data I extracted from reference stills. I used the specific RGB values of colors that appeared regularly in the period reference to create 20 or 30 colors that formed a color palette for my lighting, but also for the art department and costume designer. I knew restricting the color palette was the best way to take control of the image and to give it period integrity, as it was a process I developed on Ghost in the Shell."

Hall continues, "There was certainly a final layer of refinement to the image in the Digital Intermediate where we used the tools of Resolve in addition to color work to intensify certain characteristics like softness, highlight bloom or grain structure. However, this was always about enhancing what we already had, and we never had to compete against what had been photographed because – [with] color science, lensing, lighting through design – the look for each era had been conceived and executed predominantly in camera.

"The 1970s was an interesting decade, because I was targeting an early color film look which is hard to achieve digitally," adds Hall. "I worked meticulously with Mark Worthington and [costume designer] Mayes Rubeo to find the right balance of color in the frame and to accent this using complementary colors. The Episode 3 LUT stretched the color separation even further, and then in the DI we used Resolve to push certain colors a little further or shift their hue to a more pastel and secondary realm."

Hall worked with Technicolor's Pines on the color science to develop LUTs for each period. "We had 23 different LUTs in total," Hall says. "During my first camera test, I became aware of the expanded color gamut available when mastering in 4K HDR – we were the first Marvel project to establish this as part of our principal photography workflow."

"We allowed ourselves to use the full range of modern VFX tools for the MCU episodes," says DeMarco. "We were very restrained in VFX for some of the sitcom episodes, so the modern VFX are part of the cue to the viewer that we're not in sitcom world anymore." For the transition to the MCU reality, "quite a lot of that is done with sound, edit pacing and subtle changes in color in the DI," notes DeMarco. "We consciously move in and out of aspect ratios in the episodes that mix MCU and sitcom. The sitcoms are always 4:3 or 16:9. When we move to a cinematic 2.39 picture, it's that cue to the viewer. We planned extensively for the moments where we knew we would break from traditional sitcom filming style. All of our MCU scenes were blocked with action in mind and were filmed on their own set of lenses. Much of our 'MCU' filming was on location and outside. Moving away from the sets gave our shots a sense of scale along with the action beats."

Shakman concludes, "We were inspired by many different classic shows for each era, but ultimately we wanted to create WandaVision – something our own, something original, that felt thoroughly of the time. Tone is a mysterious thing – it can be really difficult to nail. It requires lots of experimentation and playfulness. Keeping things grounded is key. And having a strong story to pull you through. For all our stylistic experimentation, this is a love story between two amazing and very different characters. That's what holds it all together."
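
Returning to Hall's palette workflow: pulling 20 or 30 representative RGB values out of a reference still is a standard color-quantization task. The sketch below is a generic, hypothetical version using Pillow's default median-cut quantizer – not Hall's actual tool – and returns the most frequent colors first. The filename is a placeholder.

from PIL import Image

def extract_palette(path, n_colors=24):
    """Return the n_colors most common RGB values in an image, most frequent first."""
    img = Image.open(path).convert("RGB")
    img.thumbnail((256, 256))  # downsample for speed; dominant colors barely change
    quantized = img.quantize(colors=n_colors)  # median-cut is the default for RGB
    flat = quantized.getpalette()
    counts = sorted(quantized.getcolors(), reverse=True)  # (pixel_count, palette_index)
    return [tuple(flat[i * 3 : i * 3 + 3]) for _, i in counts]

# e.g. palette = extract_palette("reference_still.jpg", 24)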

TOP: Old-school practical effects were called upon to make pots and cookery levitate in Wanda’s kitchen. MIDDLE: Vision assumes his human suburban husband form in a black-and-white scene. BOTTOM: Wanda dons her throwback Marvel superhero costume for Halloween in Episode 5.



PROFILE

GUILLAUME ROCHERON: OSCAR-WINNER PURSUES HIS PASSION PERFECTING IMAGERY FOR STORYTELLING By TREVOR HOGG

TOP: Rocheron stands with the Academy Award that was given to him in 2013 for his contributions on Life of Pi. (Image courtesy of Guillaume Rocheron and MPC)

Currently based in Los Angeles and enjoying being a father to his year-and-a-half-old daughter, Guillaume Rocheron was born and raised in Paris, and has gone from making personal CG short films to being rewarded with Oscars for Life of Pi and 1917. "I wasn't particularly good at drawing but always had a passion for photography, and learned early on how to use and understand a camera," shares Rocheron. "I was developing a video game that I had written and worked with a few artists to help me conceptualize it. One of them introduced me to computer graphics. I was amazed by the possibilities that were way beyond what I could draw and photograph."

A fascination with creating moving images that were lit like photography led Rocheron to produce a series of short films on his personal computer at home. "I was using 3D Studio 4 on DOS. My computer wasn't powerful enough to install Windows! A lot of time was spent flipping through the 500-page software manual as well as conducting trial and error experiments. One short film revolved around a group of ants trying to climb onto a sofa to watch their favorite TV show."

The short won a prize in the independent category at Imagina and caught the attention of an acclaimed Paris-based visual effects company. "I was 17 years old at the time and BUF got in touch with me. I remember going on the BUF website and discovering that they did the gunshot sequence in Fight Club. That was when I realized what I was doing could be applied to movies."

While attending École Georges Méliès, Rocheron educated himself about classical art. He was also doing freelance work for BUF, starting off as an uncredited modeler on Panic Room and becoming a digital artist on The Matrix Reloaded and Batman Begins. "One of the biggest things that I learned from film photography and my early days at BUF is how similar photography and computer graphics are in concept. Both give you controls like shutter speed and aperture on a camera, or lines of code in a shader, and it is down to you to understand how they apply to making a picture. BUF used its own software at the time and there was no UI. Compositing and shading were scripted, which forced you to understand how an image or a shot was constructed without the real-time feedback you now have in modern software. I learned so much from the way BUF's founder, Pierre Buffin, designed their approach to visual effects and from some great people there such as Stephane Ceretti and Dominique Vidal."

At 25, Rocheron left Paris to work in London at MPC. "It was a thrill to work abroad, but I didn't speak great English and the working practices were very different in big studios. Nicolas Aithadi was a CG supervisor at the time and he took me under his wing. At BUF I was a generalist whereas at MPC people were specialized, which was a big adjustment for me. I had always worked executing complete shots rather than having a specialization. MPC supported me greatly by giving me the opportunity to work across many different disciplines over the years, from effects TD to lighting TD, 3D matte painter, CG supervisor, and finally VFX supervisor. I learned how to handle the bigger projects, the longer process, bigger complexity and higher volume. It was a less intuitive way of working, and much more calculated and planned."



A facility visual effects supervisor focuses on sequences, while the production counterpart deals with the overall aesthetic of the movie. "It is helpful to come from a facility because you know how a facility works," notes Rocheron, who served as a production visual effects supervisor on Godzilla: King of the Monsters and 1917, and co-supervised Ghost in the Shell with John Dykstra (Star Wars). "You know the technical language, workflow, and how the shot is being created quite precisely. On the production side, you have to learn how to collaborate with the production designer, DP and director, and translate their ideas into technical and artistic direction for the VFX teams. The challenges are widely different on every film, which is frankly why I love it. You try to find a common language to help the director bring his vision to the screen. It's the people you work with who make this job the most enjoyable."

TOP AND BOTTOM: Rocheron gets a second Academy Award in 2020 for his contributions in creating the illusion that 1917 was captured as a continuous single shot, along with Special Effects Supervisor Dominic Tuohy, left, and MPC Visual Effects Supervisor Greg Butler. (Image courtesy of Guillaume Rocheron and MPC) The visual effects team of Rocheron, Bill Westenhofer, Donald Elliott and Erik De Boer celebrate winning an Oscar for their work on Life of Pi. (Image courtesy of Guillaume Rocheron and MPC)




TOP: Rocheron and crew with the 80-camera video rig created for the solograms produced for Ghost in the Shell. (Photo courtesy of Guillaume Rocheron) BOTTOM: One of the biggest technical challenges that Rocheron had to overcome was the ability to art-direct the ocean in Life of Pi. (Image courtesy of 20th Century Fox and MPC)

Movies that have had a lasting impact on Rocheron are Jurassic Park, Terminator 2: Judgment Day and Star Wars. "The first time that I saw the dinosaurs in Jurassic Park and the trench run in Star Wars, it completely blew my mind. I didn't know enough about how films were made to understand how it was all possible, but I knew there was something that I found to be incredibly thrilling about the prospect of making them."

For the past decade, blockbusters have dominated the production slates of studios. "Now, between movies and long-form content, there is so much demand for volume that we are seeing a lot of innovation happening to facilitate collaboration, rapid prototyping and iteration with filmmakers," observes Rocheron. "Visual effects are now an integral part of the filmmaking process, but there are still challenges to make it more approachable for directors and other heads of departments. It can sometimes be a slow and counter-intuitive process that doesn't always quite keep up with how organic the editing process can be. Real-time tools certainly help because you can put things in front of the director and studio at greater speed and more interactively, and hopefully help solidify scenes early to give enough time for the final effects to be finished at the highest possible quality. A lot of the directors I have been talking to recently are getting increasingly interested in how to best work with visual effects in order to explore new storytelling avenues."

A constant goal for filmmakers and the visual effects industry is photorealism. "Photorealism is whatever the reality of the movie is," believes Rocheron. "Pre-production with a DP, production designer and director is where you define your language. A movie is an illusion. Sets get made. Actors get lit. Everything that you think is real is fabricated."



"I spend a lot of time in pre-production trying to understand that language with the other heads of department," Rocheron continues. "If you take Ghost in the Shell, the cinematic language is driven by the drawn manga aesthetic. Translating it into live-action required a level of interpretation in order to create a fantastical world. Whereas 1917 had to be photoreal and invisible because of how grounded in reality the movie had to feel. I find that working on different types of films is incredibly interesting because your definition of photorealistic effects becomes an artistic choice, based on the rules of each film."

TOP AND BOTTOM: Rocheron was a Co-Visual Effects Supervisor with John Dykstra on the live-action adaptation of Ghost in the Shell. (Images courtesy of Paramount Pictures and MPC)




TOP: Rocheron provided additional visual effects supervisor support for the Heart of Darkness-inspired Ad Astra. (Image courtesy of 20th Century Fox and MPC) MIDDLE: The EnviroCam was created to assist with making the transition between digital and live-action characters and environments for the takeoff and landing sequences in Man of Steel. (Image courtesy of Warner Bros Pictures and MPC) BOTTOM: Rocheron has learned a lot from collaborating with Visual Effects Supervisor John 'D.J.' Des Jardin on films such as Sucker Punch. (Image courtesy of Warner Bros Pictures and MPC)

Rocheron is attracted to projects that offer opportunities to collaborate and contribute to the storytelling. "You don't do movies for the technique but because you have a specific experience and mindset that will help you work with a group of people. At the very beginning of a project, I love not knowing yet how to achieve some of the work and working with my team to figure it out. It took two years of R&D to simulate the ocean in Life of Pi because we had to come up with a process that would allow Ang Lee to design the oceans like an animated character while ultimately being able to convert it into photoreal moving water. The breakthrough was that we would animate the ocean surface and then drive a thin layer of voxels on top of the surface. Ang would talk about the mood of the waves and how aggressive they should be. It was not just about doing what it said on the page. It was also about crafting his artistic vision."

Rocheron has also worked closely with Visual Effects Supervisor John 'D.J.' Des Jardin, a frequent collaborator of filmmaker Zack Snyder; Des Jardin has left a lasting impression on him. "The first movie that [Des Jardin and I] did together was Sucker Punch followed by Man of Steel and Batman v Superman: Dawn of Justice. D.J. was the one who taught me that you have to be bold and not let the technical process put fear in the middle of what you're trying to achieve creatively. On Man of Steel, we had to find a way to transition from live-action sets and characters to fully digital shots and back again. In the 'Smallville Battle' scene, some shots had to transition multiple times over to allow for uninterrupted action.

"To capture environments in their shot-specific lighting, we created a process we dubbed 'EnviroCam,'" continues Rocheron. "We used a high-speed, programmable and precise motorized camera head called the Roundshot, from a Swiss company called Seitz, that allowed us to capture our environments at very high resolution and very quickly. The Roundshot domes were then used along with some custom tools in Nuke to be aligned to the scene geometry and be color calibrated to the film plates in order to create seamless 2.5D transitions. It is cool to now see Roundshots used on a lot of visual effects sets as it proved to be an incredibly versatile and reliable piece of equipment."
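
Rocheron doesn't detail the calibration math, but matching a captured dome to a film plate is often posed as a simple per-channel fit. Purely as a hypothetical illustration – this is not MPC's actual Nuke tooling – a least-squares gain/offset solve over aligned, same-sized crops might look like this:

import numpy as np

def match_to_plate(env, plate):
    """Fit a per-channel gain/offset so env's colors best match the plate.

    env, plate: float RGB arrays of identical shape, already spatially aligned.
    Solves y = gain * x + offset per channel by linear least squares.
    """
    out = np.empty_like(env)
    for c in range(3):
        x = env[..., c].ravel()
        y = plate[..., c].ravel()
        A = np.stack([x, np.ones_like(x)], axis=1)
        (gain, offset), *_ = np.linalg.lstsq(A, y, rcond=None)
        out[..., c] = gain * env[..., c] + offset
    return out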



Presenting a unique challenge both technically and artistically was the one continuous shot premise of a World War I drama helmed by Sam Mendes. "For 1917, the question was, 'How do you manage a shot that is two hours long, make it workable in terms of iterations and deal with the ripple effects of each change?'" explains Rocheron. "For example, a tree or a crater added at a certain point could show up again one minute later and still had to work compositionally. Not having cuts meant we couldn't rely on our usual bag of tricks. We had a specialized 'Stitch Team' that became the masters at seamlessly connecting takes together, mixing CG elements, 2.5D projections and incredibly elaborate comp tricks, etc. The editing, effects and camera work were all designed to not draw attention, so we did everything we could to make the effects as unnoticeable as possible and to not distract from the immersion.

"Sam would often say, 'I don't want this to feel like a visual effects shot,'" emphasizes Rocheron. "It was so delicate as a movie. There are visual effects in 90% of it, in 4K IMAX. Instead of reviewing a shot that is four seconds long, which is the average duration of a visual effects shot, we had to deal with shots that were three or four minutes long. Life of Pi was technically a complicated movie because the R&D was massive, whereas 1917 was about finding innovative methodologies to handle the specificity of a one-shot movie."

No two projects are the same for Rocheron. "I love how filmmaking is about teamwork. Each director brings their unique vision and perspective that constantly generates new challenges for me and my team. Growing up in France, I never imagined that I would be working on Hollywood films, let alone winning Academy Awards. I consider myself so lucky that I get to make images for a living!"

TOP: The most difficult aspect of 1917, according to Rocheron, was not being able to break down the shots into the usual 10-second segments, and the ripple impact that change had on every subsequent shot. (Image courtesy of Universal Pictures and MPC) MIDDLE: An iconic franchise that Rocheron took part in relaunching was Godzilla. (Image courtesy of Warner Bros Pictures and MPC) BOTTOM: Rocheron went from being a facility Visual Effects Supervisor for MPC on Godzilla to a Production Visual Effects Supervisor for Godzilla: King of the Monsters. (Image courtesy of Warner Bros Pictures and MPC)



FILM

VFX IS THE GLUE THAT SEALS THE ‘BOND’ IN NO TIME TO DIE By TREVOR HOGG

Images courtesy of EON Productions, United Artists Releasing and Metro Goldwyn Mayer Studios. TOP: Framestore digitally enhanced a scene when Safin (Rami Malek) fires into a frozen lake with an automatic rifle at a figure swimming underneath. OPPOSITE TOP: Director Cary Joji Fukunaga, Daniel Craig and Lashana Lynch (Nomi) on the 007 Stage at Pinewood Studios.

Navigating through a series of personal and professional obstacles to prevent a worldwide crisis is a staple of the James Bond franchise, but this time around life was imitating art when making No Time to Die. Originally, Danny Boyle (Slumdog Millionaire) was supposed to direct the 25th installment, but while in pre-production he was replaced by Cary Joji Fukunaga (Jane Eyre) along with an entirely new script; the 007 Stage at Pinewood Studios in England was accidentally blown up by a "controlled explosion"; Bond star Daniel Craig suffered an on-set injury; and the April 2020 release was delayed to October 2021 because of the impact of the pandemic. The fifth and final appearance of Craig as the lethal, martini-loving MI6 operative has him brought out of retirement by the CIA to rescue a scientist from a megalomaniac with a technology that threatens the existence of humanity.

Fukunaga is no stranger to adversity, having battled malaria and lost his camera operator to injury during the first day of shooting Beasts of No Nation. "The walls of the stage came down, which was a bummer because we needed that stage space and Pinewood was completely booked. Quite often, when we finished shooting a set – that was it. There was no going back two months later for a pickup shot. That created a major, complex logistical puzzle for the rest of our shoot. Daniel getting injured meant that we had to shuffle around again when we were already in this complex Twister game."



There was no additional time added to the pre-production schedule despite the directorial changeover. "We started all over from page one," Fukunaga elaborates. "What that meant was writing took place all the way through production. I did something similar on Maniac. We knew our ins and outs, but didn't always have the inside of it completely fleshed out. The communication involved with having five units going at the same time and keeping everyone in continuity was a major task."

Previs was critical in filling in the gaps of the narrative. "There were several sequences that would get rewritten in the editing," explains co-editor Tom Cross (Whiplash). "Sometimes they would be shooting some parts at a certain time and weeks or months later shoot other parts of it. In a way, that gave us time to work with previs and postvis and try to mock up a template for how these scenes could be completed." Digital augmentation was an important element in achieving visual cohesiveness. "You know that a lot of film was going to be shot from all of these different locations at various times, and visual effects was one of the glues that held it all together," Cross says.

A sound mix was created for the previs by Supervising Sound Editor Oliver Tarney (The Martian). "There were certain things, like when the guy walks up to the car and shoots at Bond, that were to me as much an audio experience as they were a visual experience," states Fukunaga. "I wanted the audience to be inside that car, to hear what it is like to have bullets hitting you from all sides."

"Cary is from the school of Christopher Nolan where he went into Bond wanting to get as much practical as possible," remarks co-editor Elliot Graham (Steve Jobs). "Because of the schedule and Daniel's injury, we had to rely more on visual effects."

A total of 1,483 visual effects shots are in the final cut. "It took so many man hours alone just to review shots that it would have been too much to do by myself," admits Fukunaga. "It required an entire team whose curatorial eye had to be sharp to determine whether a shot was worthy to be brought to the review stage. Some shots were going to be entirely visual effects because those things you can't do in real life. I found it actually quite liberating because my first films didn't have much money for visual effects, so I was hesitant to rely on them for anything. I wanted the suspension of disbelief to be bulletproof, which meant I tried to do as much in camera as possible. As budgets have increased and allowed for some of the best craftspeople working on the visual effects, I could relax control and trust that we were going to get spectacular shots."

Unpredictable weather had an impact on the production, particularly on a scene that takes place in the Scottish Highlands. "It was supposed to be a blue-sky chase, which it was for much of the main unit shoot," recalls Graham.


TOP: The greenscreen was subsequently turned into a high-rise building by Framestore. MIDDLE: Fires and damage were digitally added by ILM for the action sequence that takes place in Matera, Italy. BOTTOM: Framestore was responsible for the winter environment and frozen lake where Safin hunts his human adversaries.

"However, while we were there with the second unit it rained the entire time and the roads were covered in mud. These were car-flipping stunts, so you have to be careful because people's lives are much more important. They did the stunt once. We waited four hours, gave up, did it in the rain, and hoped that we could paint out the rain the best we could.

"We really needed a third time," Graham continues, "because we didn't get the shot where Bond's car knocks into this other car. It was just raining too damn much, and by the time it cleared up a few days later we had to move on because of the schedule. That left us in the position of me asking for a plate of Bond's car pretending to knock into another car. Visual effects could add the second car in later. Those are the practical realities of filmmaking."

A veteran of the James Bond franchise is Special Effects Supervisor Chris Corbould (Inception). "The great thing was having the classic DB5 back and not just in a cameo role but in full combat mode," Corbould says. "Combining that with the wonderful town and tower in Matera, Italy was masterful. My department was responsible for putting together all of the gadgets on the cars, making the pod cars where you have a stunt driver on the roof, as well as all of the explosions and bullet effects." The roads were like polished stone, Corbould adds. "We had to put on special tires so that we didn't skid around too much. Only one road goes through Matera and thankfully we were able to shut it down on many occasions."

LED stages were utilized for close-up shots of actors during the car chases. "We built LED stages very much like we did on First Man," explains Cinematographer Linus Sandgren (La La Land). "This time 2.8mm pitch LED panels connected making a 270-degree, 20-foot-radius cylinder surrounding a car, and a ceiling with LEDs as well. At one point, we used a squarish cube setup around some vehicles. For these scenes we shot plates with an array of nine cameras on a sporty vehicle and brought this footage to the stage. This way we could both shoot and light the scene with footage from the actual environment and just an added 5K Tungsten Par on a hydroscope crane working as a sun."

A special gimbal was produced for the interior of a sinking boat with passengers trapped inside. "It was like a rotisserie that could revolve around and then drop down on one end and completely submerge into a 20-foot-deep tank of water with one of the main actors inside," remarks Corbould. "It was 50 feet long and could roll around 360 degrees. When we started, the engine was on the bottom of the boat, then as it started sinking the engine revolves around, and add to that copious amounts of compressed air as if air pockets are escaping. It makes for an exciting sequence."

Then there was a matter of a seaplane. "A seaplane is driving along the ground trying to get enough speed to take off while being chased by gunboats," describes Corbould. "For that we made a plane with an engine driving its wheels with no propeller on, so we could drive up and down to our heart's content with a driver underneath." A high-flying Antonov was, in reality, grounded. "We had to put a mechanism that made it look like the glider was being pulled out from the back of the plane to allow it to drop through the air," Corbould reveals.



"It was easier to do on the ground and safer, especially when you've got your main actors in there. In these days of digital effects, you can easily put bluescreen or greenscreen for the sky outside of the back of a transport plane."

Overseeing the visual effects were Supervisor Charlie Noble (Jason Bourne), Co-Supervisor Jonathan Fawkner (Guardians of the Galaxy) and Producer Mara Bryan (Dark City), with the work divided among Framestore, ILM, DNEG and Cinesite. Additionally, in-house team The Post Office did a few re-speeds that required cleanup.

"There were four prongs of attack to the action sequences, storyboarded by Steve Forrest-Smith and his team," explains Noble. "2D animatics were then produced for certain sections using recced location stills/video and animated line drawings by Adrian Spanna at Monkeyshine. Subsections of these were then previs'd by Pawl Fulker at Proof using initial location LiDAR scans and previs or art department models. The stunt department also shot rehearsals of specific action beats. This gave Cary the ability to increase the level of detail as required for shot planning/techvis purposes to disseminate his vision."

Principal photography on the ice lake, when the disfigured antagonist Safin (Rami Malek) fires an automatic rifle at someone swimming below him, was complicated by the natural elements. "The opening scene in Norway on the ice was shot under varying lighting and weather conditions," states Noble. "We needed to add snow to the trees and make the ice itself clearer with small dustings of drifted snow. One shot from the sequence was selected as the hero look for the sequence. Framestore built and lit a CG environment based on that. This then gave us a guide to aim at for each angle, retaining as much of the original photography as possible while laying in our snowy trees, atmosphere and ice surface."

Encapsulating the variety of visual effects created for No Time to Die is the Norway safehouse escape that goes across moors, through a river and into the woods. "Largely in-camera with a lot of cleanup of tire tracks, stunt ramps and some repositioning of chase vehicles for continuity, which involved removing in-camera action, replacing terrain and adding CG vehicles, bikes, and any dirt or water sprays as required, but leaving the majority of the frame untouched."

Most of the sets and locations required some degree of digital augmentation. "We took a dedicated LiDAR scanning and plate/photo texturing team with us wherever we went and sent them to places where we would need to capture environments to integrate into principal photography," states Noble. "For example, for the Cuba street scene we had a huge set built on the Pinewood backlot, which required buildings to be topped up and streets extended. Art Director Andrew Bennett (Skyfall) provided us with an excellent layout for what should go where, and we sent our team out to Havana and Santiago de Cuba to scan and texture the desired building styles for street extensions. Building top-up plans were provided by Andrew and realized using set textures and extrapolated set surveys. At the end of the scene, Bond and Valdo Obruchev (David Dencik) escape using Nomi's (Lashana Lynch) seaplane, moored in the port of Santiago. The foreground was shot at a small wharf in Port Antonio, Jamaica, the mid-ground port cranes were constructed from plates and surveys of Kingston port, and the background skyline came from our Cuba stills."

TOP TWO: A greyscale model of the glider containing Bond and Nomi along with the final version created by DNEG. BOTTOM TWO: ILM handled a signature action sequence involving Bond spinning a DB5 around while firing built-in machine guns.




TOP TWO: A set extension of the submarine base was created by DNEG to replace the bluescreen. BOTTOM TWO: Greyscale modeling and final shot of a boat sinking into a burning and watery grave courtesy of DNEG.

Both greenscreen and bluescreen were deployed depending on the scenario. "Exterior work is always a bit trickier with the wind to contend with," notes Noble. "We often used telehandlers carrying 20-foot x 20-foot screens outside, either lined up to make a wall or to top up existing fixed screens [on the backlot, for example]. It was really handy to be able to dress them in shot-by-shot as opposed to building acres of green for every eventuality. They came into their own on the Cuba streets set – we had six on the perimeter of the set, and they maneuvered themselves into place to give us as much coverage as possible on top of the 30-foot-high fixed screens that we had surrounding the build, which had some complex elements to contend with: overhead power cables, telegraph poles, wet downs, vines, and explosions towards the end of the scene."

A broad range of simulations needed to be integrated into the live-action photography. "Huge dust/concrete explosions to match real SFX ones, fireballs roiling down corridors, fire, smoke, mist, foliage, masonry hits, crowd, water, ocean, bubbles underwater, shattering glass, clothing and hair," remarks Noble. "For the most part we were matching to in-camera special effects, surrounding shots, and other takes where we were needing to slip timings or separate reference elements.

"The trawler sinking required some complex simulations that all had to interact with one another," Noble comments. "While patches of oil burn on the water surface, pockets of air are released from the submerging hull along with foam, bubbles, floating detritus, and sea spray from all the activity. Other simulations needed were volumetric passes for light scattering under the water, in particular for the boat's lights, as well as interactive simulations on the trawler's nets as it submerges and all the props such as crates, buoys, life raft, and more oil spewing from the damaged engine room."

Digital doubles were used sparingly because of the success of the stunt department led by Lee Morrison (The Rhythm Section). "Digital doubles were mainly utilized to massage stunt performer body shapes and posture to match surrounding shots on the cast," reveals Noble. "Face replacements were also required for the principal cast at various points for either scheduling or safety reasons. Some limbs were replaced where actors were required to wear stunt padding over bare skin. A good example would be the fight scene in the Cuba bar where Paloma (Ana de Armas) takes out a few goons with high kicks and flying kicks. Digital limbs and stilettos were used to replace padding and trainers."

The usual adversaries made their appearance in post-production. "The challenges of volume and time were greatly eased by bringing in Jonathan Fawkner as Co-Supervisor and we divided the vendors between us," says Noble. "My profound thanks to Mara Bryan, Jonathan Fawkner and Framestore, Mark Bakowski at ILM, Joel Green at DNEG and Salvador Zalvidea at Cinesite for their lovely work."





PROFILE

JENNIFER BELL: LEADING A DOMINANT ERA OF VISUAL EFFECTS AT UNIVERSAL By TREVOR HOGG

Images courtesy of Universal Studios except where noted. TOP: Jennifer Bell, Executive Vice President, Visual Effects, Universal Pictures. OPPOSITE TOP: With long-time collaborator Chris Cram on the set of Jarhead in El Centro, California in 2002. (Photo courtesy of Jennifer Bell) OPPOSITE BOTTOM: Filming VFX plates of pristine dunes for The Mummy in the Merzouga Desert, Morocco in 1998. (Photo courtesy of Jennifer Bell)

One cannot help but be impressed by the résumé of Jennifer Bell, Executive Vice President, Visual Effects for Universal Pictures, which includes collaborations with Sam Mendes (Jarhead), Michael Mann (Miami Vice), Alfonso Cuarón (Children of Men), Guillermo del Toro (Hellboy II: The Golden Army), Baltasar Kormákur (2 Guns), Ridley Scott (Robin Hood), Justin Lin (Fast & Furious) and Paul Greengrass (The Bourne Ultimatum).

Born in Fresno, California, Bell was raised in Charlotte, North Carolina, where her father worked as a radio disc jockey and program director and her mother as a teacher. "It was a great place to grow up. We were a liberal family in conservative surroundings. Charlotte was a beautiful town where you could ride your bike everywhere and explore the woods. There was a lot of freedom. I was a year-round swimmer as a kid, so I spent most of my time in the pool." The radio was on constantly in the Bell household. "I still to this day play the radio all day! My dad played the piano and trumpet. There was a lot of music and entertainment in our house. We grew up around the piano singing songs before dinner."

The patriarch of the family had ambitions of becoming an actor. "I have to give a lot of credit to my father for my love for filmmaking," notes Bell. "He talked about films constantly, recreated scenes, acted in the community theater, and loved acting and going to see movies." However, the younger of his two daughters did not share the same enthusiasm for cinema. "I was the exact opposite. I could not sit still as a kid. I was quite energetic, so to get me to sit through a movie was difficult. When I was a young teenager my father insisted that we go see Raiders of the Lost Ark. I didn't want to go as it didn't appeal to me. I was a bit precocious and said, 'I'm going to sit by myself and watch this.' After the first scene, I had literally moved right next to my dad and held him for the rest of the film. It blew me away. That was the first time a film had a real impact on me."

At 19 years old, Bell decided not to pursue a post-secondary education and instead took a job on a major Hollywood production. "I remember watching Spaceballs and that opening scene where the spacecraft keeps going through frame, cracking up and thinking, 'Oh, my god! How did they do that?' At the time, I was fresh out of high school and not sure what to do. I heard from a friend that The Abyss was being shot locally, so I called about a position and got an interview. The job was to get up early in the morning, pick up dailies from the Charlotte airport, drive them to Gaffney, South Carolina, and spend all day helping on set wherever I was needed. At wrap, I would take footage back to the airport and send it back to L.A. It was a very long day. It wasn't until later that I realized what a big deal The Abyss was."

The Abyss resulted in a career epiphany for Bell. "A friend of mine in the art department suggested that I should be a set painter, so I applied and started working. We painted the set of Deep Core on rafter boats inside an abandoned nuclear reactor as they were filling it with water. I thought that this was the craziest job ever! Eventually, I ended up being fired as a painter because I wasn't very good at it. The same friend tipped me off that visual effects was looking for help. I went to the visual effects office and said, 'I hear you're looking for a PA and I know everyone on the crew, would you consider hiring me?' They hired me on the spot. I didn't know anything about visual effects, but the team on that film was brilliant. They took me under their wing and patiently explained everything. We had our own model shop and all the different vendors, ILM and Dream Quest Images, were coming out to shoot plates. Every day, I was learning things that I'd never been exposed to before. At that moment, I realized I couldn't just let it go. After production finished on location, the visual effects team told me that if I got to L.A., I would have a job."



"At that time [in the '90s], digital technology was really heating up. No one truly knew what its impact would be. We were learning to evolve our traditional processes and incorporate this new digital technology. I thought, 'If nobody else knows either, then we'll all learn together.' Over the next few years, I hopped around a lot of different visual effects vendors, taking the best of what they had to offer and crafting my own style." —Jennifer Bell, Executive Vice President, Visual Effects, Universal Pictures

Throwing her meager possessions on a production truck, Bell headed to L.A. and served as a visual effects coordinator on Ghost. "I enjoyed the production side of things because I loved being on set, dealing with editorial and being part of the long process that visual effects is. However, I realized I didn't have a lot of experience in the field; I had only done two movies. I needed to learn more about the craft, so I began working at visual effects companies. I started off at a miniature company working with the Skotak brothers. At night, I would go on set and do any odd job. I got to be up close and personal with the creative process. At that time, digital technology was really heating up. No one truly knew what its impact would be. We were learning to evolve our traditional processes and incorporate this new digital technology. I thought, 'If nobody else knows either, then we'll all learn together.' Over the next few years, I hopped around a lot of different visual effects vendors, taking the best of what they had to offer and crafting my own style."




TOP: On the set of Deep Core for The Abyss in Gaffney, South Carolina in 1988. (Photo courtesy of Jennifer Bell) MIDDLE: It was by doing various production jobs on The Abyss that Bell decided to pursue a career in the film industry. (Image courtesy of Twentieth Century Fox) BOTTOM: The first movie that Bell worked on after moving to California was Ghost as a visual effects coordinator. (Image courtesy of Paramount Pictures)

After working as a visual effects producer for Pacific Data Images and Sony Pictures Imageworks, as well as for The Mummy and Van Helsing, Bell became a visual effects executive at Universal Pictures. "Personally, I was more satisfied managing the whole process as opposed to just a portion. I enjoyed the interaction with directors, producers, production teams and editorial, trying to figure out how to best shape the plan in order to help realize the director's vision. I thrive having a lot of plates in the air, juggling all of the different tasks going on, with all of the various vendors and working towards a common goal."

One of the early challenges is being able to match a filmmaker with a visual effects producer and supervisor. "I often say that those of us in visual effects are the marathon runners of the production. We're typically on right after the line producer has been hired, we're present throughout the shoot, and then we're integral in delivery of the film. That is a long relationship with a filmmaker, and it is important that there is trust in that relationship. If that trust starts to erode, when the work comes through they're not going to believe that you have their best interests at heart."



"I work with a tight group of talented, creative and smart people. For me, that is the best part of my job. I've worked on seven Fast & Furious films and make dinosaurs for a living – what can be better than that?" —Jennifer Bell, Executive Vice President, Visual Effects, Universal Pictures

A director's needs can vary as to what they want in a VFX team. "Upon meeting a director on a new project, I sit with them and talk about what their creative intentions and necessities are so I can pair them with someone who can support them," remarks Bell. "Then I introduce them to the folks who I think fit the bill. I am part of the process to observe and determine who the best creative mark is. As far as producers and supervisors, it is a similar process. You want to make sure that those two parties are going to be able to work together as a team. I am someone who gets invested in projects, and I want to make sure that all the relationships are working well."

Upon getting the script, the visual effects work is broken down. "First, establish the production plan," explains Bell. "Geographic location is important. Where are we filming? Atlanta? The U.K.? That's when I start gearing up towards finding the crew and trying to take advantage of tax credit opportunities. Once we establish the list of vendors that fit within that profile, we look at what teams are available and have a proven track record with a positive outcome. I want to put someone in front of a director who I can say, 'They've done this before. Be confident that this will go well.'"

TOP: Filmmaker Colin Trevorrow returns to direct Jurassic World: Dominion with stars Bryce Dallas Howard and Chris Pratt. MIDDLE: The visual effects department at Universal still quote from Scott Pilgrim vs. the World even a decade after its release. BOTTOM: Bell is proud of her collaboration with filmmaker Justin Lin on the Fast & Furious franchise.




TOP: Pushing the boundaries of visual effects spectacle is the Fast & Furious spinoff Hobbs & Shaw. MIDDLE: Bell was impressed by the preparation of filmmaker Edgar Wright when making Scott Pilgrim vs. the World. BOTTOM: A massive undertaking was creating mobile cities for Mortal Engines, produced by Peter Jackson.

"These days, all movies have visual effects," notes Bell. "When I started at Universal Pictures there were always a handful that didn't require the department, and now they all do. Visual effects is constantly evolving and the tools are improving. The latest hot topic is virtual production. Any tool that can help a filmmaker better tell their story is worth exploring. If virtual production can do that, then we are all in. It allows for huge creative flexibility, but definitely requires a different way of thinking and setting up. What virtual production does do is force everyone in the development and production period to commit earlier to certain creative decisions. I don't think that virtual production takes away from visual effects, it's just another tool in the toolbox."

With the COVID-19 pandemic, Universal's team worked remotely without any major visual effects production delays. "We were able to quickly pivot to a work-at-home setup with our vendors and teams," remarks Bell. "Six visual effects films were delivered during the pandemic, most of which did not have a huge disruption in our delivery schedules. There was one case where we opted for a thinner, longer pipeline, but that actually benefited the film.

"With Jurassic World: Dominion, we were the first major motion picture to resume filming during the pandemic with very few hitches. Even while the production was trying to figure out the best practices, our filmmaker, Colin Trevorrow, jumped into post and said, 'We shot for 10 days and got some plates. Let's get these turned over and get going.'"



"Overall, we have had to be nimble and keep up the momentum to stay ahead of everything," Bell adds. "There are a lot of projects in development right now. The silver lining of the pandemic for me is that I've been able to work more closely with a variety of filmmakers. My team and I are introducing them to different vendors, helping problem-solve budget issues, and getting artwork and/or previs done while we determine which films are going to be out of the gate next."

Bell recalls some memorable films such as Children of Men. "The long shot filmed from the interior car's point of view took months to figure out, but we had a great team at DNEG that executed it flawlessly. We also had Framestore create the CG birth of a baby, which had never been done to date. There were a lot of new advancements in that film that were exciting to be a part of." Another film is Scott Pilgrim vs. the World, which has gained cult status since it was released in 2010. "That was one of my favorite films to work on. I had such fun with [director/writer] Edgar Wright. He was so prepared and knew exactly what he wanted. To this day, in the office we still quote lines from that film."

"I've worked at Universal Pictures for 23 years and have had so many amazing opportunities and memories," reflects Bell. "I work with a tight group of talented, creative and smart people. For me, that is the best part of my job. I've worked on seven Fast & Furious films and make dinosaurs for a living – what can be better than that? When I talk to young people, I tell them, 'If you don't know what to do and you're looking for work, you should explore visual effects. It continues to be a growing industry with opportunities that could change your life.'"

TOP: Johanna Leonberger (Helena Zengel) and Captain Jefferson Kyle Kidd (Tom Hanks) in News of the World, co-written and directed by Paul Greengrass. MIDDLE: A standout moment in Children of Men for Bell is the long tracking shot. BOTTOM: Bell served as Visual Effects Producer on The Mummy and sequel The Mummy Returns.

VFX TRENDS

HOW VFX STUDIOS ARE ADOPTING VIRTUAL PRODUCTION INTO THEIR WORKFLOWS By IAN FAILES

TOP: Pixomondo’s LED volume stage, here pictured nearing completion, is now deep in the throes of production. The stage is 75 feet in diameter, 90 feet deep and 24 feet in height. (Image courtesy of Pixomondo)

In the past few years, a number of technological developments – improvements in real-time rendering and game engines, high-fidelity LED wall tech in particular, and the need for social distancing during the COVID-19 pandemic – have kickstarted visual effects studios into a ‘virtual production’ whirlwind. Of course, many VFX studios have already been dabbling in – or indeed innovating in – the virtual production field, whether that be in previsualization, the use of virtual and simul-cams, real-time rendering, motion capture or some kind of live in-camera effects filmmaking. The success of Industrial Light & Magic’s virtual production work on The Mandalorian is by now well known. But how have other studios been concentrating their virtual production efforts? In this report, several outfits discuss how they have been implementing new virtual production workflows.

A HISTORY OF VIRTUAL PRODUCTION AT WETA DIGITAL

Many will remember the iconic footage of director Peter Jackson utilizing a simul-cam and VR goggles setup for the cave troll scene in The Fellowship of the Ring. Simul-cams, real-time rendering and
other virtual production tech, especially performance capture, have been a trademark of Weta Digital’s filmmaking and VFX work ever since – consider films such as Avatar, The BFG and Alita: Battle Angel. The studio has recently embraced Unreal Engine and LED wall workflows in an even more significant way by experimenting with the game engine’s production-ready tools and shooting with movable LED walls and panels (a workflow they tested with a fun setup involving Jackson’s WWII bomber replica). Weta Digital Visual Effects Supervisor Erik Winquist points out that testing these new workflows has helped determine what might and might not work in a normal production environment. “I think getting too hung up on any one particular technique or approach to doing this is pointless,” Winquist notes.

“From here the next question will be, can we consistently deliver final pixel renders, renders that come direct from the game engine, at the scale of a production like The Lion King.” —Rob Tovell, Global Head of Pipeline, MPC Film

TOP: Weta Digital’s LED wall test-shoot using a bomber cockpit replica. (Image copyright © 2020 Weta Digital Ltd.) BOTTOM: These on-set and capture frames and final image showcase MPC Film’s virtual production approach on The One and Only Ivan. (Image copyright © 2020 Walt Disney Pictures)

“You’re not going to say, ‘Well, we’re going to shoot our whole movie in front of an LED wall.’ That’d be silly. With any of this stuff, you’re going to use the strength of the tool – you say, ‘What LED stage down the street is going to best serve this particular scene? Great, we’ll shoot there on Tuesday, and then when we need to do this other thing, we’re going to go on the backlot.’ Whatever serves the storytelling the best.” Winquist sees many areas where the studio can continue utilizing virtual production methods. One is virtual scouting, which has become crucial during the pandemic when locations cannot be easily visited. “You can get LiDAR scans with high-fidelity three-dimensional meshes; now you can do your previs with that and start using it as a basis for your actual shot production once you’ve actually got plates shot. The whole thing just feeds down the pipe.”
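Winquist’s LiDAR-to-previs point can be made concrete with a small sketch. The following is purely illustrative – not Weta Digital’s pipeline – and assumes the open-source Open3D library plus placeholder file names: a dense scan mesh is decimated into a lightweight proxy that a previs or game-engine scene can handle interactively.

```python
# Illustrative sketch: decimate a LiDAR scan mesh into a previs-friendly proxy.
# Assumes Open3D (pip install open3d); file names and triangle budget are
# placeholders, and a real pipeline involves far more (registration, cleanup, UVs).
import open3d as o3d

scan = o3d.io.read_triangle_mesh("set_scan.ply")  # placeholder path
scan.compute_vertex_normals()
print(f"source triangles: {len(scan.triangles)}")

# Quadric decimation keeps the overall shape while cutting triangle count,
# so the mesh stays interactive inside a previs or game-engine scene.
proxy = scan.simplify_quadric_decimation(target_number_of_triangles=50_000)
proxy.compute_vertex_normals()

o3d.io.write_triangle_mesh("set_scan_previs.obj", proxy)
print(f"proxy triangles: {len(proxy.triangles)}")
```

The lightweight proxy can later be swapped back for the full-resolution scan once plates are shot, which is the “feeds down the pipe” hand-off Winquist describes.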

HEADING INTO VISUALIZATION AT DIGITAL DOMAIN

Digital Domain’s Head of Visualization, Scott Meadows, advises that the studio has been delivering on several aspects of virtual production, from LED screens to virtual reality. The studio also houses a ‘Digital Humans Group,’ which leverages AI and machine learning to deliver digital humans for feature, episodic and real-time projects. One of Digital Domain’s latest endeavors is to create ‘bite-sized
tools,’ as Meadows describes, for directors and production designers who want to handle their scouting work on their own and explore locations remotely. “Users can connect over a VPN to ensure security, then use the controller to fly around the location, set bookmarks, determine focal length and more.” Digital Domain’s presence in virtual production recently came to the forefront while working on Morbius. Here, traditional previs had been done for a particular scene, but when a story point needed to change it was re-filmed on a stage using a game engine approach. “We took existing assets and merged the world that was already created with the new world being filmed on the stage,” outlines Meadows. “The scene featured some incredible action with unusual movements from the actors, so we couldn’t use traditional mocap. Instead, we utilized keyframed animation in real-time, making adjustments as we went. Thanks in part to some incredible work from stunt people, we got things looking great, including using the game engine to help add some really dramatic lighting. “When it came time to work on the scheduled re-shoots, COVID protocols were in full effect,” continues Meadows. “We had a few people head to the set, maintaining social distance and limiting numbers, while the director and editor both worked remotely the entire time using iPads on stands and screen sharing.

“Right now, real-time environments lend themselves to fantasy and science fiction, but I think that this is going to change very quickly as the real-time renderers, Unreal in particular, achieve ever greater levels of sophistication.” —Paul Franklin, Co-founder and Creative Director, DNEG

OPPOSITE TOP: The LED panels utilized in Gravity represented an early insight into the benefits of virtual production for Framestore. (Image courtesy of Framestore) OPPOSITE BOTTOM: The camera rig with actor and LED wall for DNEG’s virtual production test. (Image courtesy of DNEG) TOP: A final render of ‘DigiDoug’ (Doug Roble) by Digital Domain. (Image courtesy of Digital Domain)

The director gave camera commands from Sweden, while the editor in L.A. offered tips. That was only one part of what we did on that film, but it was really incredible to see it all come together.”

TOP: Mavericks VFX founder Brendan Taylor and a crew member use a virtual camera to set up a scene in Unreal Engine. (Photo: Caley Taylor. Image courtesy of Mavericks VFX) BOTTOM THREE: From virtual camera input to final shot. A scene comes together for The Lion King with final visual effects by MPC Film. (Image copyright © 2019 Walt Disney Pictures)

DNEG DIVES INTO LED WALLS

The possibility of in-camera effects using virtual production is an area of principal interest for DNEG, which had already made strides here in films such as Interstellar and First Man. “In the last year, we have developed a great working relationship with Epic Games, who have provided incredible technical support, helping us to really get the most out of Unreal Engine,” says DNEG Co-founder and Creative Director Paul Franklin. “We have also set up a partnership with Dimension Studio in the U.K. Dimension are experts in virtual production and immersive content, and their knowledge is the perfect complement to our background in high-end film and TV VFX.” For several recent projects, such as Death on the Nile, DNEG has implemented its proprietary Virtual Camera and Scouting system. The studio has also leaped into tests and short film projects utilizing LED volumes. Franklin suggests that there are many virtual production options now, and therefore many choices available for filmmakers. “Right now, real-time environments lend themselves to fantasy and science fiction, but I think that this is going to change very quickly as the real-time renderers, Unreal in particular, achieve ever greater levels of sophistication.” Franklin also observes that, in some ways, virtual production methodologies can be seen as re-inventions of techniques that have been with us for decades – “rear projection, translight backings, even glass paintings, but I think it’s also important to realize that the sheer speed of virtual production approaches offers a new way of looking at things, where parts of the process that had been split off into different areas of the VFX pipeline now happen all at the same time under the control of a much more tightly-knit group of artists. This in itself opens up whole new opportunities for filmmakers to experiment and create.”

MPC FILM: LION KING AND BEYOND

With films like The Jungle Book, The Lion King and The One and Only Ivan, MPC Film has been honing its virtual production credentials for some time. Virtual Production Producer Nancy Xu says the studio has seen an increase in interest in final-pixel-on-stage simul-cam, VR scout/shoots with participants all around the world, LED walls and fully-CG virtual production shows. “At MPC, we’ve made headway on evolving virtual production as a way to work remotely together,” states Xu. “We’ve shrunk the footprint of our large stages to fit into small living rooms. We are also exploring different ways to incorporate the power of the game engine into our VFX pipeline.” MPC Global Head of Pipeline Rob Tovell adds that Pixar’s open-source USD is one technology helping the studio push ahead in virtual production and real-time tech. “We can save USD data directly from the game engine and send those USD files straight to post where our layout team can use that as reference or build directly on that data to start working on the final shot. Using USD, and having tools to help quickly turn over our VP work to post and make our toolset scalable, allows us to operate efficiently on both small episodic shows and larger film ones.” As for the future of virtual production, Tovell’s view is that “we are going to start seeing more and more filmmakers asking for the VFX work that is normally done after the shoot, to be available before the virtual production shoot, therefore allowing them to use that work while they are filming. Likewise, they are going to want to be able to direct the action in their shoots a lot more; they’re not going to want to leave the virtual production space to adjust an animation, they’ll want to do that right there while they’re shooting. From here the next question will be, can we consistently deliver final pixel renders, renders that come direct from the game engine, at the scale of a production like The Lion King.”
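As a rough illustration of the USD hand-off Tovell describes – and only an illustration, since MPC’s actual tools are proprietary – here is a minimal sketch using Pixar’s open-source USD Python bindings, with hypothetical stage and prim names: data recorded during a virtual production session is saved to a .usda file that a layout team can open and build on directly.

```python
# Minimal sketch of a game-engine-to-post USD hand-off. Assumes Pixar's USD
# Python bindings (pip install usd-core); all paths and prim names here are
# hypothetical, not MPC's pipeline.
from pxr import Usd, UsdGeom, Gf

# --- "On stage": record a camera and a prop locator from the VP session ---
stage = Usd.Stage.CreateNew("vp_capture.usda")
cam = UsdGeom.Camera.Define(stage, "/Shot/Camera")
UsdGeom.XformCommonAPI(cam.GetPrim()).SetTranslate(Gf.Vec3d(0.0, 1.7, 5.0))

prop = UsdGeom.Xform.Define(stage, "/Shot/Props/Crate")
UsdGeom.XformCommonAPI(prop.GetPrim()).SetTranslate(Gf.Vec3d(2.0, 0.0, -1.0))
stage.GetRootLayer().Save()

# --- "In post": layout opens the same file and builds on top of it ---
post = Usd.Stage.Open("vp_capture.usda")
for prim in post.Traverse():
    print(prim.GetPath(), prim.GetTypeName())
```

Because USD layers are composable, downstream departments can layer their own edits over the captured data rather than overwriting it, which is what makes the quick stage-to-post turnover scalable.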

TOP: Pixomondo’s LED volume stage nearing completion in Toronto. (Image courtesy of Pixomondo) BOTTOM THREE: A breakdown of the virtual production approach to filming used on The BFG. Weta Digital has been pioneering these techniques for many years. (Images copyright © 2016 Storyteller Distribution Co., LLC)

“Whether we’re using Unreal Engine to help our environments team or to quickly reanimate a camera, there are endless uses and opportunities in an industry where we’re used to opening up new frontiers. It’s incredibly exciting to see the horizon opening up like this.” —Tim Webber, Visual Effects Supervisor and Chief Creative Officer, Framestore

A FUTURE OF VIRTUAL PRODUCTION AT FRAMESTORE

Framestore is another studio delivering multiple shows right now using virtual production techniques. “We are currently using virtual scouting and virtual cameras to interactively stage previs on multiple shows via fARsight, Framestore’s proprietary virtual production tool,” details Alex Webster, Framestore’s Managing Director of Pre-Production Services. “It is primarily used for virtual scouting, but can also be used for scene/animation review, camera blocking and as a virtual camera. fARsight uses a VR headset or an iPad which connects to a PC running Unreal. It can support multiple users in one session, which is particularly useful with many of our clients and creative stakeholders working remotely and often on different continents.” In terms of LED volumes and virtual production, Framestore was one of the early adopters of this approach for the film Gravity. “We previs’d the whole movie, we pre-lit the movie, we used real-time cameras and we used LED panels,” says Framestore Visual Effects Supervisor and Chief Creative Officer Tim Webber. “Now we’re at a stage where we’ve been using our fARsight toolset for some three years, and we’ve got some tremendous talent – people like [Framestore Pre-Production Supervisor] Kaya Jabar – who are experts in their field and have worked across a wide range of projects. Right now, we’re finding ourselves going beyond just scouting, camera placement and LED panels to the idea of fully prototyping and ultimately finishing a whole movie using this toolset.” Webster and Webber see virtual production changing just about all areas of visual effects and filmmaking, all the way from pre-production to physical production. In addition, Webber notes, real-time tools and virtual production techniques are likely to have major impacts on traditional VFX techniques. “Whether we’re using Unreal Engine to help our environments team or to quickly reanimate a camera,” says Webber, “there are endless uses and opportunities in an industry where we’re used to opening up new frontiers. It’s incredibly exciting to see the horizon opening up like this.”
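fARsight itself is proprietary, but the multi-user session idea can be sketched generically. The hypothetical message format below, using only the Python standard library, shows the kind of payload a review tool might broadcast so that every remote participant’s Unreal view follows the same virtual camera; the field names and values are invented for illustration.

```python
# Hypothetical sketch of a camera-sync message for a multi-user virtual
# scouting session (fARsight is proprietary; this is illustrative only).
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class CameraSync:
    user: str
    position: tuple        # world-space metres
    rotation: tuple        # euler degrees
    focal_length_mm: float
    timestamp: float

def encode(msg: CameraSync) -> str:
    """Serialize a sync message for broadcast over the session's transport."""
    return json.dumps(asdict(msg))

def decode(payload: str) -> CameraSync:
    """Rebuild the message on a receiving client."""
    return CameraSync(**json.loads(payload))

wire = encode(CameraSync("supervisor_ldn", (0.0, 1.6, 4.2),
                         (0.0, 180.0, 0.0), 35.0, time.time()))
print(decode(wire))   # each client applies the pose to its local engine view
```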

TOP: Framestore utilized a mixture of virtual production techniques to help visualize the bear fight in His Dark Materials. (Image copyright © 2019 BBC/HBO) MIDDLE: DNEG’s LED wall test-shoot conducted in conjunction with Dimension in 2020. (Image courtesy of DNEG) BOTTOM: This Digital Domain image shows the capture from a helmet-mounted camera on Doug Roble, and the CG face render. (Image courtesy of Digital Domain)

PIXOMONDO BUILDS A STAGE

Like several other studios, Pixomondo is adopting virtual production workflows on numerous productions, taking things one step further by building its own LED stage and developing an Unreal Engine pipeline in Toronto (the first of several such stages planned around the world to match Pixomondo’s global locations). “The Toronto stage is a very large stage that is 75 feet in diameter, 90 feet deep and 24 feet in height,” explains Mahmoud Rahnama, Pixomondo’s Head of Studio for Toronto and Montreal, who is also heading the company’s virtual production division. “It has a full edge-to-edge LED ceiling and it sucks up a lot of power! The LEDs are the next generation in-camera tiles from ROE Visual that are customized to our specs. We’re using processors from Brompton Technology and OptiTrack cameras for motion capture.” For Rahnama, being able to offer virtual production services to
clients means that Pixomondo has found itself more involved in initial decision-making on projects, at least more than is traditionally part of a VFX studio’s role. “It’s been a fascinating and exhilarating experience for us,” he says. “The linear process that is commonly used in traditional VFX does not always lend itself to virtual production, so Pixomondo had to pivot and re-configure the workflow for virtual production. “We are now helping with virtual scouting, virtual art department, stunt-vis, tech-vis and live previs,” adds Rahnama. “We help DPs, directors and showrunners visualize anything and everything so they can make key decisions early on in the process, like troubleshoot scenes before they are filmed, or experiment with different shots and techniques to see what would work best on the day of shooting. This helps maintain the filmmaker’s vision throughout the entire process and helps keep everyone on the same page from start to finish.”

HOW MAVERICKS VFX’S QUICK PIVOT LED TO SOMETHING BIGGER

Mavericks Founder and Visual Effects Supervisor Brendan Taylor was a close observer of the advent of new virtual production techniques, but initially saw them as something for bigger studios to specialize in. However, he then began noticing how smaller companies and even individuals had adopted simul-cam and real-time rendering techniques to produce high-quality results. So Taylor recruited Unreal Engine specialist Paul Wierzbicki to build a set of real-time virtual production tools for previs and remote collaboration, made up of elements including a hand-held simul-cam rig, Vive hand controllers, camera operator wheels and VR goggles. When the COVID-19 pandemic presented an opportunity to turn around a solution for remote visualization on the fourth season of The Handmaid’s Tale, these tools were employed directly. “We were able to build an environment all in Unreal Engine, then do a Zoom call with [producer/show star] Elisabeth Moss while she was in hotel quarantine, and the key crew, and block out the scene,” outlines Taylor. “We could walk around the set looking for camera angles and work out things like where to put bluescreen. Even once COVID is gone they’re still going to be doing it because it’s better than just looking at stills. You can actually walk yourself around.” The same possibilities are true for production design, too, says Taylor. “Usually they spend so much time designing something and then they hand it over to VFX, and we might have a different take. But here, what we’ve been doing is independent Unreal Engine sessions to work out what needs to be designed and what doesn’t. It’s been a heartwarming collaborative process, because everybody has been able to have these conversations.” Taylor has turned this virtual production experience into a standalone outfit called MVP, and is buoyed about the future. “I feel the most exciting thing with real-time is it is actually so much closer to our normal process of making movies. It’s how we make movies. Like, ‘Less smoke over there, take down the smoke.’ We can do that now. I can’t wait to do it more.”
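One small, generic ingredient of a hand-held simul-cam rig like the one Taylor describes is filtering the raw tracker samples so the virtual camera doesn’t jitter. This sketch is illustrative only – not Mavericks’ code – and applies a simple exponential moving average to position; a production rig would also filter rotation (e.g. quaternion slerp) and feed the result to the engine every frame.

```python
# Illustrative sketch: smooth raw 6-DoF tracker positions for a hand-held
# virtual camera with an exponential moving average (not Mavericks' tooling).
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0

def ema(prev: Vec3, sample: Vec3, alpha: float) -> Vec3:
    """Blend a fraction `alpha` of the new sample into the running value."""
    return Vec3(
        prev.x + alpha * (sample.x - prev.x),
        prev.y + alpha * (sample.y - prev.y),
        prev.z + alpha * (sample.z - prev.z),
    )

# Fake noisy tracker samples standing in for controller input.
samples = [Vec3(0.0, 1.7, 0.0), Vec3(0.02, 1.68, 0.01), Vec3(-0.01, 1.71, 0.0)]
cam = samples[0]
for s in samples[1:]:
    cam = ema(cam, s, alpha=0.2)   # lower alpha = steadier but laggier camera
    print(f"camera at ({cam.x:.3f}, {cam.y:.3f}, {cam.z:.3f})")
```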

Inside the Polymotion Stage, a volumetric capture stage that Dimension Studio has partnered on. (Image courtesy of Dimension Studio)

Dimension: Collaboration is Key

One thing that marks the big move to virtual production for VFX studios is collaboration, in terms of software and hardware, as well as collaborations with specialist studios in the virtual production space. For example, DNEG has teamed up with Dimension on a number of projects. The studio offers volumetric production as well as immersive content, digital humans, real-time and virtual production solutions from a range of locations. “We have our VAD (virtual art department) as well as our on-set virtual production teams running our simul-cam, virtual scouting, tracking, machine vision and AI solutions,” outlines Jim Geduldick, Senior Vice President, Virtual Production Supervisor & Head of Dimension North America at Dimension Studio. The idea behind Dimension is to help other filmmakers and studios jump into a highly technical area by using established expertise. And importantly, says Geduldick, that’s not always to just suggest following a virtual production workflow. “One of the things that we do is say, ‘Is virtual production and an LED volume right for your project?’ Or, ‘Should you go down the route of traditional VFX?’ Right now, there’s so many people working who come from the traditional VFX and production world, so there’s still a bit of education required. You always want to give people options. And now there are so many.”

“At MPC, we’ve made headway on evolving virtual production as a way to work remotely together. We’ve shrunk the footprint of our large stages to fit into small living rooms. We are also exploring different ways to incorporate the power of the game engine into our VFX pipeline.” —Nancy Xu, Virtual Production Producer, MPC Film

ANIMATION

SOUTHEAST ASIAN CULTURAL TOUCHSTONES ILLUMINATE RAYA AND THE LAST DRAGON By TREVOR HOGG

All images courtesy of Disney. TOP: A time-of-day color script exploring various scenes by production designer Paul Felix. OPPOSITE TOP: Blue is a prominent color for Sisu (Awkwafina), which reflects her spiritual connection with water. OPPOSITE BOTTOM: Raya (Kelly Marie Tran) with her faithful steed Tuk Tuk.

Arriving as our world lay under siege by the pandemic, Raya and the Last Dragon revolves around a warrior seeking a demigod in order to defeat evil creatures threatening her divided land of Kumandra. The fantasy adventure by Walt Disney Animation Studios was not left unscathed by the lockdown: 400 artists had to work remotely just as shot production was set to commence, and the theatrical release date shifted from 2020 to 2021, with the film simultaneously debuting on Disney+ in March. Within two weeks of the work-at-home mandate, the production was back on track thanks to the Herculean efforts of the technology department. “The hardest things to figure out at first were sync playback and editorial Avid reviews because of the lag time,” recalls Producer Osnat Shurer (Moana). “You’re dependent on everyone’s Internet speeds. There’s family. Two kids are going to school online. A spouse might be teaching classes on the other side of the room. But people figured out ways around it. One of the keys was the support of the studio as well as working with each person individually. ‘What works for you?’ Flexibility was important and finding ways to still feel connected.” “Making these films is not easy because we create all of these layers that are important to the moment, appealing, and full of heart and humor,” notes Shurer. “We have to live up to the Disney legacy. It’s an action-adventure film, which isn’t what we always do in animation. We have a world within the film that is made up of five distinct lands; each one of those has its own personality, geography and topography. It is inspired by Southeast Asian design and thematic principles. Every character represents one of these lands, even down to the patterns in their clothing. Our dragon is based on the Nāga, which is a deity connected to water and to bringing life. It’s not fire-breathing like Western dragons. The Heart Clan are more connected to the dragon, so Raya has a
raindrop pattern on her outfit.”

“The design of Raya came together relatively quickly because she has such a clear personality. Raya is a martial artist and martial artists stand and move a certain way. We wanted her to have stature, mobility, and be believable as a strong fighter as well as someone who can lead a crack team. She’s gone out to save the world. Raya is conflicted and has trouble with trust, but works with her flaws in an exciting way.” —Osnat Shurer, Producer

While the premise of a young girl and dragon remained constant, the setting evolved over the course of the development of the project. “The Southeast Asian focus came in just over two years ago,” remarks Production Designer Paul Felix (The Emperor’s New Groove). “Joe Rohde from Imagineering came in to talk about how
you build a culture from scratch and what are the visual hallmarks of a culture that you’re likely to come across. We tried to lay down ground rules for the civilizations that we were trying to create, that kept being modified because of changes in the story.”

TOP: It was important that Sisu did not come across like a dog, so reptiles and creatures that swim in water were referenced when deciding upon how the limbs would move. MIDDLE: A late addition to the story was the character of baby Tuk Tuk. BOTTOM: A volumetric exploration of the Land of Talon.

“Luckily,” continues Felix, “we came in under the wire [with the lockdown] and did take a research trip to Bali, Cambodia and Laos early last year. I was going for more of a sensory feel of what it’s like to be under those climatic conditions and having the sun be almost overhead all of the time. What struck me [the most] was the knowledge that Southeast Asians have about the stories of their own cultures – and that influenced how we imagined how the cities were built and how people lived with each other.” The mythology created for the movie was combined with how the communities would actually live in the climatic conditions being depicted. “Once the clans [of Fang, Heart, Spine, Talon and Tail] were one people, but are now divided up into different lands,” remarks Felix. “We had conceived some fairly straightforward through lines. For example, all of the clans have their own particular approach to costume but have similarities in the way the wrap was tied over. A clan like Fang has a rigid hierarchical, militaristic approach to life and that’s how their buildings are constructed. It lent itself more to a geometric way of living. Heart has reverence for the connection between people and dragons, so we went for more of a flowing, organic feel with their architecture.” Charts were produced that illustrated the various design underpinnings for each land. “For Heart, we leaned towards cool blues and greens, and tried to keep that part of the color wheel associated with the dragons that you see in the film,” states Felix. “Talon was conceived as the trading crossroads of the whole land. Every culture meets the other culture there, so we had the idea it would be more of a cacophony of colors coming together with a strong purple, pink underlay. Everyone is wearing a poster version [of their clothing] to draw attention to themselves and to sell their wares. Tail consists of people who wanted to get away from everyone else, so we gave them earth-tone colors to make sure that they disappeared into the landscape.” A serpentine-shaped river is the source of life and transportation in Kumandra. “Because the water has been drying up, Tail is the driest and most ravaged land,” remarks Head of Environments Larry Wu (Big Hero 6). “Talon is like a floating city in Cambodia that is on stilts on the water. That was a fun set to work on because it’s full of little huts and has a big market. Spine is up in the mountains, so they’re really cold and it’s snowing. They make all of their structures from bamboo [that is as big as redwood trees]. Heart has done well because their city is built on an island, so they are naturally protected from the Druun. Fang is on a peninsula, but they cut a canal to make themselves an island. Fang has worked hard to thrive and have this sense that they deserve what they have.” In the prologue, the dragons sacrifice themselves to save humanity from the Druun, and then the story flashes forward 500 years to when Raya is a teenager being trained by her chieftain father to be the guardian of the Dragon Gem. “The first act sets up her journey as an 18-year-old and what needs to happen for the world,” states Shurer. “The thing with fantasy films is that you want to establish the world so the audience can move along with the story, but not to the point where it feels like a separate movie.” Young and adult Raya kept being refined so as to feel like one character. “The design of Raya came together relatively quickly because she has such a clear personality,” remarks Shurer. “Raya is a martial artist and martial artists stand and move a certain way. We
wanted her to have stature, mobility, and be believable as a strong fighter as well as someone who can lead a crack team. She’s gone out to save the world. Raya is conflicted and has trouble with trust, but works with her flaws in an exciting way.” Simulating authentic Southeast Asian clothing was a complex undertaking. “We designed the garments the way they were actually constructed,” explains Head of Characters and Technical Animation Carlos Cabral (Tangled). “They didn’t have stitching. It’s all folded garments, tucked in and wrapped – that was a huge challenge. Raya is a total badass. She’s out there in the world on her noble steed Tuk Tuk running around different environments. Raya had to have that tough Asian look. She’s dusty, wearing well-worn clothing, and her hair is clumpy. Raya has been in the outdoors for a long time, and there were a lot of challenges in maintaining continuity whether she’s wet or dusty. Raya has this large cape to protect herself from the environment and a giant sword, so there were a lot of pieces to design and to maintain believability throughout the whole movie.” Pivotal to the plot is the demigod Sisu who has the ability to shape-shift. “Sisu [as a dragon] has this amazing mane of hair and we wanted to preserve the quality and spirit of it in the human form,” remarks Cabral. “We also had to make sure that the facial traits and the personality translates from one to the other.” Visual research involved studying various animals and reptiles. “For Sisu, we wanted to make sure that she wasn’t coming across too much like a dog,” reveals Co-Head of Animation Amy Smeed (Frozen). “We would take a piece from a reptile or something that swims in the water versus something like a lion, and look at the way the shoulders move. It was a challenge to get Sisu walking on all four legs, and there are times when she will stand up on two legs and talk.”

TOP: Boun (Izaac Wang) is a street-savvy 10-year-old entrepreneur recruited by Raya on her journey to find the last dragon. MIDDLE: Because Raya is a martial artist, her clothing had to allow her a wide range of mobility. BOTTOM: A formidable ally of Raya’s is the giant known as Tong (Benedict Wong).

“If you look closely at the shot when Raya is in the temple, the water is flowing up the stairs not down. How do I simulate that? It is not opposite gravity, otherwise the water would flow straight up. You have to figure out how to get it to conform to the stairs plus flow in the wrong direction. There was a bit of development when it came to that.” —Kyle Odermatt, Visual Effects Supervisor

TOP: Raya has been on a six-year odyssey, which is reflected in her being covered with dust, having clumpy hair and wearing well-worn clothing. MIDDLE: While exploring the floating market of Talon, the shape-shifting Sisu encounters Dang Hu (Lucille Soong). BOTTOM: A major part of emulating the climate of Southeast Asia was making sure that there were plenty of atmospherics in each shot, such as when Raya prepares to battle Namaari (Gemma Chan) in the Land of Spine.

Tuk Tuk is the faithful companion and steed of Raya. “Tuk Tuk is so cute!” laughs Smeed. “There was a lot of reference early on of pugs, penguins and armadillos.” Tuk Tuk has the ability to transform himself into a giant wheel with a saddle for Raya. “In pre-production,” says Smeed, “one of our animators built a small Tuk Tuk in a garage workshop in order to figure out how the saddle could stay on top. A lot of it has to do with the counterweights at the bottom. Everything has to fit perfectly to be able to roll. Once it opens, the shell has to rotate in a way that you see his body underneath. There was a lot of back-and-forth between our Animation Supervisor Brian Menz and the rigger on how that would work. That was definitely a challenge to figure out because there are many times where Tuk Tuk is walking or running, will leap into the air, turn into a ball and start rolling.” Extensive experimentation was required to create the effects-driven and antagonistic creatures known as the Druun. “It went through a lot of iterations based on the needs of the story,” states Wu. “Where we ended up was this ethereal, otherworldly thing that invades Kumandra. The Druun are composed of a lot of smoke and mist. They appear as dark particles and have an internal glow, so we got them to read well.” The Druun are organic and have lots of different shape changes within them. Explains Smeed, “What we created for the layout and effects department were Druun cycles that had basic shape and timing changes. Layout could use that when they were doing all of the cinematography. There were times where we would touch the Druun if it was reacting to a character or the timing needed to be specific for a moment. Other times the Druun would live more in the background and we could plug in cycles that went straight to the effects team.” A personal favorite for Cabral is a rival to Raya. “Namaari is a unique and strong character, not that I would necessarily want to hang out with her because she’s intense, but in terms of how the design got resolved and the way her performance is in the movie, it was amazing to see her antagonistic relationship with Raya come to life.” Smeed agrees. “Namaari comes from a place that is stoic and structured, so it was finding what makes her different from Raya so we can play up those characteristics between the two. A choreographer helped us pick out specific martial arts moves from Southeast Asia. One of our writers, Qui Nguyen, has a martial arts background so we also had conversations with him.” “On this film, the question was, ‘How do we push the art and the
efficiency of it?’” states Visual Effects Supervisor Kyle Odermatt (Winnie the Pooh). “Our effects department made a whole library of volumetric effects that would be common in the region that inspires the film – a lot of broad atmospherics such as mist because it’s humid there. Normally, our shots tend to be devoid of things like smoke and mist unless the effects department specifically gets involved. But in this case the lighting department was able to add all of that, and it has dramatically changed many of the sequences in the film.” Even though ambient water was mastered in Moana, it takes on magical properties in Raya and the Last Dragon. “If you look closely at the shot when Raya is in the temple, the water is flowing up the stairs, not down,” adds Odermatt. “How do I simulate that? It is not opposite gravity, otherwise the water would flow straight up. You have to figure out how to get it to conform to the stairs plus flow in the wrong direction. There was a bit of development when it came to that.” “The main part of my job is to understand the director’s vision and then figure out how our teams can get that up onscreen,” notes Odermatt. “At the same time, I have a component of my role where we need to be able to complete it in a timely fashion and within budgetary constraints. I am encouraging them and pushing back at the same time, depending on which day it is or which meeting I’m in. I said we could not fully show the onscreen transformation between human and dragon. I set those ground rules in the beginning because it’s too much. There were numerous transformations in Moana, and our artists were so clever that they figured out ways to do it without a complex method. Here we’ve covered up transitions with speed and, in one case, with an outfit.” “For both our cloth and hair we have custom simulation engines that we have written over the years that address the needs of the complex situations that we find ourselves in,” explains Odermatt. “It’s a process of doing both the groom and simulation at the same time, because you can’t construct a groom in a way that is non-physical and have the simulation that you add look natural. Once we have that simulation rig for that hair groom together, it goes to our technical animation group who runs it for any given shot context, like when it’s blowing in the wind. Technical animation runs and then guides that simulation. Long hair will always fall in front of the shoulders and frequently in front of the face. There might be heavy acting in those scenes, so they have to find a way to pin the hair behind the shoulders and yet feel like it isn’t disobeying physics.” “The scope and scale of this film that we wanted to create and put up onscreen was our ultimate challenge, to which the teams rose incredibly,” states Odermatt. “When you see Raya and the Last Dragon, you feel that it is a world of great diversity both in terms of characters and land. We accomplished it all within the timeframe that we had to work with and under the incredible constraint of working from home. We have an experienced crew that knows what it’s doing; if we get clear direction from our creatives, their ability to execute is second to none. They were lightly noted. Everyone got to contribute a lot of themselves in the production, which they’re happy about.”
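Odermatt’s stair-climbing water hints at a common trick, sketched below in generic form (Disney’s in-house solver is proprietary and certainly more sophisticated): project a desired ‘up the stairs’ direction onto the local surface tangent and apply it as an extra force, so the fluid hugs the steps instead of launching straight up against gravity.

```python
# Generic sketch, not Disney's solver: push water particles "uphill" along a
# surface by projecting a desired flow direction onto the local tangent plane,
# so the fluid conforms to the stairs rather than lifting off them.
import numpy as np

def tangent_flow_force(normal, desired_dir, strength=9.8):
    """Force along the surface toward desired_dir, with the off-surface
    component removed so particles stay glued to the geometry."""
    n = normal / np.linalg.norm(normal)
    d = desired_dir / np.linalg.norm(desired_dir)
    tangent = d - np.dot(d, n) * n           # strip the off-surface component
    t_norm = np.linalg.norm(tangent)
    if t_norm < 1e-8:                        # desired dir parallel to normal
        return np.zeros(3)
    return strength * tangent / t_norm

# A stair tread tilted 30 degrees, with "up the stairs" as the desired flow.
normal = np.array([0.0, np.cos(np.radians(30)), np.sin(np.radians(30))])
uphill = np.array([0.0, 1.0, 1.0])
print(tangent_flow_force(normal, uphill))    # feed into the sim as an extra force
```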

TOP: Situated on the outskirts of the fantasy world of Kumandra is the Land of Tail, with the voice of its chief provided by Patti Harrison. MIDDLE: Sandra Oh voices Virana, the chief of the Fang Lands and the mother of Namaari, who serves as the nemesis of Raya. BOTTOM: A driving force of the narrative is the close relationship between the chief of Heart, Benja (Daniel Dae Kim), and his daughter Raya, as well as his desire to bring unity to Kumandra.

VFX TRENDS

THE CREATIVE IMPERATIVE: TAKING CONTROL OF YOUR OWN PROJECT By IAN FAILES

Finding time as a visual effects or animation artist to work on your personal or passion project can be hard. But perhaps now, more than ever before, the democratization of digital tools and the demand for new content have allowed artists to go ahead and make the things they want to make. Here, a range of artists who built their experience in the VFX and animation industries on client projects, and who have now branched out into making their own content, share their stories – and their advice for others who may be looking to do something similar.

FROM VFX PRODUCTION TO WAR

TOP: The crew from A Battle in Waterloo, seen here during location photography, were predominantly female. (Image courtesy of Emma Moffat) OPPOSITE TOP: Jessie Buckley stars in A Battle in Waterloo, which saw director Emma Moffat capitalize on her VFX production skills in helming the film. (Image courtesy of Emma Moffat) OPPOSITE BOTTOM: Colin Levy’s Skywatch had the surprise inclusion of actor Jude Law. The short is now being developed as a series. (Image courtesy of Colin Levy)

Writer/director Emma Moffat got her start in the industry as a set PA on Inception before working at DNEG in visual effects coordination and production roles. In 2019, her short film A Battle in Waterloo received funding from the Bumble Female Film Force competition aimed at female filmmakers. The short, about a soldier’s wife searching for her missing husband, was filmed in Epping Forest, East London, and opened up a new world for Moffat. “The script was ambitious, being a period short with action sequences and a sizeable cast, special effects and costumes,” Moffat says. “I’d had the script for a few years and had struggled to find funding, but was determined to make the short. Thankfully, the
Female Film Force competition selected my script and gave us the necessary funding to pull it off.”

“Try to use basically any available way to foresee the way your project is supposed to look like... You need to understand the complete cost of your project creation including all risks and force majeure situations. Your late nights and weekends next to the computer need to be planned and accounted for as well, even if it’s your personal project and/or you’re doing it for free.” —Denys Shchukin, Writer/Director

Moffat, who is now developing other projects while also continuing to work in visual effects production, notes that her background in VFX comes in very handy for writing and directing. “Knowing what you can achieve with the help of VFX, and just as importantly knowing when not to use it if you’re on a tight
budget/timeline is invaluable. Whether or not you’re utilizing VFX, the knowledge of how to creatively solve issues and having an understanding of timelines and expenses is vital, particularly when working on shorts. “In VFX production,” adds Moffat, “we’re in the room with directors during reviews, so it’s a hugely privileged position to learn from the best. My advice to others would be to listen to the notes
being given. They often give a valuable insight into the narrative and style the director is developing at large. It goes without saying that VFX artists are uniquely positioned to write and direct projects that showcase VFX if they’re able to work on the project themselves – it really elevates shorts to a high finish.”

“I was very used to dropping everything for my clients and making their projects work despite time zones or deadline conflicts, and so it wasn’t a great shift to then prioritize my own project.” —Allison Brownmoore, Artist/Writer/Animator

A PATH AFTER PIXAR

Los Angeles-based writer/director and ex-Pixar layout artist Colin Levy always had a goal to make his own content. It’s something he has realized via a number of projects, including several Blender Animation Studio films and his 2019 short Skywatch, a project that had the surprise inclusion of actor Jude Law and is now being developed into a sci-fi series for NBC’s Peacock. “Since I was a kid, I’ve been obsessing over my own little creative experiments, which have grown into bigger and better projects,” outlines Levy. “Once you’ve got a taste of how it feels to be the author, to run the ship, to release a piece of your soul into the world, it’s hard to turn that off. I had an absolute blast working at Pixar, but I realized early on that I had to continue pursuing my personal work on the side. That’s where I was going to find greater purpose in my life.” The live-action Skywatch featured plenty of CG and animation – much of it achieved in the open-source tool Blender – so Levy’s past work certainly came into play during production. “My layout experience allowed me to explore and plan and pre-edit the film so that I could feel confident about what we needed to achieve on the day. With so much of the film relying on missing elements that would later be created in post, it gave the whole crew and the actors a better sense of context from moment to moment.” Seeing a project through is one of the strengths Levy says he has gained from working on many personal endeavors. Skywatch took seven years to complete, and it was during this time that Levy and his co-writer Mike Sundy produced a 120-page feature screenplay based on the short. “This basically became the core of the 10-minute pitch that led to our TV deal,” Levy explains. “Literally years of preparation boiled down to that moment. But fortunately, at the end of the day, the promise of the larger story combined with the strength of the short proved to be enough. Can’t tell you what a relief that was!”

TOP: A VFX breakdown of the drone from Levy’s Skywatch. Blender became a key tool in the indie production. (Images courtesy of Colin Levy) BOTTOM: Allison Brownmoore’s illustrations for The Amazing Adventures of Awesome. (Image courtesy of Allison Brownmoore)

TAKING ON VFX-HEAVY DIRECTING

Director Denys Shchukin has worked in visual effects in several roles at a number of studios, including Framestore and Image Engine. The Vancouver-based artist continues to work as a supervisor but recently had a foray into directing for Ukrainian singer
Ivan Dorn’s music video for the song “Wasted.” The fully-CG video saw the singer motion-captured and then turned into a digital avatar with several ‘powers.’ Shchukin pitched the project to Dorn based on what he says were “inner ambitions as a director, producer and scriptwriter. I was very lucky to find a client who liked my idea and vision, and also a singer/musician who let me take complete ownership of the project.” Artists were assembled across the world, with Shchukin orchestrating his team to produce some very CG/animation/FX-heavy shots. Supervisor experience proved key for turning the project around, attests Shchukin. “Working on A-rate, heavy VFX blockbusters, you very quickly learn how to plan the whole show, bid hours, calculate budgets and expenses, plan department schedules, reviews, comments, etc. Especially if your department is closer to the end of the pipe workflow where you most likely need to plan everything from the Shotgun field creation and first polygons to final color grading and delivery to client in all possible codecs and formats.” In putting together the “Wasted” music video, Shchukin says his two main take-aways were the importance of visual planning – “Try to use basically any available way to foresee the way your project is supposed to look like” – and understanding the budget. “You need to understand the complete cost of your project creation including all risks and force majeure situations. Your late nights and weekends next to the computer need to be planned and accounted for as well, even if it’s your personal project and/or you’re doing it for free.”

ANIMATED ADVENTURES

Allison Brownmoore is a BAFTA-nominated Design Director and Founder of London design boutique firms Blue Spill and Past Curfew.

“We are in a time now where filmmaking technology is in the grasp of your fingertips. ... It’s now really not about whether you can get the best tech to tell your story, it’s now about having the right story to tell and your vision to execute it.” —Hasraf ‘HaZ’ Dulull, Director/Producer

TOP: A still from Denys Shchukin’s “Wasted” music video for Ivan Dorn. (Image courtesy of Denys Shchukin) BOTTOM: Animation-in-progress screenshot from ‘Wasted.’ Ivan Dorn was motion captured and photogrammetry scanned, with animators taking the performance further. (Image courtesy of Denys Shchukin)

She began work in the industry in commercials, promos and idents before designing for film. With her hand-illustrated animated short about an autistic child, The Amazing Adventures of Awesome, Brownmoore overcame the challenge of setting aside time to jump into content creation. “The last short I’d made was back at university 20 years ago,” details Brownmoore. “Over the years I’d had a few ideas and a few false starts, but the turning point with The Amazing Adventures of Awesome was finding a story that was so important I knew I had to tell it. Finding the time then actually became quite easy, because the project became as imperative as my day job. I was very used to dropping everything for my clients and making their projects work despite time zones or deadline conflicts, and so it wasn’t a great shift to then prioritize my own project.” Collaborating with others, Brownmoore’s script – based on her own experience with her autistic son – and animatic became final animated scenes in After Effects and Flame. The self-funded short presented Brownmoore with several challenges, not only for the visuals. She had to consider, too, sound and music, and eventually a way of getting people to see it. “I’ve seen great films in the past which have ended up on shelves because everyone runs out of steam after the project, or gets into a festival or two and then abandons the hard slog,” Brownmoore shares. “And it is a lot of work – the festival submissions themselves, then after acceptance doing the press and getting your submission and admin across, then updating the website, etc. However, it’s this side, the unspoken side, which is so incredibly important, because without it, no one is seeing the film! Thankfully, there are great agencies like Festival Formula who can help take some of the burden of this. Definitely worth putting some money aside for the back end of the process.”

“It goes without saying that VFX artists are uniquely positioned to write and direct projects that showcase VFX if they’re able to work on the project themselves – it really elevates shorts to a high finish.” —Emma Moffat, Writer/Director

TURNING TO CROWDFUNDING IN ORDER TO CREATE

TOP: Director Emma Moffat on the set of her A Battle in Waterloo. (Image courtesy of Emma Moffat) MIDDLE: Hasraf Dulull operates a virtual camera to stage a scene. (Image courtesy of Hasraf Dulull) BOTTOM: Dulull has adopted a virtual production approach for his upcoming Mutant Year Zero: Road to Eden, relying principally on Unreal Engine. (Image courtesy of Hasraf Dulull)

Josh Johnson and Tim Maupin are visual effects practitioners who wanted to make a film together and felt this could be achieved via crowdfunding. Platforms such as Kickstarter and Indiegogo are, of course, one of the many ways artists can support the creation of their own content. Their successful campaign on Kickstarter resulted in the short film Breathe, about the impact of climate change. “We put quite a bit of effort into the campaign, from extensive visual mood boards and images, to visiting locations to get a head start on how we wanted the film to look and feel,” outlines Johnson, a visual effects supervisor who has contributed to films such as A Ghost Story, Native Son and Omniboat: A Fast Boat Fantasia.

“We shot footage that would help to illustrate our concept,” adds Maupin, who mostly works as a visual effects generalist, “and had some very helpful friends from a small production company who shot our interviews and some test drone shots – they even got their drone stuck in a tree for us!” During production, the team overcame the challenges of shooting, especially during a major storm, and of dividing up more than 60 VFX shots for the short, ranging from driving comps to environment changes and a complex CG tree. But Johnson and Maupin persevered. And despite being experienced in VFX, they deliberately chose to keep the project manageable from a production point of view. “The characters, story, emotion and ideas are always going to be the most important,” observes Johnson. “Just because you know VFX, I would still try to use them in moderation and remember they are best used to support your story.”

FULLY EMBRACING VIRTUAL PRODUCTION

For several years now, Visual Effects Supervisor turned director/producer Hasraf ‘HaZ’ Dulull, based in the U.K., has been able to lean on the skills and knowledge he gained in VFX production to help create a number of indie projects, ranging from his shorts, Project Kronos and SYNC, which ultimately led to Hollywood representation for Dulull, to directing and producing his features The Beyond and 2036 Origin Unknown, as well as the Disney+ mini-series Fast Layne. One of Dulull’s latest efforts – a full-length CG-animated feature film called Mutant Year Zero: Road to Eden, based on the Funcom video game – has seen Dulull take on board virtual production and game engine techniques. The director views this tech as especially advantageous during the COVID-19 pandemic when access to film sets has been limited, and for creating a new way of working. “What was originally supposed to be some concept renderings and pre-production material ended up being full-blown rendered sequences, because there was no rough/block out version of a shot like you would get with conventional CG animation,” explains Dulull. “We had assets from the game developers which we then up-res’ed to 8K textures with Substance Painter and then lots of cool shader work in Unreal Engine, and I was animating the cameras and lighting the scenes with ray tracing all in real-time, so when we render out the EXR frames from Unreal Engine, what we are rendering out is final pixels, i.e. no compositing, and then putting that into Resolve for editing and color grading, and that’s it. “We are in a time now where filmmaking technology is in the grasp of your fingertips,” continues Dulull. “Meaning, we don’t have to spend thousands of dollars on a single software license, you can now rent the software on subscription, and also tech like Unreal Engine is free! On top of that, the knowledge is freely available with YouTube tutorials, etc. It’s now really not about whether you can get the best tech to tell your story, it’s now about having the right story to tell and your vision to execute it.”
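In a final-pixel workflow like the one Dulull describes – EXR frames rendered straight out of Unreal Engine and conformed in Resolve – one mundane but essential step is confirming a frame sequence is complete before delivery. The helper below is a hypothetical, stdlib-only sketch, not Dulull’s tooling; the naming convention and path are placeholders.

```python
# Hypothetical helper: before conforming final-pixel EXR frames in an
# editorial tool, verify the rendered sequence has no gaps. Assumes frames
# named like shot010.0001.exr sitting in one directory (placeholder layout).
import re
from pathlib import Path

def missing_frames(folder: str, pattern: str = r"\.(\d{4})\.exr$"):
    frames = sorted(
        int(m.group(1))
        for p in Path(folder).iterdir()
        if (m := re.search(pattern, p.name))
    )
    if not frames:
        return []
    expected = set(range(frames[0], frames[-1] + 1))
    return sorted(expected - set(frames))

gaps = missing_frames("renders/shot010")   # placeholder path
print("sequence complete" if not gaps else f"missing frames: {gaps}")
```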

TOP: A final still from the animated The Amazing Adventures of Awesome. (Image courtesy of Allison Brownmoore) MIDDLE: The Breathe crew on location. The film was a chance for the directors to branch out from VFX work. (Image courtesy of Josh Johnson) BOTTOM: One of the VFX elements required for Breathe was the ‘tree monument,’ achieved via CG foliage. (Image courtesy of Josh Johnson)

COVER

TALES FROM THE GALAXY’S EDGE VENTURES TO BATUU AND THE BLACK SPIRE By CHRIS McGOWAN

Images courtesy of Lucasfilm and ILMxLAB. TOP: In-game screenshot looking out at Black Spire Outpost from Seezelslak’s cantina in Star Wars: Tales from the Galaxy’s Edge. Walt Disney Imagineering provided scans of the buildings in Black Spire Outpost. ILM and Advanced Development Group artists, who worked directly on the Millennium Falcon: Smugglers Run attraction, joined forces with the VFX effort to faithfully reproduce Black Spire Outpost. OPPOSITE TOP: The player holds a blaster in Star Wars: Tales from the Galaxy’s Edge, which is both an interactive narrative experience and a VR video game. Epic Games’ Unreal Engine was used in its production.

For the follow-up to its acclaimed Vader Immortal: A Star Wars VR Series, ILMxLAB decided to go less linear and more interactive. Star Wars: Tales from the Galaxy’s Edge “has a much greater sense of freedom,” says director Jose Perez III, while “Vader Immortal was a rather scripted and linear experience. We really wanted to allow players more space to explore and soak up the world. This meant we had to build environments and objectives in a way that would allow players to opt into their desired activity. “If they want to run off into the wilds and fight pirates, they can do that,” he explains. “If they would rather just hang out in the bar and listen to the six-eyed bartender Seezelslak crack jokes, that’s also an option.” Lead Animator Kishore Vijay adds that the Tales experience “has more branching narratives and side missions. It’s also more modular so we can build upon the breadth of the world and characters.” Star Wars: Tales from the Galaxy’s Edge was produced by ILMxLAB in collaboration with Oculus Studios and launched last November on the Oculus Quest VR platform. The ILMxLAB team also included Steven Henricks (Visual Director), Jennifer Cha (Lead Animator), Alyssa Finley (Producer), Mark S. Miller (Executive Producer), and Ian Bowie and John Nguyen (Lead Experience Designers). It was written by Ross Beeley, with additional writing by Bryan Bishop. The experience melds an interactive narrative with multiple styles of gameplay and takes place on
planet Batuu in the time between Star Wars: The Last Jedi and Star Wars: The Rise of Skywalker. The primary narrative features new and iconic Star Wars characters and is connected to the Black Spire Outpost, a prominent locale in the new Star Wars: Galaxy’s Edge lands of Disneyland and Disney World. As the player, you are a droid repair technician whose ship is boarded by Guavian Death Gang pirates and their leader Tara Rashin (Debra Wilson). To save your skin, you must jettison a mysterious cargo and flee in an escape pod to Batuu. Once there, at the Black Spire Outpost, you interact with an Azumel barkeep named Seezelslak (voiced by SNL alum Bobby Moynihan), whose cantina is the main hub of the action, while you seek to recover the precious cargo – which happens to be old Star Wars stalwarts C-3PO and R2-D2.

“We strive for cinematic-quality visuals but are limited by having to render everything in real-time. Every inch of the experience, both on our CPU and GPU, had to be finely tuned and optimized to get the most out of our experience. … So we couldn’t guarantee the player would be locked into a position or see the things we want them to see. This freedom motivates us to make the entire world rich.” —Steven Henricks, Visual Director

You can hang out with Seezelslak and toss Repulsor Darts, visit the Droid Depot owner Mubo, explore Batuu, or blast pirates and face off with Tara. Also, via Seezelslak’s storytelling, you can immerse yourself

in Temple of Darkness, a standalone story (called a “tale”) set hundreds of years ago. Ady Sun’Zee is a Jedi Padawan studying at a remote research facility on Batuu. As Ady, with lightsaber in hand, you must confront an evil relic with the help of Jedi Master Yoda (voiced by Frank Oz). It all takes place in the time of the High Republic, which is a new era for Star Wars storytelling. More narrative-driven tales for Tales are expected in successive DLC packs. It was a challenge making Tales and moving the story forward while “giving the player more agency,” notes Cha. Early in the production, Perez made a conscious decision that they would not force the player to be still during any of the experience, including times when cinematic narrative scenes were playing out. “So,” outlines Henricks, “the challenge became how to draw the player toward these moments naturally. We paid close attention to the environment design and worked to make these moments the focal points of the scene. The detail and complexity of set dressing


around or directly behind areas of interest was often increased to provide that extra point of interest for the player. Cinematic lighting and targeted sound design by Skywalker Sound also played a key role in making our cinematic moments ones the player wouldn’t want to turn away from.” Vijay adds, “Performance, audio, environment design and lighting all play a role.”

TOP: In Mubo’s workshop in his Droid Depot, one can repair and program droids used throughout Black Spire Outpost. BOTTOM: Bobby Moynihan’s facial performance was captured at the same time as his voice recording for the six-eyed bartender Seezelslak. Facial cues for Seezelslak’s performance were taken from the animatronic puppet used for the “Six Eyes” character from Solo: A Star Wars Story.

Maintaining high-quality animation was another challenge. The volume of work increased for several reasons, including branching animations and AI trees, as well as longer shots and the resulting heavy animation files, according to Vijay. “We don’t really have cuts,” he says. “The characters cannot really go off frame easily, so everything has to be animated to work from different viewpoints and distances. There are also technical considerations for interactivity and blending for branching cinematics and AI animation that the animators have to be cognizant of.” “We strive for cinematic-quality visuals but are limited by having to render everything in real-time,” adds Henricks. “Every inch of the experience, both on our CPU and GPU, had to be finely tuned and optimized to get the most out of our experience.” Unlike Vader Immortal or a traditional film, Tales allows players to freely explore their environments, “so we couldn’t guarantee the player would be locked into a position or see the things we want them to see. This freedom motivates us to make the entire world rich.”

Tales sought a consistent visual style that would be “a virtual extension of what you see in the Galaxy’s Edge areas of Disneyland and Disney World,” explains Henricks. “Walt Disney Imagineering (WDI) was able to share their design bible and countless reference images of the park with us. This allowed me to understand the original intent of the park and the stories that are embedded within it. Star Wars: Tales from the Galaxy’s Edge was always a way for ILMxLAB to explore what was beyond Black Spire Outpost. And, with the guidance provided by WDI and the Lucasfilm Story Group, we were able to bring some newly imagined parts of the planet of Batuu to life while staying true to [the] Galaxy’s Edge conceptual vision.”

One aspect of sharing that vision was bringing over artists and assets from the Smugglers Run ride in the Galaxy’s Edge lands. During the production of Tales, “several ILM and ADG artists who worked directly on the Millennium Falcon: Smugglers Run attraction joined the team to help us faithfully reproduce Black Spire Outpost and some of our exterior vistas,” says Henricks. Assets from the attraction were optimized and converted into real-time-friendly assets for Tales. He adds, “Bringing the people and art over from Millennium Falcon: Smugglers Run really helped us nail the feeling of Black Spire Outpost, and that artistic influence can be seen.”

“We really wanted to provide people with a glimpse into the vastness and variety of the planet Batuu. Whether you are seeing it from space in Mubo’s cargo ship, through Seezelslak’s cantina window, or experiencing the wilds, it all started with inspiring concept art. ILMxLAB utilizes artists from the amazing ILM and Lucasfilm art departments to conceptually develop the experiences we produce,” notes Henricks. Adds Cha, “We were lucky



to have animators on our team who have worked on many of the Star Wars films, so we were drawing from a wealth of experience to create authenticity in our characters.” “The animators crafted some really good animation, building upon a foundation of fantastic performances from the actors and mocap artists,” adds Vijay. “Michael Beaulieu, our Animation Supervisor, had plenty of experience working on Star Wars features and helped us keep consistency and maintain quality for our marquee characters in the game. And several of our animators from ILM had already worked on the movies before.”

The motion capture was extensive, including even mocap of quadruped creatures. “This was necessary for us to be able to complete the volume of work on schedule. Even so, the mocap had a good deal of hand-key work layered on, depending on the character. Non-humanoid droids, notably R2-D2, were all hand-keyed,” says Vijay.

Facial performances were captured to give the animators a solid base to work from, notes Vijay. “We captured Bobby Moynihan’s facial performance at the same time as his voice recording for the six-eyed bartender Seezelslak, which is reflected in the final animation. We also took cues for Seezelslak’s performance, especially his face, from the animatronic puppet used for the ‘Six Eyes’ character from Solo: A Star Wars Story,” comments Vijay. “Yoda was animated with an effort to stay true to his appearances in the features. We looked at both the puppet and CG versions for reference.”

Scanned resource images and content are becoming an immensely powerful tool in the creation of realistic worlds and characters, notes Henricks, and on Tales a good number of scanned resources were used as the base for natural formations. “Scanned textures for rocks, cliffs, terrain and foliage were all utilized as reference and combined with original art to build the majority of the ‘wilds.’ We also used scans of character costumes used in the films. Seezelslak began as a scan of the articulated costume used for the character Argus Panox in Solo. This scan gave us great 3D and texture reference used to model, sculpt and texture the final Seezelslak asset.”

Tales utilized Epic Games’ Unreal Engine, as had Vader Immortal. “A large number of solid, optimized systems were engineered during the production of Vader Immortal, and we were able to utilize that tech as we upgraded our engine version, including the customizations made [by] our Advanced Development Group,” comments Henricks. For Tales, “more systems were engineered and optimized, including our brand-new custom AI system. Unreal’s power and flexibility allowed our tech art team to expand upon the tools that already existed within the engine. These upgraded tools allowed us to vastly expand the size of the environments in Tales.”

Concludes Perez, “It was important to us that we help put a little bit of joy into the world before the end of 2020. I’m proud of the fact that this team was able to build this entire project in less than a year, with the majority of production happening remotely during a pandemic. That’s a Herculean effort, and it’s an honor to be a part of a team that can pull something like that off.”


TOP: A handy “multi-tool” helps the player solve mechanical puzzles and repair combat droids during game play. BOTTOM: Debra Wilson portrays the space pirate Tara Rushin. In a Zoom call, director Jose Perez III guided Wilson as she recorded her vocal performance, while Tara’s physical performance was captured by Associate Experience Designer Karessa Bowens. All the pieces were ultimately put together and mapped to the digital model of Tara.



TECH & TOOLS

HOW TO SIMULATE A WATERFALL IN HOUDINI AND MAYA BIFROST By IAN FAILES and IGOR ZANIC

TOP LEFT: FX technical director and visual effects trainer Igor Zanic, co-founder of Rebelway. BOTTOM LEFT: Rebelway Creative Director Urban Bradesko works on a scene in Houdini.

In visual effects, artists are often given a problem to solve. They will need to reach the desired outcome of the client, but it can often be up to the artist or studio which tool they use to solve the problem. What if you were asked to build a flowing waterfall for a scene – where would you start? What tool or tools would you use to simulate the flowing water of a waterfall? Arguably the two most used and accessible tools for such a task in VFX right now are SideFX’s Houdini and Autodesk’s Maya Bifrost. Houdini is well known for its procedural approach to animation and effects and offers a number of solvers, such as FLIP, pyro and particles, for handling simulation tasks. Bifrost in Maya is now a full-blown visual programming language that also allows for the generation of procedural effects, although when first implemented into Maya its strength was in fluid simulations. Here, FX technical director and visual effects trainer Igor Zanic, a co-founder with Saber Jlassi of online VFX training website Rebelway.net, which specializes in FX and procedural effects-related courses, outlines in simple terms how an artist might begin the process of crafting a waterfall in both Houdini and Bifrost. Zanic is an experienced Houdini user, but he was also one of the first freelance artists to try out the precursor to Bifrost, a fluid simulation tool called Naiad. He has contributed to a number of projects that were heavy in water simulations and other FX sims, including Shark Night 3D, Kon-Tiki and The Man in the High Castle. Zanic has also worked previously for both SideFX and Autodesk as an artist and technical director. In the steps below, Zanic shows how similar-looking waterfalls, where water is flowing – essentially colliding – over rocks, can be simulated in both Houdini and Bifrost with broadly



similar procedures. Some of the differences relate to generating the fluid and whitewater simulations, foam generation, and the caching and meshing process. While these steps might not instruct you in exactly which buttons to push in each tool or which nodes to create and use, they will hopefully give you a guide to getting started in each piece of software.

MAKING A WATERFALL IN HOUDINI

1. Preparing assets for simulation
When we import our geometry into Houdini, the first thing we want to be sure of is that it has the correct scale. Sometimes an object coming from other software will appear at a different scale. The next step is to optimize and prepare the geometry for simulation. This includes doing things like ensuring that the geometry is closed so there are no open edges. Also, if the object is high-res, we create proxy geometry. After optimization, we can set up our collision geometry for simulation, creating proxy collision geometry and volume.

2. Prepare source for simulation
Before we set up our waterfall fluid simulation, we need to decide where we want to put our emitter for the fluid simulation. This emitter can be a single object or a few smaller objects. Then we will need to add initial velocity to push the fluid in one direction, and also add a bit of noise on top to break up the uniform source.

3. Set up and cache fluid simulation
After we set up our collision geometry and source, we can set up our fluid simulation. The next step is to tweak parameters such as resolution, fluid simulation boundary and a few other attributes to arrive at a suitable simulation look. The last step here is to cache our fluid simulation.

4. Set up and cache whitewater simulation
Using our cached fluid simulation, we create a whitewater source, isolating just the areas where we want our whitewater simulation to emit. The next step is to cache our whitewater source so it is faster for the whitewater simulation to read. We then set up the whitewater simulation and cache that final simulation.

5. Mesh fluid simulation
Using our cached fluid simulation, we combine particles and volume to create our final mesh. After the mesh is done, we transfer attributes from the fluid simulation particles to the mesh. The final step is to cache our fluid mesh, and this allows us to continue creating a waterfall scene in Houdini, or elsewhere. A minimal scripted sketch of these five steps follows below.
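To make the steps concrete, here is a short sketch of how that network might be wired up with Houdini’s Python module, hou. This sketch is ours rather than Zanic’s: the node type names follow Houdini’s SOP-level FLIP and whitewater tools from recent versions and may differ in your build, the input ordering on the solver is illustrative, and the file path and parameter values are placeholders.

import hou

# Build a container object for the whole waterfall setup.
geo = hou.node("/obj").createNode("geo", "waterfall_sim")

# Step 1 -- prepare assets: import the rocks and correct their scale.
rocks = geo.createNode("file", "import_rocks")
rocks.parm("file").set("$HIP/geo/rocks.bgeo.sc")  # placeholder path
fixed = geo.createNode("xform", "fix_scale")
fixed.setInput(0, rocks)
fixed.parm("scale").set(0.01)  # e.g. asset authored in cm, sim in meters

# Step 2 -- prepare the source: a small emitter pushed in one direction.
emitter = geo.createNode("box", "emitter")
source = geo.createNode("flipsource", "fluid_source")
source.setInput(0, emitter)

# Step 3 -- the FLIP simulation: container, solver and collider, then cache.
container = geo.createNode("flipcontainer", "fluid_container")
solver = geo.createNode("flipsolver", "flip_solver")
solver.setInput(0, container)  # input slots are illustrative; check the
solver.setInput(1, source)     # node help for the real container/source/
solver.setInput(2, fixed)      # collider ordering in your version
sim_cache = geo.createNode("filecache", "cache_flip")
sim_cache.setInput(0, solver)

# Step 4 -- whitewater: source it from the cached sim, then solve and cache.
ww_source = geo.createNode("whitewatersource", "ww_source")
ww_source.setInput(0, sim_cache)
ww_solver = geo.createNode("whitewatersolver", "ww_solver")
ww_solver.setInput(0, ww_source)
ww_cache = geo.createNode("filecache", "cache_whitewater")
ww_cache.setInput(0, ww_solver)

# Step 5 -- mesh the particles and cache the final surface for lookdev.
surface = geo.createNode("particlefluidsurface", "fluid_mesh")
surface.setInput(0, sim_cache)
mesh_cache = geo.createNode("filecache", "cache_mesh")
mesh_cache.setInput(0, surface)
geo.layoutChildren()

In practice you would dial in the particle separation (resolution) and simulation boundary on the container, and layer a noise force over the source velocity, before committing to the caches.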


MAKING A WATERFALL IN BIFROST

1. Prepare scene and assets for simulation
While these steps for Maya Bifrost are similar to those outlined for Houdini, there are a couple of differences, so please note them. When we import our geometry into Maya, the first thing we want to be sure of is its correct scale, as in Houdini. The next step, again, is to optimize and prepare the geometry for simulation. Remember, this involves ensuring the geometry is closed so that there are no open edges. Also remember that if the object is high-res, it’s important to create proxy geometry.

2. Create Bifrost Liquid and add source
Before we set up our waterfall fluid simulation in Bifrost, we need to decide where we want to put our emitter, just like with Houdini. It can be a single object or a few smaller objects. We select our emitter and add Bifrost Liquid from the shelf, and then we also create a source tab with all the relevant settings. After that, we add the initial velocity to push fluid in one direction, and then we set up our liquid properties such as resolution.

3. Add collision geometry and cache fluid simulation
After we set up our initial Bifrost Liquid container, the next step is to add collision geometry. Sometimes it is better to split every piece of geometry into separate collision geometry, or you can group objects for better control. We also now create a Bifrost kill-plane to limit our waterfall so that it’s not falling into infinity. The last step here is to cache our fluid simulation.

4. Set up and cache foam simulation
Using our cached fluid sim, we create a foam source, isolating only areas where we want the foam simulation to emit. The next step here is to tweak our desired foam parameters, and then we cache our foam simulation.

5. Mesh fluid simulation
Using our cached fluid sim, we create our final mesh. We tweak the settings for our mesh to get the look we want. The final step is to cache our fluid mesh, which enables us to continue to develop the scene in Maya or elsewhere. Again, a scripted sketch of these steps follows below.
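Here is an equally minimal Maya Python sketch of the Bifrost route. This, too, is our illustration rather than Zanic’s setup, and it leans on assumptions: the plug-in name and runtime commands mirror the classic Bifrost Fluids menu items but should be verified in your Maya version, and the attribute names in the comments are placeholders.

import maya.cmds as cmds
import maya.mel as mel

# Load the classic Bifrost Fluids plug-in (plug-in name assumed).
cmds.loadPlugin("bifrostshape", quiet=True)

# Step 1 -- prepare the scene: import the rock geometry, confirm its scale,
# and keep a closed, lower-res proxy for collisions.
cmds.file("/path/to/rocks.ma", i=True)  # placeholder path

# Step 2 -- create the liquid from a selected emitter object.
emitter = cmds.polyCube(name="emitter", width=2, height=0.5, depth=0.5)[0]
cmds.select(emitter)
mel.eval("CreateBifrostLiquid;")  # menu: Bifrost Fluids > Liquid (name assumed)
# Initial velocity lives on the emitter properties, resolution on the
# liquid container, e.g. (attribute name assumed):
# cmds.setAttr("bifrostLiquidContainer1.masterVoxelSize", 0.1)

# Step 3 -- add the rocks as colliders and a kill-plane, then cache the sim.
# Select the collision meshes plus the liquid, then run the menu items:
# mel.eval("BifrostAddCollider;")      # assumed runtime command
# mel.eval("BifrostCreateKillplane;")  # assumed runtime command

# Steps 4 and 5 -- foam emission and final meshing are enabled on the same
# container before writing the user cache (attribute names assumed):
# cmds.setAttr("bifrostLiquidContainer1.enableFoam", 1)
# cmds.setAttr("bifrostLiquidShape1.enableMeshing", 1)

The commented-out lines mark where menus and attributes vary the most between Maya releases; a reliable path is to perform those steps once through the Bifrost Fluids menus and read the resulting node and attribute names back from the Script Editor echo.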





FILM

BRINGING OUT THE HEAVY WEAPONS TO BATTLE THE BEHEMOTHS IN MONSTER HUNTER By TREVOR HOGG

Images courtesy of Constantin Film and Screen Gems. TOP: Milla Jovovich and Tony Jaa had to perform with the signature oversized weapons from the video game. OPPOSITE TOP: The sailing vessels from Black Sails were redressed as the sand ships in the parking lot of Cape Town Film Studios. OPPOSITE BOTTOM: One of the biggest practical sets built by Production Designer Edward Thomas was the ships’ graveyard.

Even though video game adaptations have yet to dominate the box office like their comic book counterparts, one cannot fault British filmmaker Paul W.S. Anderson, who is responsible for the Resident Evil franchise that encompasses six films and has grossed $1.2 billion worldwide. The partnership between Japanese video game developer and publisher Capcom and Anderson continues with Monster Hunter, which takes place in an alternative world filled not with post-apocalyptic zombies but hostile creatures the size of skyscrapers. “I don’t know if there’s ever a formula for making movies, and if I knew it I certainly wouldn’t give it away!” laughs Anderson, who helmed Mortal Kombat before Resident Evil. “You’re walking a fine line between pleasing hardcore fans who know everything about the intellectual property and reaching out to a broader audience who maybe don’t even play video games.”

Anderson has maintained his attitude toward visual effects despite a growing need for them. “I was fortunate early on in my career to work with a great visual effects supervisor, Richard Yuricich, who had come up working on movies like 2001: A Space Odyssey and Blade Runner. He had this mantra that his favorite kind of visual effect had no visual effects in it. Richard always pushed us to do as much as possible real. My approach on this movie was if the creatures have to be CG, let’s shoot on real landscapes rather than in studio backlots against a greenscreen. Every time a creature’s foot goes down on the ground it displaces and showers our actors with real sand, and the lens flare from the sun will be real as well as the wind. It gives the animators an awful lot to match into as well as helps to tie the creatures into the reality of the existing location.”

Capcom was heavily involved in the design process and supplied assets from Monster Hunter: World. “One of the reasons why I



wanted to adapt Monster Hunter was that I fell in love with the look of the creatures,” states Anderson. “I thought they were unique and original. I have worked with creatures in the past, but this was a whole new array of them. I could basically follow the designs that already existed, and that gave us a great jumping-off point, but there is also a lot of fan satisfaction in seeing the monsters come to life

exactly as you know them.” The cinematic versions of the monsters are not exact replicas. “With the monsters you’re building them at a different level of detail than a video game engine could ever handle,” says Anderson. “Dennis Berardi [The Shape of Water], our Visual Effects Supervisor and co-producer, and his team sat down and analyzed



the way that the creatures moved in the game and compared that to creatures of a similar bulk in our world [such as elephants and rhinoceroses] and how they would move with gravity operating on them. A footfall of a creature weighing a certain amount must displace a certain amount of sand or whatever material it’s running on. Something of a certain size normally moves at a certain speed.”

TOP TO BOTTOM: Jovovich portrays Lt. Artemis, who gets mysteriously transported to an alternative world filled with enormous deadly creatures. After playing the video game as research, Jovovich decided that her weapon of choice would be the dual swords. Jaa is the closest person to being an actual superhuman that filmmaker Paul W.S. Anderson has ever met. The Apceros were introduced in the original Monster Hunter video game.

“One of the reasons I’ve stuck with Paul for a lot of years is because his process is fun,” states Berardi. “It is a real partnership. He is open to ideas right down the line, not just with me but with the artists.” Every key sequence was storyboarded and translated into 2D animatics to get a sense of timing. “In some cases the animatic was enough, but when technical shooting was involved we did full 3D previs,” adds Berardi. “We were always trying to put in scale references for these creatures along with respecting what a cinematographer would do to photograph them.” Anderson favors backlighting to get dramatic silhouettes. “We’re in the alien world here, so we were able to take some liberties with light sources,” explains Berardi. “We put in a light source up there as a moon and had another light source with another moon. We had these backlit clouds, which gave us a nice way to punch out the creatures [in the nighttime]; otherwise, you wouldn’t see anything.”

Production Designer Edward Thomas previously worked on Resident Evil: The Final Chapter and the YouTube series Origin with Anderson. “Paul and I have a mutual trust and a shared vision. There is no question that Paul makes me a better designer as he causes me to think and approach things differently.” It was important to be able to leverage the history of the franchise. “Monster Hunter was a lovely challenge because so much of the information is there in the games for us,” says Thomas. “What I did on Resident Evil: The Final Chapter and Monster Hunter was to find the best possible gamers wherever they are and ask them to take me around the game. I’ll say to them, ‘Show me this or take me to a cave.’ I’ll ask questions and get feedback from them on what is important. Then I try to put as much of that into the film as possible because we’re making the movie for the fans as well as people who will become fans. Just doing that makes the game companies happy. When we made the weapons, we’d run



everything by the Capcom team to make sure that the materials and paint finishing worked for them. It is an enjoyable process because they are the creators and you want to do it justice.”

Sixty-five minutes of screen time consist of 1,300 visual effects shots created by MR. X facilities in Toronto, Montreal and Bangalore, as well as at South African-based BlackGinger. “We had one situation where Kaname Fujioka [the director of the Monster Hunter games] and the team at Capcom were like, ‘Diablos looks amazing, but her toenails are too sharp,’” recalls MR. X Visual Effects Supervisor Trey Harrell. “Diablos is an herbivore, so the features should be more like a rhino or elephant: rounded tusks with no pointy sharp bits. The most interesting thing to me that I found over the course of this whole journey was there is a certain amount of hubris involved when you start on a property and go, ‘Now we’re making a movie version of this.’ But a lot of times you do that before you understand the design in the first place. Everything was there for a purpose.”

“Because I play the game and am quite familiar with the creatures, I was trying to include some of the big stars,” reveals Anderson. “Within the fan favorites, I was taking creatures that would offer me different aspects of combat. If I used nothing but flying creatures, the combat could become repetitive. Instead, we start with the Diablos, which burrows underground and suddenly bursts up; it has a feeling of a shark but of the land. It could be under the sand. You could hide it quite a lot and pop [it] up surprisingly. Those are all good aspects for a fight scene. The next creature that we used is the Nerscylla, which lives in underground lairs. Darkness brings a whole new mise en scène and look to the combat. We save the big flying creatures for the climax of the movie.”

There are no lingering shots to assist in conveying the weight of the creatures, as editor Doobie White (Polar) opted for a quick cutting style. “Early on we knew that this was going to happen, so we developed a lot of tests of different actions,” remarks MR. X Animation Supervisor Tom Nagy. “It’s not often that we’ve worked on this scale of creature, so it took a while to get used to it. What it came down to was capturing just the right detail or pose that is appropriate for that shot.”

TOP: South African-based BlackGinger handled the oasis sequence that features a herd of herbivores known as the Apceros. MIDDLE: A practical gimbal was created for the head of the Black Diablos so that Jaa could climb on top of the creature, just like characters in the video game. BOTTOM: Jaa, Jovovich and Anderson discuss a shot during principal photography.



TOP TO BOTTOM: Rathalos, which is a combination of falcon and raptor, appears in the climax of the movie. The Nerscylla have crystalline spikes that are caused by dripping venom. Anderson wanted a variety of creature battles and chose Rathalos to provide the aerial threat. Rathalos unleashes a series of firebombs.

Outside of the Meowscular Chef, a talking Palico cat, none of the creatures have lines of dialogue. “The challenge,” outlines Nagy, “was that you’re putting these creatures against a real-life person, like Milla Jovovich, and have to be at the same level of acting or emoting. Even though the emotional range of Rathalos is more limited, we still didn’t want him to be angry all the time or be in that emotive ‘kill, kill, kill’ mode. There was always some other motivation, such as being wounded. He is drawing on past experiences to drive his current action.”

Principal photography took place in remote settings in South Africa, Namibia, and at Cape Town Film Studios. “We did go to some extreme lengths to capture the feel of the exotic worlds in the video game,” remarks Anderson. “Most of the places that we shot were a good 200 miles away from the nearest habitation of any kind – 350 people living in tents with us putting in water and electricity when we could. We were without cellphones and Internet. It was a great way to build a real camaraderie between the cast and crew. You had to take everything with you and be self-contained. It’s more like guerrilla-style filmmaking, which I found to be immersive and exciting.”

The fantasy adventure was shot with large-format cameras. “We used the ARRI ALEXA LF,” says Anderson, “which is like two chips put together, so you’re getting twice the information you would get with a normal camera. It’s fantastic in capturing big landscapes that could contain the creatures.” Another technical tool came in handy when giving visual cues to the actors. “What we did was to carry a drone with us all of the time,” adds Anderson. “We got a fast one so we could chase the actors at an appropriate height for the creature, so everyone has something to look at, and then those shots could function either as a point-of-view or an over-the-shoulder shot of the creature.”

Anderson cites virtual reality as an important part of the design process. “What I like to do with all of the sets is to get them into VR as quickly as possible,” he says, “because that’s a surefire way of focusing the attention of the director, cinematographer and actors. It’s difficult sometimes for performers to turn up on set and be told that there’s a 400-foot monster about to chase them. But as soon as



you put on a headset in VR, that 400-foot monster is right in front of you. The Nerscylla lair is one of my favorite sets in the movie – it’s a dark, dank place made from Nerscylla bones and the bones of eaten prey. We created that in VR and invited Milla Jovovich to have a look around. She put the mask on, and I swear she literally screamed, took the mask off and left the room. Milla was absolutely terrified by it. The VR helped Milla because when stepping onto the small part of the built set she knew what was going to be its final scale in the movie.”

Massive sand simulations made Microsoft Azure cloud rendering essential to completing shots. “Our cumulative effects file server at the end of the show was about 670 terabytes, and then all of the other storage for the show peaked at 1.3 petabytes,” reveals MR. X Digital Effects Supervisor Ayo Burgess. “At the peak [of post-production] we were using about 80% of our Isilon storage for Monster Hunter. It was a crazy ride of constantly balancing things out.”

An enormous storm consisting of sand and lightning was created for the beginning of the movie. “Our Montreal team would simulate smaller chunks, lay them out, and replicate them to increase the apparent detail,” says Burgess. “Simulating kilometers of storm was not feasible. The efficiency of replicating the smaller caches helped us iterate, to tell a story and make a movie. Once we knew what we needed to do, we could go back in, up-resolution things and refine them without shooting in the dark.”

A hallmark for Monster Hunter is the oversized weapons. “That works within the world of the game because you’re playing a character who doesn’t have to obey the laws of gravity,” notes Anderson. “But when you build these oversized weapons for real and are asking human beings to wield them and do complicated fight choreography, it’s not so easy. That’s why I’m glad we had people of the caliber of Tony Jaa, who had to carry a lot of the big weapons and use them. Of the individuals I’ve met, he is the closest to being superhuman. Tony managed to master these giant weapons, which was not easy. When you’re asking the actors not to just wield the weapons but to do it in sand as well, that adds a whole new layer of complexity. Sometimes we made it easier by doing CG replacements of blades so that they would only be holding the hilt. But in a lot of the movie, they’re actually using the full-size weapon.”

“The biggest challenge was representing the creatures so they respected the video game design while also being cinematic,” notes Berardi. “The rest of it was environment work, set extensions and effects simulations, which are things that we’ve done before. But we haven’t run this many creatures into one movie with fans who are religious about them. They’re basically in full daylight except for the Nerscyllas, which are in the underground.”

Anderson is proud of the seamless blending of practical and digital elements. “When people [actually] go to inhospitable places in movies, they’re usually not visual effects films. Visual effects films tend to create the exotic places in the computer, so there’s always a synthetic feel to them. This combination of real tough location photography with cutting-edge visual effects is quite fresh and new, and is something I’m excited for people to see.”

TOP TO BOTTOM: An effects pass of the Black Diablos, which includes heavy sand simulations as the creature travels by burrowing below the surface. (Photo courtesy of Constantin Film, Screen Gems and MR. X) The design of the Black Diablos had to be adjusted to remove any suggestion of sharp edges to reflect the creature being an herbivore. (Photo courtesy of Constantin Film, Screen Gems and MR. X) The final lighting pass of the Nerscylla, which emphasizes the textural quality of the creature that lives in an underground lair. (Photo courtesy of Constantin Film, Screen Gems and MR. X) Animation pass of the Nerscylla, which is a combination of a spider and praying mantis. (Photo courtesy of Constantin Film, Screen Gems and MR. X)



TV/STREAMING

THE RIGHT STUFF: BALANCING MODERNIZATION AND PERIOD AUTHENTICITY TO ACHIEVE SERIES BLAST-OFF By CHRIS McGOWAN

TOP: The Mercury Seven astronauts hold a press conference for their mission to achieve the first American manned spaceflights, portrayed in The Right Stuff series, with visual effects by Zoic Studios and DNEG. (Photo: Gene Page. Image courtesy of National Geographic, Disney+ and DNEG) OPPOSITE TOP: Zoic recreated this massive Mercury Atlas rocket for an unmanned test flight. Both CG and matte paintings were used for the rocket, the test flight and the rocket failure. (Images courtesy of National Geographic, Disney+ and Zoic Studios)

Visual Effects Supervisor Mark Stetson, VES, had two major motivations for wanting to work on The Right Stuff, an original National Geographic series for Disney+. One was that at a young age he followed the exploits of the Mercury Seven, America’s first astronauts, who are portrayed in the show. “With my friends, I looked to the evening skies to try to spot Sputnik or Echo 1 or other satellites flying overhead. My father was an engineer working on tracking radars used by NASA for the space program,” recalls Stetson. Then, in 1961, “Alan Shepard flew into space for the first time. The test pilots, astronauts and rocket engineers were all national heroes, the stuff of dreams.”

In addition, Stetson worked for Special Visual Effects Supervisor Gary Gutierrez on Philip Kaufman’s theatrical version of The Right Stuff, which was released in 1983 and, like the series, was based on the 1979 Tom Wolfe book. The Kaufman movie didn’t fare well at the box office but earned critical acclaim and eight Oscar nominations, winning four Academy Awards.

Stetson had known about The Right Stuff series project for more than a year and had been wishing he could work on it. “But I was always committed to something else and just one step away from being able to get involved. So, in June 2020, when Executive Producer/director Chris Long and Supervising Producer Joshua Levey invited me to join the team, I was super excited. It was a very happy surprise.” Stetson joined the project as a Visual Effects Supervisor while



the series was in post-production. Zoic Studios was the primary VFX house, with DNEG joining in May. Matthew Bramante was the Overall Visual Effects Supervisor for the entire season, while Dan Charbit was the Visual Effects Supervisor for DNEG, which “focused mostly on Alan Shepard’s first space flight, from launch to splashdown,” according to Stetson. One of the challenges for the series was that the collective memory of those events is “grounded so much in the historical footage,” according to Stetson. “The stylistic choice to give the series a fresh look with dynamic camera moves sometimes led to gut feelings that the shots didn’t look authentic. We found that the difference between giving the series a look achieved with modern cameras and production equipment, compared to the less dramatic historical reference footage, sometimes caused negative reactions to the VFX shots that were hard to explain and, once identified, hard to resolve. “I had to find the balance,” Stetson continues, “and sell the differences between physical reality, period authenticity and a modern photographic and editorial style aimed at a modern audience. We did everything we could to achieve technical excellence and convey the collective gestalt of the period and the events.” During production, Bramante visited many of the real locations and studied first-hand many historical artifacts of the period,




amassing a library of reference photography and scans. “NASA was also very forthcoming with access to their archives,” Stetson observes. “But all this reference required interpretation. For example, the Redstone rocket standing at Cape Canaveral as a museum piece, whose surface is now pocked with 60 years of weathering and paint layering, needed to be translated into CG as a clean, new, freshly-painted missile in 1961, revealing the simplicity of its geometry and construction. The visual effects teams strove to show that what are aeronautical antiques today were the technical state of the art back then. And that state of the art was simple, elegant and utterly dangerous.”

TOP TO BOTTOM: Test pilot Cal Cunningham (Tyler Jacob Moore) moves from bluescreen into the cockpit of a test plane and winds up with an F-104 composited behind him. (Images courtesy of National Geographic, Disney+ and Zoic Studios) OPPOSITE TOP TO BOTTOM: The Mercury Redstone rocket that would carry the first American into sub-orbital space is integrated into the CGI launch pad structure, which is then brought to full CGI life with detailing and lighting. (Images courtesy of National Geographic, Disney+ and Zoic Studios)

One challenge for the filmmakers had to do with the size of the Mercury Redstone rockets used for the Mercury Seven’s suborbital flights. “To an audience brought up with huge historical rocket launches like the Saturn V and the Space Shuttle of later years, the exhaust plume of a Redstone rocket was pretty puny,” says Stetson. “We had to make sure we captured every bit of dynamic detail we found in the film footage of the real event. We wanted to raise the drama, but not at the expense of realism and historical accuracy.” He adds, “In showing the simplicity of the rockets, and the relatively diminutive scale of the early missiles in particular, the teams were careful to interpret historical footage and the existing artifacts.”

Regarding other rocket details, he notes, “Perhaps the trickiest thing to get right was the cold frost on the sides of the rocket fuselages filled with super-cooled liquid oxygen, and the clouds of mist flashing off of them as they stood on the launch pads. The historical footage showed many different looks, depending mostly on lighting, ambient temperature and humidity. So finding the look that was right for the series required a lot of exploration. It took a number of tries to arrive at a sim that convincingly portrayed the large scale of the steaming rockets in the detail of the turbulence and wind speed. Beyond getting the dynamic simulations right, the most unexpected challenge came from the difference between modern camera moves chosen for the series and the more static footage from the more cautiously-placed cameras capturing the real events 60 years ago.”

All the rocket exhaust and ground interactions were achieved in CG, primarily as sims, according to Stetson. For the Episode 103 Atlas rocket test flight and explosion, “both CG and matte paintings were used to create the rocket, gantry, launch pad and background. Close attention was paid to tracking camera footage of rocket explosions of the period, and those looks were incorporated into the shots. Notably, the CG rocket flying prior to the explosion is almost identical to historical footage of the event – if the archival footage had been cleaned up and up-res’d to 4K, they would probably look the same,” says Stetson. “In Episode 108, the environment surrounding the launch pad, the retracting gantry, the rocket and capsule and of course the launch effects were all VFX. All the views of Earth from space were painstakingly re-created as matte paintings, as the angles were too specific to permit use of NASA photography. Even in tight close-ups, the Mercury Freedom 7 capsule exterior was CG. The surface of the South Atlantic and all the splashdown elements were



all CG,” Stetson explains. For that episode, he comments, “I found working with Dan Charbit and his team at DNEG, plus NASA spaceflight consultant Robert Yowell and all the filmmakers, to be so inspiring and uplifting. It was very satisfying to be able to help map out the shots of the flight from personal memory, shared research and collaboration with the teams. The search for accuracy in the storytelling and the depiction of events in the shots, combined with awesome craft in the element creation, lighting and compositing of the shots, made for a breathtakingly beautiful finish to the season.”

Visual effects also were crucial for the period look throughout the series. Stetson notes, “A big part of the look of the series is the color palette used for the Florida locations, with those amazing pastels and the costumes and hair design. In terms of maintaining an authentic period look for the series, VFX helped with the usual paintouts to remove modern people, cars, signage, buildings, road markings, etc. And then in the larger matte-painted views around the launch pad, period reference was closely followed. Beyond the period look of the series, we worked in VFX to convey the themes of the period as well. There was an emphasis on capturing the freshness and technological hubris of the period, contrasting with the skin-of-their-teeth escapes from disaster along the way.”

A number of places outside of Florida in the series were CG. For example, the production never filmed at Edwards Air Force Base or any locations in California – in Episode 101, the Mojave Desert background environment and all the aircraft for the F-104 sequence were entirely created in VFX, according to Stetson. So was the shot in that episode of the Manhattan skyline and the Brooklyn Bridge in the montage of people gathered across the country to watch the launch. In addition, he explains, “The winter scenes of Christmas in Virginia were shot in Florida and transformed with VFX.”

With so much historical footage and the 1983 feature film for reference, “the treasure trove of material sometimes led to conflicting ideas of how the aerial sequences should look,” comments Stetson. “These had to be resolved to give the series a consistent look as established by the production design and cinematography.” As part of that process, Stetson had long discussions with aerial consultants Carl Pascarell and Budd Davisson “about atmosphere, the historical environments, weather, preferred shooting conditions as they pertained to CG lighting, and flight dynamics of the aircraft. My biggest challenge was to get all these valid viewpoints to work together to support the filmmakers’ vision and the style of the production design and cinematography,” he says.

“The passion of all the filmmakers for the material and the story was wonderful to embrace. The VFX teams were committed to making all the shots as perfect as they could,” comments Stetson. “The war stories of the aerial consultants were awe-inspiring. And just being able to spend serious time reviewing all the reference footage, stills and histories from the period was so enjoyable. That the filmmakers sought to rekindle the heroism and hope of the time for a new, generational audience makes me very happy. Revisiting The Right Stuff was a dream fulfilled.”



TV/STREAMING

CREATING THE BEASTS, EVILS AND HORRORS THAT HAUNT LOVECRAFT COUNTRY By CHRIS McGOWAN

Images courtesy of HBO. TOP: Jurnee Smollett, Jonathan Majors and Courtney Vance embark on an epic road trip full of supernatural and racial menace. OPPOSITE TOP: Majors and Smollett calm the savage shoggoth.

Lovecraft Country is a sprawling fantasy-horror series from HBO in which the protagonists must confront supernatural beasts and evil magic, as well as the racist horrors of 1950s America during the Jim Crow era, when state and local laws still enforced racial segregation in the southern U.S. Inspired by the eerie horror stories of early 20th century writer H.P. Lovecraft, the series was developed by Misha Green and based on Matt Ruff’s novel of the same name. J.J. Abrams and Jordan Peele served as executive producers. The main story centers on a road trip undertaken by Atticus Freeman (Jonathan Majors) in search of his missing father. He is joined by his friend Leti (Jurnee Smollett) and his uncle George (Courtney B. Vance) as they travel from Chicago to fictional Devon County, Massachusetts. On their journeys, they encounter earthly racists and Lovecraftian monsters (such as shoggoths), an occult society, a secret text, spirits, shape-shifting, time travel, the Tulsa race massacre, lynching victim Emmett Till’s memorial, dancer Josephine Baker in Paris, and more. For Visual Effects Supervisor Kevin Blank, what drew him to Green’s project “was her unique take on genre and horror from an African-American perspective. I knew this project was going to turn some heads and push some buttons, including my own. I didn’t know what it would become, but I had this deep sense that it would be meaningful. I thought somehow this show was going to push me out of my comfort zone.”



To address the wide panorama of drama and horror, the visual effects had to be as ambitious and eclectic as the narrative. Blank estimates there were around 3,500 VFX shots in the series. Rodeo FX did the largest volume of work on the series, with some 1,200 shots, according to Blank. Crafty Apes “did a little bit of everything,” he says, and contributed over 1,000 shots. And Framestore worked on the shoggoth, the biggest creature asset. ILP, RISE FX, SPINVFX, BlackPool Studios and BOT VFX also worked on the series.

“The majority of our digital creations were only used for a few scenes, sometimes only for a single shot. From a logistical point of view this was a daunting challenge,” comments Rodeo FX Visual Effects Supervisor François Dumoulin. “But from a creative perspective, we had so many things to invent. It was such a gift.” Dumoulin especially enjoyed the work because he had been an H.P. Lovecraft fan since the age of 10, when his grandfather introduced him to the author.

Chicago was the site of many VFX-constructed scenes. “When

the series got greenlighted by HBO and we started prepping for an eight-month shoot, it was decided to shoot in Atlanta and partially recreate the Chicago exterior on the backlot of our shooting stages,” recalls Dumoulin. The ‘Safe Negro Travel’ store and its adjacent garage, the only set piece to be featured in every episode, relies heavily on CGI. “The production built a partial replica of the real location, three corners of that crossroads, first floors only, all surrounded with bluescreens. Under the supervision of our environment supervisor, Alan Lam, Rodeo took care of extending the set, adding buildings, trees, parked cars, traffic and pedestrians. “Toward the end of the shoot,” he continues, “when we were doing Episode 8 with Misha in the director’s chair, I went to the place where the pilot was shot in Southwest Chicago and did an extensive survey of the location, thousands of pictures, as well as terabytes of LiDAR scanning data provided by the wonderful people of Captured Dimensions.” “The backlot set was scanned and became our starting point. We first merged the scan of the backlot with the scan of the real


location, adjusting proportions where needed. And from there we started to extrapolate our fictional 1950s Chicago, making our way from that crossroads down the streets, up to the distant skyline of the downtown. The entire city is a CG creation, nothing is matte painted, so we had full control over the camera and lighting,” says Dumoulin.

Tulsa was another city that had to be recreated to portray the 1921 Tulsa Race Massacre. “The approach was similar to our CG reinvention of Chicago,” Dumoulin notes. Adds Blank, “We first collected all the documentation we could find and then we augmented that reality through fiction and design decisions. There’s nothing left of the Black Wall Street neighborhood where the massacre took place, so the production had found this street in Macon, Georgia, which acted as the main street of Tulsa. There were very large bluescreens put up at either end to enable VFX to create set extensions and multiply blocks of Tulsa destruction for several blocks.” Explains Dumoulin, “In the final shots those environments are 90% computer-generated. There’s a lot of digital manipulation to the environment, particularly when we go into the last part of the episode, when the riot begins. Most of the riot happens in the same street we have seen in broad daylight, but everything is on fire and some buildings are collapsing.”

More architectural work was required in the creation of Ardham Lodge, where many key scenes take place. “There was a real house in LaGrange, Georgia, where we staged some of the Ardham interiors and all of the exteriors. We dramatically altered the exterior look. It was based on a sketch by our Production Designer, Kalina Ivanov. [Art Director/Concept Artist] Deak Ferrand at Rodeo expanded on her sketches, and Rodeo took their designs and scans of the real location and melded them together into what you saw in the show,” says Blank. Dumoulin adds, “The final version is about three times bigger than the real building, and the huge glass cupola is a complete invention.”

Roaming the woods around the lodge at night are monstrous shoggoth beasts, designed and built by the Framestore office in London. “They built the evil white shoggoth and modified that



asset to become the good black shoggoth,” explains Blank. “It was a reimagining of the iconic Lovecraft shoggoth, [which had] multiple eyes but was not necessarily perceived as sleek or fast. The shoggoths were going to be the Ardham white people’s attack dogs.”

“The Cthulhu was another monster inspired by Lovecraft,” Dumoulin remarks. “Cthulhu is clearly the most iconic creature of the mythology. [Wanting to give it a twist] I asked our concept artist, Yvonne Mejia, to make him purely an animal, a simpler yet terrifying version of the squid head with giant wings.” Once Green approved the design, says Dumoulin, “We painstakingly created the asset, complete with its hundred eyes on the forehead, its eight tentacles that would act like legs on the ground to support its weight, and its wings large enough to carry that weight in the air. The final animation caches went through Houdini to achieve the moment where it is being sliced in two by Jackie Robinson, before reforming again out of a giant pile of green goo. Fun stuff!”

The most disturbing monster in Lovecraft Country might well be the Kumiho, which is based on a shape-shifting, nine-tailed fox from Korean, Chinese and Japanese folklore, and given a grotesque new version. When Ji-Ah (Jamie Chung), a nursing student, is possessed by the Kumiho spirit, tentacles exit her body and attack others – such as during sex. Blank recalls, “I think this gave me a few nightmares! The concept art was done by Deak Ferrand. It was very graphic and didn’t leave much open to the imagination.” “We created full digi-doubles of both characters,” elaborates Dumoulin. “It really was one of the oddest things to work on, particularly because of the pandemic situation. For some of our artists working from home it was like, ‘François, I’m sorry, but I just can’t work on that scene. I’ve got kids at home.’”

One of Blank and Dumoulin’s favorite scenes was the reanimation scene in the Boston museum, in which a desiccated corpse reanimates into an Arawak spirit named Yahima. “It all happens within a single, 360-degree orbiting shot,” Dumoulin notes. “The camera circles around the character – there’s no place to hide any tricks. The actress playing Yahima performed a choreography that would represent her transformation while our heroes reacted to it. The final animation was brilliantly crafted by [Animation Lead] Toby Winder under [Animation Supervisor] Bernd Angerer. Layers of muscle, skin and hair simulations were used for maximum realism, and we also covered the character in dust and cobwebs to add more interactions with the environment.”

Looking back at Lovecraft Country, comments Dumoulin, “I still can’t believe we made Misha’s vision come true. It’s a show that is bold, original, fearless and political.” Blank concludes, “It was a never-ending creative blast, and I got to work with Misha and many talented, wonderful people. The creative process reinvented itself many times over, episode to episode. [I was inspired by] the cultural significance I felt the show stood for. Although our show and the book used H.P. Lovecraft for inspiration, our show is actually about racism in Jim Crow America and using Lovecraft’s writings and imagery to manifest that evil on screen. I knew this was going to be talked about and I wanted to be part of that.”

OPPOSITE TOP TO BOTTOM: Front of the Ardham Lodge mansion, which was based on a real house in Georgia, but dramatically altered with CGI. Ardham Lodge, the site of many key scenes, burned to the ground. A sweeping view of 1950s Chicago – a CG creation that started with a survey that included thousands of photos and terabytes of scanning data. TOP TO BOTTOM: Some Chicago city blocks, such as those close to the shop where the Safe Negro Travel Guide was sold, relied on bluescreen and heavy CGI. Fire was a major element in Lovecraft Country, almost a character on its own, according to Rodeo FX Visual Effects Supervisor François Dumoulin.



[ THE VES HANDBOOK ]

Designing Visual Effects Shots By SCOTT SQUIRES, VES Edited for this publication by Jeffrey A. Okun, VES Abstracted from The VES Handbook of Visual Effects – 3rd Edition Edited by Jeffrey A. Okun, VES and Susan Zwerman, VES


One of the key elements to a successful visual effects shot is the design. With today’s technology most of the technical issues can be resolved (even if problematic), but there is little point in doing an elaborate and costly visual effects shot if the design is unsuccessful. Some suggested guidelines follow, but, as with any art form, there are no absolute rules. Visual effects shots require the same eye for composition as standard live-action shots, but there are a number of issues that directly relate to visual effects shots.

The costs and planning required for visual effects may lead the director to avoid dealing with the visual effects and the VFX Supervisor. This approach tends to lead to a more painful process during production and post-production. It also increases the costs and, more importantly, decreases the quality of the final shots if the visual effects team is second-guessing the director. The VFX Supervisor should be looked at as a creative collaborator and be relied on, along with the director of photography and production designer, to help design the film. The VFX Supervisor is there to serve the director and the film to create the appropriate effects in the most efficient way possible.

The main objective of any shot is to help communicate the story the filmmaker is trying to tell. It is easy to lose sight of this and have a visual effects shot become just eye candy with no intrinsic storytelling value. Visual effects are tools for the filmmaker that open up an almost unlimited range of stories that can be told on film. Locations, time periods, sets, props, and even the characters can all be changed or rendered from scratch. It is very easy to abuse such powerful tools and get caught up in the technique and pure visuals created. When anything becomes possible, including during post-production, there may not be as much care and design as there should be in pre-production. When an audience complains about too much computer-generated imagery (CGI), this usually means there are too many unnecessary shots or that the shots have been pushed beyond the level expected for the type of movie.

Guidelines for Directors
1. Work with the VFX Supervisor and their team. Do not make the mistake of thinking that they just bring technical knowledge to your project. They bring a creative eye and experience that can help design the best visual effects for your project.
2. Assume everything is real and exists on the set. How would this scene be shot if everything were really on the set or location? This mindset avoids treating visual effects shots differently.
3. Design for the finished shots. The VFX Supervisor will have to work out the techniques required and determine the different pieces to be shot.
4. Do the first pass of the storyboards without limitations. What visuals are needed? These may need to be pared back as the storyboards are reviewed, but sometimes shots are neutered to try to make a budget based on incorrect assumptions about the costs for the visual effects or what the technical requirements may be.
5. Design the shots necessary to tell the story well. If a particular story point can be made in two simple shots, then it may not be necessary to turn it into a 30-shot extravaganza. This is something to consider in the large scope of the film. Use the visual effects shot budget where it counts.

Objective of the Shot
If the primary design goal of the shot is to make it cool and awesome rather than to advance the story, then it will likely fail. The reality these days is that it is very difficult, if not impossible, to wow the audience. There was a time at the start of digital visual effects when it was possible to show something totally new and different by technique alone. With the sheer volume of moving images these days, however, it is hard to amaze the audience even with something original. The best solution to creating compelling images is to design shots that enhance the story and then have the director work with the production designer, director of photography and VFX Supervisor to come up with the best possible visuals for the film.

Even with the best intentions, the original concept of a shot may veer off-course. The original storyboard is sketched up with the director and a storyboard artist. This is then reinterpreted by a previs artist, and then reinterpreted by a DP. Finally, in post, the animator, technical director, lighting artist and compositor working on the shot may enhance it. In many cases this collaboration improves the shot, but there are times when the changes are at cross-purposes with the original intent of the shot.



[ VES NEWS ]

VES Board of Directors Officers 2021; Lisa Cooke to Lead the Global Organization
By NAOMI GOLDMAN

The 2021 VES Board of Directors officers, who comprise the VES Board Executive Committee, were elected at the January 2021 Board meeting. The officers include Lisa Cooke, who was elected Board Chair and is the first woman to hold this role in the Society's history. The five officers embody the global makeup of the Society, coming from Sections in the United States, New Zealand and London.

The 2021 Officers of the VES Board of Directors are:
Chair: Lisa Cooke
1st Vice Chair: Emma Clifton Perry
2nd Vice Chair: David H. Tanaka
Treasurer: Jeffrey A. Okun, VES
Secretary: Gavin Graham

“The Society is proud to have exceptional leadership represented on our Executive Committee,” said Eric Roth, VES Executive Director. “This is a group of impassioned and venerated professionals with a vision for the Society’s future, especially amidst this time of dynamic change. We appreciate their commitment to serve our organization and our members worldwide.”

“It is my great honor and privilege to Chair this society of exceptional artists and innovators,” said Lisa Cooke, VES Chair. “I’m proud to be the first woman in this role, and I’m committed to lifting up our future leaders so that I’m not the last. I look forward to working with the Executive Committee and Board of Directors to support our members during these challenging times and advance the work to propel the Society and the industry forward.”

Cooke has spent more than 20 years as an animation/VFX producer, creative consultant, screenwriter and actor. Firmly believing that it is vital for science to effectively communicate its truths to a broader audience, Cooke started Green Ray Media, and for the last 10-plus years has been producing animation and VFX to create scientific, medical and environmental media for a broad audience. She served six years on the Bay Area Board of Managers, currently serves as Chair of the VES Archive Committee and was previously 1st Vice Chair of the Board of Directors.

Emma Clifton Perry is a Senior Compositor with more than 15 years of experience working across feature films, longform/TV series, commercials and advertising at VFX facilities worldwide. Based in Wellington, New Zealand, Perry offers remote compositing and VFX consulting services. She has served several terms as Chair, Co-Chair and Secretary/Treasurer on the VES New Zealand Board of Managers and was previously 2nd Vice Chair of the Board of Directors.

David H. Tanaka has worked in visual effects and animation for more than 25 years, including 15 years at Industrial Light & Magic in VFX Editorial and a decade as a Special Projects Editor at Pixar Animation Studios. He is also an Adjunct Professor at the Academy of Art University, San Francisco, specializing in Editing, Producing and Cinema History. Tanaka served three terms on the VES Bay Area Board of Managers, holding both Chair and Co-Chair positions, and is a core member of the VES Archive Committee.

Jeffrey A. Okun, VES is an award-winning visual effects artist known for creating ‘organic’ and invisible effects for film and television, as well as spectacular ‘tent-pole’ visual effects. He is a VES Fellow and a recipient of the VES Founders Award, created and co-edited The VES Handbook of Visual Effects, an acclaimed reference book covering all aspects of visual effects techniques and practices, and produces the annual global VES Awards. Okun served as VES Board Chair for seven years.

Gavin Graham is the General Manager of DNEG Montréal and has spent most of his 20-plus-year career with DNEG. Originally an FX artist with a background in Computer Science, he has also created tools and worked in development in the area of simulation and rendering. Graham joined the VES Board of Directors in 2019 and previously served as Secretary during his six-year tenure on the London Section Board.



VES Virtual Production Resource Center

Virtual production is game-changing technology helping to solve new creative challenges and advance the field of filmed entertainment. In January, the VES launched the Virtual Production Resource Center, a new online hub for free educational resources and captivating insights, available to VES members and the industry at large. http://bit.ly/VirtualProductionResources

• Free educational tutorials
• Vibrant roundtable discussions with industry pros
• On-demand access to conferences and inspiring speakers
• Media stories and publications charting trends, case studies and cutting-edge technology and techniques – including exclusive features from our award-winning magazine VFX Voice
• Work-from-home and virtual production guidance and best practices

TOP: Dimension Studio collaborated with DNEG in the U.K. on LED wall test-shoots. (Image courtesy of Dimension Studio)

The Center is evolving along with this dynamic field. Come back often to explore new content. To help us plan future programming and resources, we encourage all virtual production professionals to take this short survey: http://bit.ly/VirtualProductionSurvey



[ FINAL FRAME ]

When James Bond Reeled in the Rarest of Treasures

Was a James Bond movie ever in the running for an Oscar VFX trophy? The answer is yes. Not only was Thunderball nominated in that category in 1966, it won, the only Bond movie ever to do so. John Stears was the recipient of the prize. The late Stears, who won two VFX Academy Awards, was known as the “Dean of Special Effects.” He created not only Bond’s famous Aston Martin DB5 sports car, but also Luke Skywalker’s Landspeeder, the Lightsabers of the Jedi Knights and many other memorable screen gadgets. Stears created special effects for the first eight Bond movies and went on to share the Academy Award for Visual Effects for 1977’s Star Wars: Episode IV – A New Hope. Among his onscreen VFX résumé highlights: blowing up Dr. No’s headquarters in Dr. No; an avalanche in On Her Majesty’s Secret Service; and flying cars for The Man with the Golden Gun. One interesting bit of VFX trivia from Thunderball: the special effects explosion of the Disco Volante yacht, overseen by Stears in the Bahamas, was so powerful that it blew out windows 20 to 30 miles away on Nassau’s Bay Street, where the movie’s Junkanoo Mardi Gras sequence was filmed. Reportedly, Stears had not known how potent a mix the experimental rocket fuel used to create the explosion would be.

Image courtesy of United Artists


