VFX Voice Fall 2021


VFXVOICE.COM FALL 2021

VISIONS OF DUNE

SPECIAL FOCUS: VIRTUAL PRODUCTION • REMINISCENCE • FREE GUY • DOOM PATROL • PROFILES: DOUG CHIANG, HAYLEY WILLIAMS & LENNY LIPTON



[ EXECUTIVE NOTE ]

Welcome to the Fall 2021 issue of VFX Voice! Thank you for your continued and enthusiastic support of VFX Voice as a part of our global community. We’re proud to keep shining a light on outstanding visual effects artistry and innovation worldwide and lift up the creative talent who never cease to inspire us all. In this issue, our cover story goes inside epic science fiction film Dune. We delve into the making of Free Guy and Reminiscence for the big screen. We sit down with film history author Lenny Lipton, Production Designer Doug Chiang and SFX Supervisor Hayley Williams. We’ve assembled an insightful array of features on Virtual Production, looking at LED walls and stages with Zoic Studios, real-time solutions with Aaron Sims Creative, video game innovators, and an industry roundtable on the future of this game-changing technology. We go inside the intersection of animation and live action, look at trends in immersive theater and take a look at superhero adventure Doom Patrol, and more. VFX Voice is proud to be the definitive authority on all things VFX. And we continue to bring you exclusive stories between issues that are only available online at VFXVoice.com. You can also get VFX Voice and VES updates by following us on Twitter at @VFXSociety.

Lisa Cooke, Chair, VES Board of Directors

Eric Roth, VES Executive Director



[ CONTENTS ]

FEATURES
8 VIRTUAL PRODUCTION: ZOIC STUDIOS – Zoic’s Julien Brami describes their foray into LED walls.
14 VIRTUAL PRODUCTION: AARON SIMS CREATIVE – The benefits of jumping head-first into real-time solutions.
20 COVER: DUNE – With 2,000 VFX shots, mostly by DNEG, realism rules Dune.
26 PROFILE: DOUG CHIANG – The remarkable career of the artist guiding Star Wars projects.
32 ANIMATION: INDUSTRY ROUNDTABLE – Inside the blurred lines between animation and live action.
38 PROFILE: LENNY LIPTON – Visionary author of The Cinema in Flux illuminates film history.
44 FILM: FREE GUY – The joy of creating a video-game world where anything goes.
48 VIRTUAL PRODUCTION: INDUSTRY ROUNDTABLE – Expert perspectives on the near-future of virtual production.
56 VIRTUAL PRODUCTION: LED WALLS & STAGES – Where to find LED walls and stages in the U.S. and Europe.
62 PROFILE: HAYLEY WILLIAMS – Dad, mentoring, teamwork shape SFX supervisor’s career path.
68 VIRTUAL PRODUCTION: VIDEO GAMES – Game innovators on today’s virtual production connections.
74 FILM: REMINISCENCE – Practical effects and tons of water simulate submerged city.
80 VR/AR/VR TRENDS: LIVE VR THEATER – Adding immersive storytelling to live theatrical experiences.
86 TV/STREAMING: DOOM PATROL – Effects grounded in reality anchor the outrageous and surreal.

DEPARTMENTS
2 EXECUTIVE NOTE
90 VES SECTION SPOTLIGHT: LONDON
92 THE VES HANDBOOK
94 VES NEWS
96 FINAL FRAME – DUNE

ON THE COVER: Ornithopters, modeled after flying insects, in flight in Warner Bros.’ Dune. (Image courtesy of Warner Bros. Pictures and Legendary Pictures)

Correction: Pixomondo Visual Effects Supervisor Michael Shelton’s name was misspelled in the iPhone article in the Summer issue.



FALL 2021 • VOL. 5, NO. 4

WINNER OF THREE FOLIO AWARDS FOR PUBLISHING EXCELLENCE.

VFXVOICE

Visit us online at vfxvoice.com

PUBLISHER
Jim McCullaugh publisher@vfxvoice.com

EDITOR
Ed Ochs editor@vfxvoice.com

CREATIVE
Alpanian Design Group alan@alpanian.com

ADVERTISING
Arlene Hansen Arlene-VFX@outlook.com
Bernice Howes BerniceHowes@me.com

SUPERVISOR
Nancy Ward

CONTRIBUTING WRITERS
Ian Failes, Naomi Goldman, Trevor Hogg, Kevin H. Martin, Chris McGowan

ADVISORY COMMITTEE
David Bloom, Andrew Bly, Rob Bredow, Mike Chambers, Lisa Cooke, Neil Corbould, VES, Irena Cronin, Paul Debevec, VES, Debbie Denise, Karen Dufilho, Paul Franklin, David Johnson, VES, Jim Morris, VES, Dennis Muren, ASC, VES, Sam Nicholson, ASC, Lori H. Schwartz, Eric Roth

VISUAL EFFECTS SOCIETY
Eric Roth, Executive Director

VES BOARD OF DIRECTORS

OFFICERS
Lisa Cooke, Chair
Emma Clifton Perry, 1st Vice Chair
David Tanaka, 2nd Vice Chair
Jeffrey A. Okun, VES, Treasurer
Gavin Graham, Secretary

DIRECTORS
Jan Adamczyk, Neishaw Ali, Laurie Blavin, Kathryn Brillhart, Nicolas Casanova, Bob Coleman, Dayne Cowan, Kim Davidson, Camille Eden, Michael Fink, VES, Dennis Hoffman, Thomas Knop, Kim Lavery, VES, Brooke Lyndon-Stanford, Josselin Mahot, Tim McGovern, Karen Murphy, Janet Muswell Hamilton, VES, Maggie Oh, Susan O’Neal, Jim Rygiel, Lisa Sepp-Wilson, Bill Villarreal, Joe Weidenbach, Susan Zwerman, VES

ALTERNATES
Colin Campbell, Himanshu Gandhi, Bryan Grill, Arnon Manor, David Valentin, Philipp Wolf

Visual Effects Society
5805 Sepulveda Blvd., Suite 620
Sherman Oaks, CA 91411
Phone: (818) 981-7861
visualeffectssociety.com

VES STAFF
Nancy Ward, Program & Development Director
Mark Farago, Director of Operations
Ben Schneider, Director of Membership Services
Jeff Casper, Manager of Media & Graphics
Colleen Kelly, Office Manager
Debbie McBeth, Global Coordinator
Jennifer Cabrera, Administrative Assistant
P.J. Schumacher, Controller
Naomi Goldman, Public Relations

Tom Atkin, Founder
Allen Battino, VES Logo Design

Follow us on social media

VFX Voice is published quarterly by the Visual Effects Society.
Subscriptions: U.S. $50; Canada/Mexico $60; all other countries $70 a year. See vfxvoice.com
Advertising: Rate card upon request from publisher@vfxvoice.com or VFXVoiceAds@gmail.com
Comments: Write us at comments@vfxvoice.com
Postmaster: Send address change to VES, 5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411.
Copyright © 2021 The Visual Effects Society. Printed in the U.S.A.



VIRTUAL PRODUCTION

ZOIC STUDIOS AND THE LESSONS LEARNED FROM ITS LED WALL By IAN FAILES

Images courtesy of Zoic Studios. TOP: An overhead lighting rig is installed on the LED wall stage. OPPOSITE TOP: Zoic Studios Creative Director and Visual Effects Supervisor Julien Brami reviews a real-time scene on a laptop. OPPOSITE BOTTOM LEFT: Zoic Studios tests the LED wall that was set up at its Culver City, California location. OPPOSITE BOTTOM RIGHT: A hand-held test-shoot against an LED wall environment.

If there’s one thing changing the way that physical production and visual effects interact right now, it’s the rise of LED walls, also known as LED volumes. LED walls form part of the shift to virtual production and real-time content creation, and have gained prominence by enabling in-camera VFX shots that don’t require additional post-production, offering an alternative to bluescreen or greenscreen shooting, and helping to immerse actors and crews into what will eventually be the final shots. Several production houses and visual effects studios have built dedicated LED walls and associated workflows. Some rely on bespoke LED wall setups, depending on the needs of the particular project. One company, Zoic Studios, even embarked on a project to install an LED wall inside existing studio space and use it to further the crew’s understanding of virtual production. How did Zoic do that exactly? Here, Zoic Creative Director and Visual Effects Supervisor Julien Brami breaks down the process they followed and what they took away from this new virtual production experience.

A HISTORY OF VIRTUAL PRODUCTION AT ZOIC

Zoic’s foray into LED walls is actually part of a long history of virtual production solutions at the visual effects company, which has offices in Los Angeles, New York and Vancouver. In particular, some years ago the studio developed a system called ZEUS (Zoic Environmental Unification System) that could be used to track live-action camera movements on, for example, a greenscreen stage and produce real-time composites with previs-like assets. The idea was to provide instant feedback on set via a Simul-cam setup, either a dedicated monitor or smart tablet.
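As a rough illustration of what a ZEUS-style real-time composite involves, here is a minimal sketch in Python with NumPy: a green-dominant pixel mask pulls a crude key on the live frame and the previs background is shown behind the performers. The function names and the simple green-difference key are illustrative assumptions, not Zoic’s actual implementation, which also applies the tracked camera move to the CG scene every frame.

```python
import numpy as np

def green_difference_matte(frame: np.ndarray, strength: float = 1.0) -> np.ndarray:
    """Crude greenscreen key: alpha is high where green does NOT dominate.

    frame: float32 RGB image in [0, 1], shape (H, W, 3).
    Returns an alpha matte in [0, 1], shape (H, W).
    """
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    # Green-difference: how much green exceeds the larger of red/blue.
    spill = np.clip(g - np.maximum(r, b), 0.0, 1.0)
    return np.clip(1.0 - strength * spill * 4.0, 0.0, 1.0)

def composite_over(frame: np.ndarray, background: np.ndarray) -> np.ndarray:
    """Place the keyed live-action frame over a previs-style background."""
    alpha = green_difference_matte(frame)[..., None]
    return alpha * frame + (1.0 - alpha) * background

# Toy usage: a mostly-green 'stage' frame composited over a grey previs background.
if __name__ == "__main__":
    h, w = 270, 480
    live = np.zeros((h, w, 3), dtype=np.float32)
    live[..., 1] = 0.8                        # greenscreen
    live[100:170, 200:280] = (0.6, 0.4, 0.3)  # a stand-in 'actor' patch
    previs_bg = np.full((h, w, 3), 0.35, dtype=np.float32)
    out = composite_over(live, previs_bg)
    print(out.shape, float(out.min()), float(out.max()))
```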



When Brami, who has been at Zoic since 2015, began noticing a major shift in the use of game engines for virtual production shoots on more recent shows – The Mandalorian, for example, has of course brought these new methods into the mainstream – he decided to try out the latest techniques himself at home using Epic Games’ Unreal Engine. “I’ve always been extremely attracted by new workflows and new technology,” shares Brami. “I mostly work in the advertising world and usually the deadlines are short and budgets are even shorter, so I’m always looking for new solutions. I’d been looking to employ real-time rendering in a scene as a way to show clients results straight away, for example.

“So,” he adds, “I first started doing all this virtual production stuff in my own room at home using just my LED TV and a couple of screens. What’s amazing with the technology is that the software is free. I just had a Vive controller and was tracking shots with my Sony DSLR. As soon as I saw something working, I called [Zoic Founder and Executive Creative Director] Chris Jones and said, ‘I think we need to do this at scale.’”

‘WE NEED TO LEARN IT’

With an Epic MegaGrant behind them, Zoic partnered with Fuse Technical Group (FTG) and Universal Studios to create an LED wall at their studio space in Culver City.





TOP TO BOTTOM: The LED wall, provided to Zoic on loan from Fuse Technical Group, gave the visual effects studio a chance to test several shooting scenarios. The LED wall, a ROE Black Pearl 2, was 28 tiles wide and 8 tiles high. An on-set tracking system utilizing Vive controllers allowed for camera movement and real-time re-adjustment of the imagery on the LED wall.

It was a big leap for Zoic, which predominantly delivers episodic, film and commercial VFX work. Subsequently, the studio launched a ‘Real Time Group’ to explore more virtual production offerings and use Unreal Engine for visualization, virtual art department deliveries, animation and virtual production itself. “What we said was, ‘We have the time, we have the technology, we need to learn it,’” relates Brami, in terms of the initial push into LED wall filmmaking. “FTG lent us an LED wall for about three months, and it was really exciting because they said, ‘We can give you the panels and we can do the installation, but we can’t really give you anyone who knows how to do it. You have to learn yourself.’ So it was just a few of us here at Zoic doing it. We each tried to figure everything out.”

The LED wall setup at Zoic was used for a range of demos and tests, as well as for a ‘Gundam Battle: Gunpla Warfare’ spot featuring YouTuber Preston Blaine Arsement. The spot sees Preston interact with an animated Gundam character in a futuristic warehouse. The character and warehouse environment were projected on the LED wall. “That was the perfect scenario for this,” says Brami. The wall was made up of around 220 panels. Zoic employed a Vive tracker setup for camera tracking and as a way to implement follow-focus. The content was pre-made for showing on the LED wall and optimized to run in Unreal Engine in real-time.

“We had the LED processor and then two stations that were running on the set,” details Brami. “We also had a signal box that allowed us to have genlock, because that’s what matters the most. Everything has to work in tandem to ensure the refresh rate is right and the video cards can handle the playback.”

There were many challenges in making the LED wall setup work, notes Brami, such as dealing with lag on the screen, owing to the real-time tracking of the physical camera. Another challenge involved lighting the subject in front of the LED panels. “The panels create light, but not strong enough to get shadows. So, you can have an overall global lighting and the value’s just right on the character, but there is no shadow for the character. That’s why we wanted to also have practical lighting. We used the DMX plugin from Unreal, which would send information for correct intensity and color temperature to every single practical light. So, when you move in real-time, the light actually changes, and we can create the real cast shadows on the characters.”
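As a rough sketch of the kind of mapping involved, the snippet below converts a virtual light’s intensity and color temperature into 8-bit DMX channel levels for an RGB fixture. The color-temperature table and channel layout are illustrative assumptions for this example, not the actual behavior of Unreal’s DMX plugin or of Zoic’s rig.

```python
import numpy as np

# Approximate RGB for a few common color temperatures (normalized 0-1).
# Rough illustrative anchors only; a real pipeline would use a proper
# blackbody-to-RGB conversion calibrated to the fixture in question.
_CCT_ANCHORS = {
    2700: (1.00, 0.65, 0.40),   # warm tungsten-ish
    4300: (1.00, 0.85, 0.70),
    5600: (1.00, 0.97, 0.95),   # daylight-ish
    7500: (0.85, 0.90, 1.00),   # cool / overcast-ish
}

def cct_to_rgb(kelvin: float) -> np.ndarray:
    """Linearly interpolate between the anchor temperatures above."""
    ks = sorted(_CCT_ANCHORS)
    kelvin = float(np.clip(kelvin, ks[0], ks[-1]))
    for lo, hi in zip(ks, ks[1:]):
        if lo <= kelvin <= hi:
            t = (kelvin - lo) / (hi - lo)
            lo_rgb, hi_rgb = np.array(_CCT_ANCHORS[lo]), np.array(_CCT_ANCHORS[hi])
            return (1 - t) * lo_rgb + t * hi_rgb
    return np.array(_CCT_ANCHORS[ks[-1]])

def light_to_dmx(intensity: float, kelvin: float) -> list:
    """Pack intensity (0-1) and color temperature into a hypothetical
    4-channel fixture layout: [dimmer, R, G, B], each 0-255."""
    rgb = cct_to_rgb(kelvin)
    dimmer = int(round(np.clip(intensity, 0.0, 1.0) * 255))
    return [dimmer] + [int(round(c * 255)) for c in rgb]

# Example: the virtual key light dimming and warming toward 'sunset'.
if __name__ == "__main__":
    for frame, (i, k) in enumerate([(1.0, 5600), (0.7, 4300), (0.4, 2700)]):
        print(f"frame {frame}: DMX {light_to_dmx(i, k)}")
```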



Thoughts on an LED Wall Future

Arising from his LED wall and virtual production experiences so far, Zoic Studios Creative Director and Visual Effects Supervisor Julien Brami believes there are many advances still to come in this space. One development Brami predicts is more widespread remote operation of LED walls, something that became apparent in part during the pandemic. “All of this technology just works on networks. My vision is that one day I can have an actor somewhere in Texas, an actor somewhere in Germany, I can have the director anywhere else, but they all look at the same scene. As long as it can be all synchronized, we’ll be able to do it. And then you won’t need to all travel to the same location if you can’t do that or if it’s too expensive.”

Another advancement Brami sees coming shortly is further development in on-set real-time rendering to blend real and virtual environments. “This is going to be like an XR thing. You shoot with an LED wall, but then you can also add a layer of CG onto the frame as well – that’s still real-time. You can sandwich the live-action through the background that’s Unreal Engine-based on the wall and then add extra imagery over the wall.

“Doing this,” Brami says, “means you can actually create a lot of really cool effects, like particles and atmospherics, and make your sets bigger. It does need more firepower on set, but I think this is really what’s going to blend from an actor in front of a screen to something that’s fully completed. You could put in a creature there, you can put anything your space can sandwich. I’m really excited about this.”

Zoic tested several cameras on the LED wall set, including an ALEXA Mini, Sony VENICE and RED KOMODO. Brami says they noticed each camera sensor produced different responses to the LEDs. “We learned really early on that to make it this beautiful, magical treatment when everything’s still degraded, the grading of the scene through the monitor was extremely important. Being able to match the blackness level, and also the hue and saturation values of the screen, was extremely important to make it believable.”

Ultimately, Brami came away from the experience more enthusiastic about the role of LED walls in production and visual effects than he had ever been previously. That was both because of what he saw could be achieved with in-camera visual effects and interactivity on set, but also because of what it meant for ‘world-building,’ something VFX studios are regularly called upon to do.
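The black-level and saturation match Brami describes could be approximated along the lines of the sketch below: measure the darkest value the camera actually records off the wall, then lift, scale and desaturate the content being fed to the wall so that what the camera sees lines up with the intended grade. The measurement step, parameter names and simple lift/gain math are assumptions for illustration; the real match at Zoic was done through the production monitor and grading tools.

```python
import numpy as np

def match_wall_to_camera(content: np.ndarray,
                         measured_black: float,
                         target_black: float,
                         saturation: float = 1.0) -> np.ndarray:
    """Adjust LED wall content so its on-camera black level and saturation
    line up with the intended grade.

    content: float RGB image in [0, 1] that will be sent to the wall.
    measured_black: black level the camera records off the unlit wall.
    target_black: black level the grade actually wants.
    saturation: 1.0 = unchanged, <1.0 pulls toward grey.
    """
    # Normalize out the unwanted black lift the camera is seeing...
    normalized = (content - measured_black) / max(1.0 - measured_black, 1e-6)
    # ...then re-apply the lift the grade intends.
    corrected = np.clip(target_black + normalized * (1.0 - target_black), 0.0, 1.0)

    # Simple saturation control around per-pixel luminance.
    luma = corrected @ np.array([0.2126, 0.7152, 0.0722])
    out = luma[..., None] + saturation * (corrected - luma[..., None])
    return np.clip(out, 0.0, 1.0)

if __name__ == "__main__":
    test = np.random.default_rng(0).random((4, 4, 3))
    out = match_wall_to_camera(test, measured_black=0.06, target_black=0.02,
                               saturation=0.9)
    print(float(out.min()), float(out.max()))
```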

TOP TO BOTTOM: LED panels with a 2.8mm pixel pitch made up the wall. Zoic Studios has ultimately established a ‘Real-Time Group’ at the studio to handle virtual production and real-time game-engine solutions to VFX work. Filming the ‘Gundam Battle: Gunpla Warfare’ spot featuring YouTuber Preston Blaine Arsement, left.




“What’s really interesting from my point of view is the power of virtual production to save the backgrounds forever,” declares Brami. “When you shoot on location, you may have the best location ever, but you don’t know, if there’s another season next year, whether this location will still be there or will remain untouched. Now, even 10 years from now, you might want a location in Germany, well, we can come back to the same scene built for virtual production. Exactly the same scene, which really is mind blowing.”

THE STATE OF PLAY WITH ZOIC AND LED WALLS

TOP TO BOTTOM: Preston Blaine Arsement performs on the LED wall set. An Unreal Engine view of the Gundam character for the ‘Gundam Battle: Gunpla Warfare’ commercial. The video village, or ‘brain bar’ area of the LED wall stage, during filming of the Gundam commercial at Zoic.

It became apparent to Zoic during this test and building phase that a permanent build at their studio space may not be necessary, as Brami explains. “We realized a couple of things. First, this technology was changing so fast. Literally, every three months there was a new development, which meant buying something now and having it delivered in three months would make it obsolete.

“Take the LED panels, for example,” adds Brami. “The panels we had at the time were 2.8mm pixel pitch. The pixel pitch is basically the distance between LEDs on a panel, and that defines the resolution you can get from the real estate of the screen, how close you can get to it and what aberrations you get. When we started shooting, 2.8 was state-of-the-art. But then we’ve seen pixel pitches of 1.0 appearing already. Everybody has seen the potential of this technology, and the manufacturers want to make it even better.”

Zoic therefore decided, instead of buying an LED wall for permanent use, to utilize the loaned wall as much as possible over the three months and understand how it all worked. “Our main focus then became creating really amazing content for LED walls,” states Brami. “We know how to create VFX and we know how to create content and assets, we just need to get more involved in preparing this content for real-time on LED walls, since the assets need to run at high frame rates, etc.”

Already, Zoic has produced such content for a space-related project and also for scenes in the streaming series Sweet Tooth, among other projects. Indeed, Zoic believes it can take the knowledge learned from its LED wall experience and adapt it for individual clients, partnering with panel creators or other virtual production services to produce what the client needs.

“So, if they need a really high-end wall, because everything needs to be still in focus, we know where to go and how to create a set for that,” says Brami. “But if they just need, let’s say, an out-of-focus background for a music video, we know where to go for that approach.

“I love where it’s all going, and I’m starting to really love this new technology,” continues Brami. “The past three years have really been the birth of it. The next five years are going to be amazing.”
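To put the pixel-pitch numbers above in perspective, here is a small worked example. It assumes 500mm x 500mm tiles, the 28 x 8 tile wall described earlier, and the common rule of thumb that comfortable viewing distance in meters is roughly the pixel pitch in millimeters; the tile dimensions and the rule of thumb are assumptions for illustration, not vendor specifications.

```python
# Rough back-of-envelope numbers for an LED wall like the one described above.
# Assumptions (not vendor specs): 500 mm square tiles, and the rule of thumb
# that minimum comfortable viewing distance in metres is roughly the pixel
# pitch in millimetres.

TILE_MM = 500          # assumed tile edge length, in millimetres
TILES_WIDE = 28        # wall layout described in the article
TILES_HIGH = 8

def wall_stats(pixel_pitch_mm: float) -> dict:
    pixels_per_tile = int(TILE_MM / pixel_pitch_mm)   # pixels along one tile edge
    width_px = pixels_per_tile * TILES_WIDE
    height_px = pixels_per_tile * TILES_HIGH
    return {
        "wall_size_m": (TILES_WIDE * TILE_MM / 1000, TILES_HIGH * TILE_MM / 1000),
        "resolution_px": (width_px, height_px),
        "total_pixels_million": round(width_px * height_px / 1e6, 1),
        "approx_min_viewing_distance_m": pixel_pitch_mm,   # rule of thumb
    }

if __name__ == "__main__":
    for pitch in (2.8, 1.0):   # the two pitches Brami mentions
        print(pitch, "mm:", wall_stats(pitch))
```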



VIRTUAL PRODUCTION

DEEP-DIVING INTO REAL-TIME CAN OFFER A CONTENT CREATOR INSTANT BENEFITS By IAN FAILES

Images courtesy of Aaron Sims Creative, except where noted. TOP: A concept for the Demogorgon creature in Stranger Things by Aaron Sims Creative (ASC). (Image copyright © 2016 Netflix) BOTTOM: Aaron Sims, founder of Aaron Sims Creative. OPPOSITE TOP: Sims paints on a puppet used during the filming of Gremlins 2: The New Batch.

Aaron Sims Creative (ASC), which crafts both concept designs and final visual effects for film, television and games, bridging the worlds of practical and digital effects, has long delivered content using the ‘traditional’ workflows inherent in crafting creatures and environments. But more recently, ASC has embraced the latest real-time methods to imagine worlds and create final shots. In fact, a new series of films ASC is developing uses game engines at its core, as well as related virtual production techniques. The first of these films is called DIVE, a live-action project in which ASC has been capitalizing on Epic Games’ Unreal Engine for previs, set scouting and animation workflows. Meanwhile, the studio has also tackled a number of commercial and demo projects with game engines and with LED volumes, part of a major move into adopting real-time workflows.

So why has ASC jumped head-first into real-time? “It’s exactly in the wording ‘real-time,’” comments ASC CEO Aaron Sims, who started out in the industry in the field of practical special effects makeup with Rick Baker before segueing into character design at Stan Winston Studio, introducing new digital workflows there. “You see everything instantaneously. That’s the benefit right there. With traditional visual effects, you do some work, you add your textures and materials, you look at it, you render it, and then you go, ‘OK, I’ve got to tweak it.’ And then you go and do that all again.



“But with real-time, as you’re tweaking you’re seeing everything happen with zero delay. There’s no delay whatsoever. So, the timeframe of being able to develop things and just be creative is instant.”

The move to real-time began when ASC started taking on more game-related and VR projects, with Sims noting they have now actually been using Unreal Engine for a number of years. “However,” he says, “it wasn’t until COVID hit that I saw some really incredible things being produced by artists and other visionaries who were coming up with techniques on how to use the engine for things beyond games. That’s what excited me, so I started getting into it and learning Unreal in more detail to help tell some of our stories. Then I realized, ‘OK, wait, this is more powerful than I expected it to actually be.’ On every level, it was like it was doing more than I ever anticipated.”

The other real benefit of real-time for Sims has been speed. “The faster I can get from point A to point B, I know it’s true to my original vision. The more you start muddying it up, the more it becomes something else and the less that you can actually see it in real-time. You’re just guessing until the end. So for me it’s been fascinating to see something that’s available now, especially during COVID when I’ve been stuck at home. It’s made it even more reason to dive into it as much as I can to learn as much.”

Over the past few years, ASC has regularly produced short film projects. Some have been for demos and tests and some as part of what Sims calls the studio’s ‘Sketch-to-Screen’ process, designed to showcase the steps in concept design, layout, asset creation, lookdev, previs, animation, compositing and final rendering. Sims has even had a couple of potential feature films come and go. DIVE is the first project for which the studio has completely embraced a game engine pipeline, and is intended as the start of a series of ‘survival’ films that Sims and his writing partner, Tyler Winther (Head of Development at ASC), have envisaged.

“The films revolve around different environment-based scenarios,” outlines Sims. “This first one, DIVE, is about a character who’s put in a situation where they have to survive. We’re going to see diving and cave diving in the film, but we wanted to put an everyday person into all these different situations and have the audience go, ‘That could be me.’”

The development of the films has been at an internal level so far (ASC has also received an Epic Games MegaGrant to help fund development), but still involves some ambitious goals. For example, DIVE, of course, contains an underwater aspect to the filmmaking. Water simulation can be tricky enough already with existing visual effects tools, let alone inside a game engine for film-quality photorealistic results. “It’s very challenging to do water,” admits Sims, although he adds that the technology has progressed dramatically in game engines.





TOP TO BOTTOM: Unreal Engine rendering of the DIVE cave environment. An Unreal Engine screenshot of one of the cave environments for DIVE. (Image courtesy of Epic Games) Underwater divers in the film. ASC has explored live motion capture to help bring the divers to life. (Image courtesy Epic Games) Men In Black character Mikey, in reference maquette form, was one of the characters Aaron Sims worked on at Rick Baker’s Cinnovation for the 1997 film.

“I thought, ‘Let’s do the most difficult one first, which is water.’ We have other settings in the desert and the woods, which we have seen more of with game engines, but water is hard.”

To help plan out exactly what DIVE will look like – including what physical sets and locations may be necessary – the ASC team has been building virtual sets first and then carrying out virtual set scouting inside them. “You build it, and you can go with your team to this virtual world and go scout out your world before you go make it – we never really had that luxury in traditional visual effects,” notes Sims. “We can build just enough that you can actually start to scout it with a basic camera, almost like rough layouts, and then use Unreal’s Sequencer to base your shots, and work out what else you need to build.”

That virtual set scout resembles previs, to a degree, continues Sims, while allowing the filmmakers to go much further. “Instead of just previs’ing the effects shots, we’re previs’ing the whole thing so we can see how the film plays. It helps the story develop in a way that is harder to just imagine while you’re writing it.”

This year, ASC helped Epic Games with the launch of early access to Unreal Engine 5. The new version of the game engine incorporates several new technologies, including a ‘virtualized micropolygon geometry’ system called Nanite and a fully dynamic global illumination solution known as Lumen.



ASC’s work on a sample project called Valley of the Ancient, available with this UE5 release, showcased that new tech, along with what could be done with Unreal’s MetaHuman Creator app and what could be achieved in terms of character animation inside Unreal (these include the Animation Motion Warping, Control Rig and Full-Body IK Solver tools, plus the Pose Browser).

“I can’t speak highly enough about the tools,” comments Sims. “It’s become definitely a new medium for me. I haven’t gotten this excited since my days of starting in the ‘80s on makeup effects.

“Unreal is able to take in a lot more geo than you would expect,” adds Sims. “A lot more than you would be able to in Maya or some of these other programs, and it’s able to actually interact with the geo and even animate with the geo at a higher density than you normally would be able to.”

One aspect of the real-time approach Sims is keen to try out with DIVE and the related survival films is in shooting scenes, especially with LED volumes. Typically, volumes have served as essentially in-camera background visual effects environments or set-piece backgrounds, or for interactive lighting. ASC is exploring how this might extend to some kind of creature interaction, since the films will feature a selection of monsters.

“It’s still early days,” says Sims, who notes the different approaches they have considered include pre-animating, live animation and live motion capture. “We’re R&D’ing a lot of this process because, in terms of what’s on the LED walls, we’re still working out, how much can you change it? People are of course used to being able to be on set with a puppeteered creature that interacts with an actor and lets you do improv. Right now, we’re trying to create tools to be able to do that.”

While he is immersed in the world of game engines right now, Sims is well aware that there are many developments that still need to occur with real-time tools. One area is in the ability to provide ray-traced lighting in real-time on LED volumes. Because of the intensive processing required, fully photoreal backgrounds on LED walls are still often achieved with baked-in lighting. “This means you’re not getting the exact same interaction that you would if it was ray-traced in real-time,” says Sims. “But with things like Lumen coming, it’s likely we’ll get even more interactive lighting that doesn’t need to be baked.”

TOP TO BOTTOM: A look at a DIVE spider creature in an Unreal Engine scene. (Image courtesy of Epic Games) Working out how to realize underwater environments for DIVE has involved significant experimentation from ASC. The underwater world of DIVE will be populated with sea creatures – large and small – and divers. Test rendering of one of the strange underwater creatures to be featured in DIVE.





TOP TO BOTTOM: For the early release of Unreal Engine 5, ASC was involved in crafting the Valley of the Ancient demo project featuring a character called the Ancient One. (Image courtesy of Epic Games) One of the other new tools used by ASC for the Valley of the Ancient animation was the Full-body IK (FBIK) Solver in Unreal Engine. (Image courtesy of Epic Games) The Ancient One’s animations were authored by ASC in the Unreal Editor using Control Rig and Sequencer. (Image courtesy of Epic Games)

Sims also mentions his experience with the effects simulation side of game engines (in Unreal Engine the two main tools here are Niagara for things like particle effects and fluids, and Chaos for physics and destruction). “There’s a lot of great tools in Unreal for creating particle effects and chaos or destruction. But a lot of times, you’re still having to do stuff outside, say in Houdini, and bring it in. I’m hoping that that’s going to change.”

For DIVE’s underwater environments, in particular, ASC has been experimenting with all the different kinds of water required (surfaces, waves and underwater views). “We’re also creating our own tools for bubbles and all that stuff, and how they’re interacting, clumping together and forming the roofs of caves – all in-engine, which is exciting. I think it’s just the getting in and out of the water that’s the most challenging thing.”

Despite the push for real-time, Sims and ASC of course continue to work with traditional design and VFX tools, and they are also maintaining links to a practical effects past for DIVE. Makeup effects are still a key part of the film. “If there’s a reason to do practical, I say still do practical,” states Sims. “In certain situations, the hybrid approach is usually the most successful because of the fidelity of 4K and everything else that you didn’t have back in the ‘80s. You could get away with grain and all of that stuff back then.”

Still, Sims is adamant that real-time has changed the game for content creation. He says that right now there’s a great deal of momentum behind creating in real-time, from the software developers themselves through to all levels of filmmakers, since the technology is becoming increasingly accessible, and something that can be done almost on an individual level. “I mean, you’re seeing all the results instantly. So, as you’re tweaking it, you’re not accidentally going back and doing it again because you forgot that you did something before because you’re waiting for the rendering process.

“And that’s just the rendering part,” notes Sims. “The camera, the lighting, the animation all being real-time is another component that makes it just so much more powerful and exciting as a filmmaker, but also a visual effects artist.”
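The bubble behavior Sims describes earlier – bubbles rising, interacting and collecting under a cave roof – can be sketched as a very small particle simulation. The following is a toy illustration of that idea in plain Python, not ASC’s in-engine tooling; the buoyancy, drag and merge-radius values are arbitrary assumptions.

```python
import random

class Bubble:
    def __init__(self, x, y, r):
        self.x, self.y, self.r = x, y, r   # position and radius
        self.vy = 0.0                      # vertical velocity

def step(bubbles, ceiling_y=10.0, dt=0.1, buoyancy=2.0, drag=0.5, merge_dist=0.3):
    """Advance the toy sim: bubbles rise, stop at the cave roof, and
    nearby bubbles merge into bigger ones (cross-section area is conserved)."""
    for b in bubbles:
        b.vy += (buoyancy * b.r - drag * b.vy) * dt   # bigger bubbles rise faster
        b.y = min(b.y + b.vy * dt, ceiling_y)         # clamp at the cave roof
    merged, used = [], set()
    for i, a in enumerate(bubbles):
        if i in used:
            continue
        for j in range(i + 1, len(bubbles)):
            c = bubbles[j]
            if j not in used and abs(a.x - c.x) < merge_dist and abs(a.y - c.y) < merge_dist:
                a.r = (a.r ** 2 + c.r ** 2) ** 0.5    # combine areas
                used.add(j)
        merged.append(a)
    return merged

if __name__ == "__main__":
    random.seed(1)
    bubbles = [Bubble(random.uniform(0, 2), 0.0, random.uniform(0.05, 0.2)) for _ in range(50)]
    for _ in range(200):
        bubbles = step(bubbles)
    at_roof = sum(1 for b in bubbles if b.y >= 10.0)
    print(f"{len(bubbles)} bubbles remain, {at_roof} resting against the cave roof")
```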



COVER

DIRECTOR DENIS VILLENEUVE MAINTAINS A GROUNDED APPROACH TO EPIC SPACE SAGA DUNE By KEVIN H. MARTIN

Images courtesy of Warner Bros. Pictures and Legendary Pictures. TOP AND OPPOSITE TOP: The Atreides’ ornithopters incorporate the articulated wings described in Frank Herbert’s novel. Production opted for the flapping wings to flutter like an insect in flight. BOTTOM: The villainous Harkonnens arrive at Arrakeen, their ships offering a distinct contrast to the Atreides’ Ornithopters in both configuration and flight mode.

After failing to secure interest from 20 book publishers in his epic Dune, novelist Frank Herbert managed to convince the car manual gurus at Chilton Press to get his book into print in 1965. A science fiction novel dealing with concepts ranging from the nature of messiahs to ecological concerns, Dune rapidly burgeoned from underground cult classic to a huge and sustained success worldwide. Even so, the property resisted adaptation to the big screen for some time, despite attempts by Arthur P. Jacobs, Alejandro Jodorowsky and Ridley Scott. Ultimately a theatrical version directed by David Lynch debuted, but proved to be a critical and financial failure, foiling partially-written sequels.

Filmmaker Denis Villeneuve was a fan of Herbert’s landmark tome, and when the possibility of a new and decidedly more faithful adaptation of Dune arose, he committed to it, bringing along with him a number of his past collaborators, including Production Designer Patrice Vermette, Editor Joe Walker, Special Effects Supervisor Gerd Nefzer and Visual Effects Supervisor Paul Lambert.

An Oscar-winner for Villeneuve’s Blade Runner 2049, Lambert recalls that even before pre-production began in earnest, an enormous design phase had been undertaken. “Dune is such a massive project, with so many worlds to visualize,” he acknowledges. “So Denis and Patrice spent the better part of a year coming up with concepts. Usually that process just serves as a springboard for the final designs, but Denis was so happy with the initial artwork that it really served as a template for other departments. The work was just so strong, with such defined ideas – there were some tweaks down the line, but really it was very close. Sets were built with these same colors and textures, and visual effects created and executed these images in a similar faithful way.”

While the Lynch film had featured rather grand and ornate production design by Tony Masters, this project reflected the filmmaker’s own tastes. “Having worked with Denis previously, I knew that things had to remain grounded,” confirms Lambert.



“No wild pullbacks and moves that didn’t have some basis in how cameras are actually moved, and we tried to build as much as possible on most of our sets rather than encasing actors within a blue box. Trying to keep things grounded, we didn’t go for a big showy effect. We tried to avoid anything too FX-y, even when the ships fold space. In fact, the shield effect is based on past and future frames in the same shot, plus some color changes, so it isn’t like we’re adding or inventing, just using what was actually there. While there wound up being around 2,000 visual effects shots, realism was the word throughout.”

While the majority of work was handled at DNEG Vancouver and DNEG Montreal, an in-house crew was also formed for the production. “I think it is very valuable to have offices close to the director,” says Lambert, “as it facilitates the fleshing out of ideas, which in turn affects and improves the shooting process. The in-house unit wound up doing hundreds of the simpler shots, plus we did a few shots with Rodeo FX up in Montreal. They have Deak Ferrand, a fantastic artist who is one of the fastest conceptualists I’ve ever seen. He’s also a good friend to Denis, having worked on Arrival and Blade Runner 2049 as well.”

When production began in Budapest, Lambert assigned a scanning crew from OroBlade to LiDAR parts of Wadi Rum, where later location shooting would take place. “That helped with the wides of Arrakeen,” he reports. “Our beautiful wides of the city are actually from helicopter views scanned into the computer. We built that CG world around those real terrain elements.”
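The shield treatment Lambert describes – building the effect out of past and future frames of the same shot plus some color changes – can be sketched roughly as below. This is an illustrative reconstruction in Python/NumPy based only on that description, with an arbitrary frame offset and color weighting, not the production’s actual compositing setup.

```python
import numpy as np

def shield_frame(frames, t, offset=3, mask=None):
    """Build a shield-style treatment for frame t out of neighboring frames.

    frames: float RGB sequence in [0, 1], shape (N, H, W, 3).
    t:      index of the current frame.
    offset: how far into the past/future to reach.
    mask:   optional (H, W) matte restricting the effect to the shield area.
    """
    n = len(frames)
    past = frames[max(t - offset, 0)]
    future = frames[min(t + offset, n - 1)]
    current = frames[t]

    # Re-tint the temporal neighbours (cool past, warm future) and mix them
    # back over the current frame, so only imagery that was really in the
    # shot is used.
    tinted = 0.5 * past * np.array([0.6, 0.8, 1.2]) + \
             0.5 * future * np.array([1.2, 0.8, 0.6])
    tinted = np.clip(tinted, 0.0, 1.0)

    if mask is None:
        mask = np.ones(current.shape[:2], dtype=np.float32)
    m = mask[..., None]
    return np.clip((1.0 - 0.6 * m) * current + 0.6 * m * tinted, 0.0, 1.0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    seq = rng.random((10, 72, 128, 3)).astype(np.float32)
    out = shield_frame(seq, t=5)
    print(out.shape)
```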

For backlot work in Budapest, sand-colored screens were used instead of blue or green, allowing a more naturalistic Arrakis-desert coloration to reflect onto the performers. Natural light was employed whenever possible. “For some massive interiors, we had to come up with creative ways to light the scene,” Lambert reveals. “We connected two separate studios with a tarp across the roof, which had a specific shape cut into it allowing sunlight to come down into part of the set. We could only shoot at a certain time of day with this tarpaulin because the hole permitted sunlight to illuminate the interior just right for a couple hours only, but it really sold the reality of the environment with natural lighting beaming down into our stage sets.”

Close collaboration between VFX, camera and art departments led to a unique approach for interiors requiring set extensions. “Because of the scale involved, the question was often, ‘how high do we build this?’ Traditionally, you would have green or blue up above, but that compromises the lighting for the whole view. I suggested that we consider what the actual tones and shapes would be going up in frame when we did the VFX. So during the shoot, we’d place a particular colored form up there. If there were structural elements, like crossbeams, that were supposed to be up there, we’d represent them simply as well. DP Greig Fraser loved this, because he was able to light it like he would a fully-built set. We’d build a very cheap version, through which he could light as he desired.”




TOP TO BOTTOM: A 400-ton crane was used to fly the larger 13-ton ornithopters in the desert. For landings, separate concrete pads had to be poured for the 12-meter mockup and its somewhat smaller cousin. CG wings were added for takeoffs and landings.

For scenes involving the many aircraft seen in the film, a mix of full-scale vessels and CG craft was created. “Seen from a distance, the ornithopters look like insects,” says Lambert, “but as you get closer in, you can feel the power of these flapping wing aircraft. The Atreides’ dragonfly-like craft is a thing apart from the others. When the Harkonnens strike at Arrakeen, their ships come down looking like they have inflatable wings, and sport a very intricate and detailed look. For some of the shots of full-scale craft shot on location, we might add CG wings, but most of that was actually captured in-camera.”

SFX Supervisor Gerd Nefzer had his teams begin prepping long before shooting in order to have time to transport huge containers holding the ornithopters and supporting equipment to Aqaba. “The containers for Jordan arrived slightly late, so that made it very tough to get everything ready for each shooting day,” Nefzer acknowledges. “It was a huge challenge to get the ornithopters ready. Altogether we had two large ornithopters and a smaller one on location in the desert. The big one weighed maybe 13 tons and was 10 or 12 meters long, so making it look as if it could maneuver took a lot of power. We had one scene with the big ornithopter landing in the desert and then taking off. That needed a 400-ton crane, so a special road had to be built to accommodate transport.”

Nefzer used various gimbals, plus an overhead flying rig hung from a crane. “Our two computer-controlled motion bases had to function on location,” he says, “because our DP needed real sunlight and could not create that look on stage. That meant pouring a concrete pad for the thopter. Then we had to invent a drive file in order to maneuver the ship to follow the path of the sun in the sky for continuity. The large base was rated for 20 tons and the smaller thopter used a six-ton base for sandstorm scenes. We tented the smaller base with wind machines and tons of dust flying around in there, including soft stones, so that the view through the cockpit would really look like it was flying through a sandstorm. It could rotate 360 degrees so that the actors could be upside-down. We had four six-wheel drive trucks from the military so we could mount and drive our big wind machines around into position. As always, Denis tries to get as much as possible in-camera, but on some very wide shots, there just weren’t enough machines. But the part of the shot that we could handle practically – the area around the actors, because you get such a greater sense of believability with performers having to lean into the dust as they move instead of just pretending against greenscreen – was the basis for how VFX went about completing things, matching to the look and density of our sandstorm. Paul Lambert always asked for as much practical as possible to have something to match – ‘even if it is just a few meters’ he said one time – and then he filled out the rest of the shot beautifully. I think this is the right way to make movies these days, instead of just automatically.”

To convey the sense that the mockup was really up in the air, the team used an idea also employed on The Rocketeer, choosing a high-up locale that allowed cameras to shoot down on the airborne object. “We found the tallest hill in Budapest and put a gimbal atop it,” says Lambert. “That gave us a horizon. We surrounded the gimbal with what we affectionately called the dog collar, a 360-degree ramp that went all the way around the gimbal, again sand-screen colored.
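Nefzer’s ‘drive file’ for keeping the mounted ornithopter aligned with the sun is, in essence, a table of orientations over time. Below is a minimal, hypothetical sketch of generating such a file in Python by interpolating between a few measured sun positions; the sample azimuth/elevation values, time range and CSV layout are invented for illustration and are not the production’s actual data or format.

```python
import csv

# Hypothetical measured sun positions for the shoot day (local hour ->
# azimuth/elevation in degrees). Real values would come from a sun
# calculator or an on-set survey.
SUN_SAMPLES = {
    9.0:  (110.0, 25.0),
    12.0: (175.0, 55.0),
    15.0: (240.0, 35.0),
}

def sun_at(hour: float):
    """Linearly interpolate azimuth/elevation between the measured samples."""
    times = sorted(SUN_SAMPLES)
    hour = min(max(hour, times[0]), times[-1])
    for t0, t1 in zip(times, times[1:]):
        if t0 <= hour <= t1:
            f = (hour - t0) / (t1 - t0)
            a0, e0 = SUN_SAMPLES[t0]
            a1, e1 = SUN_SAMPLES[t1]
            return a0 + f * (a1 - a0), e0 + f * (e1 - e0)
    return SUN_SAMPLES[times[-1]]

def write_drive_file(path: str, start: float, end: float, step_minutes: float = 5.0):
    """Write a simple CSV of gimbal yaw/pitch values that track the
    interpolated sun path, so the mockup's lighting stays continuous."""
    with open(path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["hour", "gimbal_yaw_deg", "gimbal_pitch_deg"])
        hour = start
        while hour <= end:
            az, el = sun_at(hour)
            writer.writerow([round(hour, 3), round(az, 2), round(el, 2)])
            hour += step_minutes / 60.0

if __name__ == "__main__":
    write_drive_file("ornithopter_drive.csv", start=9.0, end=15.0)
    print("wrote ornithopter_drive.csv")
```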



“Rather than extracting the characters and then matting behind them, we were able to just augment the properly-hued backgrounds using some of the tremendous aerial photography done earlier.”

TOP TO BOTTOM: DNEG Vancouver and DNEG Montreal handled the majority of Dune’s 2,000 shots, though an in-house crew took on additional work, as did Rodeo FX. Visual challenges ranged from epic ground battles to massive space-going vessels.




TOP: A year of close collaboration between director and production designer resulted in conceptual art so detailed that it could be used as a guide for construction by SFX, VFX, costuming and art departments. The Arrakeen city conveys both an expansive and realistic lived-in look. BOTTOM: Terrain surrounding the city of Arrakeen was based on scanned-in aerial plate photography from the Middle East location shoot. OPPOSITE TOP: Partial sets erected on the Budapest backlot were often augmented with CG set extensions. OPPOSITE MIDDLE AND BOTTOM: The worm ingests both vehicles and unfortunate humans. The sandworm design featured a mix of textures that included armor-like plating and fleshy segments, which facilitated a range of accordion-like articulations.

“The worm sifting through sand was difficult to get right. A lot of the time, we were seeing more of the effect of the worm’s passage than the worm itself, as there’s not much existing reference for things that big in the desert. I did request that production detonate some explosions out there to see how the sand gets disturbed, but given that we were shooting in the Middle East, that kind of thing could be mistaken for an attack and was frowned upon!”

Time-intensive renders for the sims used to produce the organic sand passages were just the nature of the beast for the sandworm scenes. “Even so, there’s often some aspect with simulations that doesn’t look quite right,” allows Lambert. “So occasionally there is paint involved, since otherwise there’d be days and days more waiting to run the simulation again. Fortunately, we started worm development straightaway, so there was time.”

To sell the disturbance in live-action scenes, Nefzer built a 12-ft. square plate atop a vibrator. “When it was activated,” Lambert reveals, “you would actually start sinking into the sand. We used that for shots of Paul (Timothée Chalamet) and Gurney (Josh Brolin) when the worm attacks the sand crawler. This largely in-camera solution was what we then copied to extend out beyond the foreground to the rest of the frame, whenever they went wider.”

Conspicuous consumption of the local spice has the side effect of turning eyes blue over time. “The eyes were handled by our in-house unit,” says Lambert. “The attention of a viewer automatically gravitates to the eyes, so when you have something very pronounced, it can become very distracting.



“We didn’t want super-glowy eyes, and strove to keep something of the original tones of the actors’ actual eyes. So if you had one actor in shot who had brown eyes and another who had lighter-colored eyes, the blue effect would be customized to look different for each of them. It wasn’t about a covering effect from the sclera to the iris either; there was a lot of fine detail, and it took a while before Denis approved an overall look with which we could proceed. Then, on occasion, the matter of going too subtle became an issue, because the look might be minimized depending on the camera angle and how light struck the eye. Denis did point out a few times when he couldn’t actually tell that the character was supposed to have the blue eyes, so we adjusted in those instances. Paul and Jessica (Rebecca Ferguson) are in transition for a large section of the movie, so the color had to change as the spice affected them more and more.”
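As a loose illustration of the kind of per-actor customization Lambert describes, the sketch below pushes an eye region toward blue while holding on to some of the performer’s original iris color, with a ‘spice’ parameter that can be ramped over the course of the story. The blend weights and color values are arbitrary assumptions, not the in-house unit’s actual approach.

```python
import numpy as np

SPICE_BLUE = np.array([0.35, 0.55, 0.95])  # assumed target 'spice blue'

def spice_eyes(iris_rgb: np.ndarray, spice: float, keep_original: float = 0.35) -> np.ndarray:
    """Blend an iris color toward blue as spice exposure increases.

    iris_rgb:      float RGB pixels of the iris region in [0, 1].
    spice:         0.0 (untouched) to 1.0 (fully transitioned).
    keep_original: how much of the actor's own hue survives at full spice,
                   so brown-eyed and light-eyed performers still read differently.
    """
    spice = float(np.clip(spice, 0.0, 1.0))
    target = keep_original * iris_rgb + (1.0 - keep_original) * SPICE_BLUE
    # Preserve the original brightness so the eyes don't go 'super-glowy'.
    luma = iris_rgb.mean(axis=-1, keepdims=True)
    target = target * (luma / np.maximum(target.mean(axis=-1, keepdims=True), 1e-6))
    return np.clip((1.0 - spice) * iris_rgb + spice * target, 0.0, 1.0)

if __name__ == "__main__":
    brown = np.full((2, 2, 3), (0.35, 0.22, 0.12))
    light = np.full((2, 2, 3), (0.55, 0.60, 0.65))
    for name, iris in (("brown", brown), ("light", light)):
        for s in (0.0, 0.5, 1.0):
            print(name, s, spice_eyes(iris, s)[0, 0].round(3))
```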

Looking back on the project, Lambert still marvels at the opportunity. “When I was first contacted, there was no hesitation on my part,” he laughs. “I mean, we’re talking about Dune here! And in the hands of someone like Denis, I have a very good feeling about how audiences will really feel transported to these worlds.”



PROFILE

THE PROGRESSION OF DESIGN WITH OSCAR-WINNING VISIONARY DOUG CHIANG By TREVOR HOGG

Images courtesy of Doug Chiang. TOP LEFT: Doug Chiang, Vice President and Executive Creative Director, Star Wars, ILM TOP RIGHT: A photo of Doug Chiang taken before his family moved from Taiwan to Michigan. BOTTOM: The Chiang family in Michigan in 1968. From left: James, Doug, Sidney, Lisa and Patricia. OPPOSITE TOP: Along with recreating the world-building of Star Wars, Chiang had to factor in health and safety measures when conceptualizing the Star Wars: Galaxy’s Edge theme parks.

Even though the Coronavirus pandemic confined Doug Chiang to home for most of 2020, for the Vice President and Executive Creative Director, Star Wars at Lucasfilm, there was not a lot of family time. “I still couldn’t talk to them much because I was busy!” This is not surprising as he is the visual gatekeeper of the expanding Star Wars franchise, whether it be films and television shows, games, new media or theme parks. The space opera did not exist in 1962 when Chiang was born in Taipei, Taiwan. “We moved to Dearborn, Michigan when I was about five years old. I remember the home we grew up in Taiwan, the kitchen and bedrooms. Before getting into bed, we had to wash our feet because of the dirt floors. We actually had pigs in the kitchen.” Arriving in Michigan during the middle of winter introduced the five-year-old to snow and the realization that he didn’t quite fit in, as Asian families were a rarity in Dearborn and Westland where Chiang attended elementary school and high school. “Our parents always encouraged us [he has an older brother Sidney and younger sister Lisa] to assimilate as quickly as possible, but as a family we were still culturally Chinese. I was your classic Asian nerd. I didn’t talk a lot, was quiet and looked different. I was picked on a lot, but the saving thing for me was that I quickly developed a reputation for being the class artist. I found that as a wonderful escape where I could create my own worlds and friends, so I drew a lot.”



A passion for filmmaking developed when, as a 15-year-old, Chiang saw the film that would launch the franchise he is now associated with. “Star Wars had a huge impact on my generation. That completely defined my career goals in terms of what I wanted to do. I was starting to learn about stop-motion animation. When I saw The Making of Star Wars with Phil Tippett doing the stop-motion for the chess game on the Millennium Falcon, it all connected together. That’s when I went down to the basement of our home and borrowed my dad’s 8mm camera, tripod and lights, and started to make my own films. All trial and error. You’re discovering things by accident. I love that aspect of it.”

A newspaper ad prompted the aspiring cinematic talent to enter the Michigan Student Film Festival. “I won first place and that gave me a lot of encouragement. It gave me connections to one of the founders, John Prusak, who would loan me professional tools. That became the start of my quasi-filmmaking education. When I came out to UCLA for film school it was a lot more of the same. A lot of it was self-driven. One of my experimental animations was called Mental Block. I made this film in 10 weeks, where I took over the little dining room of our dorm. I entered it into the Nissan FOCUS Awards and got first place. Winning a car as a sophomore was great! My goal was to direct films, but I realized that everybody in Los Angeles wants to do that, so there’s virtually no chance. I could do storyboards well so that was going to be my foot into the industry. One of my first jobs out of film school was doing industrial storyboards for commercials.”

A fateful job interview took place at Digital Productions, which was one of three major computer graphics companies in the 1980s. “I was hired on the spot to design and direct computer graphics commercials. That was my first introduction of combining film design and art direction with this new medium called computer graphics.”




TOP TO BOTTOM: Chiang made a series of short films with Star Wars being a significant influence. In 1981 Chiang won first place at the Michigan Student Film Festival and went on to film school at UCLA. At Skywalker Ranch conceptualizing Star Wars: Episode I - The Phantom Menace in 1995.

Joining ILM was an illuminating experience for Chiang. “I always thought of the design process as one seamless workflow. ILM was different in that we were doing post-production design, so the art department was specifically for that. The films that I worked on, like Ghost, Terminator 2: Judgment Day and Forrest Gump, were all post-production design. While I was at ILM it started to evolve where we could participate in the pre-production design. It wasn’t until I started working with George Lucas when he hired me to head up the art department for the prequels in 1995 that I realized that was the way George had done it all along back with Joe Johnston and Ralph McQuarrie.

“I was one of the first people onboard while George was writing Star Wars: Episode I - The Phantom Menace,” Chiang continues. “I had been learning Joe Johnston and Ralph McQuarrie, and got their style down. When he told me that we were going to set the foundation for all of those designs, and aesthetically it is going to look different, that threw me for a loop because I felt like I was studying for the wrong test. My goal was to give him the spectacle that he wanted without any of the practical limitations. There were enough smart people at ILM like John Knoll to figure all of that out. It was world building and design in their purest form. I remember it terrified ILM because they hadn’t developed anything of that scale. The Phantom Menace was the biggest film at that time at ILM with miniature sets. There was a huge digital component, but that was mostly for the characters.”

In 2000, Chiang established DC Studios and produced several animated shorts based on the illustrated book Robota, co-created with Orson Scott Card, which takes place on the mysterious planet of Orpheus and was inspired by his robot drawings.


He would then co-found Ice Blink Studios in 2004 and carry on his collaboration with another innovative filmmaker, Robert Zemeckis, which resulted in Chiang receiving an Oscar. “Death Becomes Her was fascinating because the story is about the immortality potion, so our main characters can’t die. It was figuring out, ‘How do you achieve that?’ Real prosthetics can only go so far, and we had never achieved realistic skin in computer graphics before. It was a huge risk to try to combine the two. Forrest Gump was all about subtle adjustments to the reality of the world to create powerful dramatic images. Bob’s films can be complete spectacle like The Polar Express, and you have to lean into that sensibility. Bob doesn’t have a specific filmmaking style. He goes with what works best for his storytelling.”

Zemeckis developed such a strong level of trust in Chiang that he hired him to establish ImageMovers Digital, a ground-breaking performance-capture animation studio. “It was almost a perfect collision with all of my experiences, because I had that strong computer graphics experience working with Digital Productions, I had the strong foundation with ILM for post-production design, and then I had the pre-production design with George Lucas. ImageMovers Digital became this wonderful test-case where we had an opportunity to build a new company from scratch and create a new artform. We were taking a huge risk because the difficulty of that challenge equated to bigger budgets because of the sheer amount of time and the number of people that it takes. Literally, tools were being written as we were going. Those four years were a highlight of my career. It was a culmination of all that history of learning, having Bob as the visionary to drive all of that, and the support of Disney. It was sad for me [when it was closed by Disney in 2010] because we were right at the point of breaking through to having the right tools to make that big transition for success.”

CLOCKWISE: A big thrill for Chiang was being hired by George Lucas to be the visual effects production designer for the Star Wars prequels. A showcase of the different tools that Chiang utilizes to produce his illustrations. Chiang was able to meet his film idol, Ray Harryhausen. Chiang collaborating with Model Supervisor Steve Gawley in 1997 on Star Wars: Episode I - The Phantom Menace.

TOP TO BOTTOM: A dramatic action sequence in Star Wars: Episode I - The Phantom Menace is the pod race. On the set of The Mandalorian making sure that everything is visually correct. A detailed cockpit layout for a Naboo ship that has an artisan aesthetic.

Production on The Force Awakens saw Chiang join Lucasfilm to shepherd the expansion of the Star Wars universe. “Working with George Lucas for seven years, we established a logic in terms of how designs evolve in the Star Wars universe. It’s marrying the design history to our actual history. The prequels are in the craftsman era – that’s why the designs from Naboo are elegant and have sleek artforms. The shapes in the original trilogy become more angular and look like they came off of an assembly line. George always considered Star Wars to be like a period film where we do all of this homework and only 5% of it ends up onscreen, but all of that homework informs that 5%.”

Believable fantasy designs need to be 80% familiar and 20% alien, Chiang learned. “I learned specific guidelines from George. When you design for the silhouette, draw it as if a kid could draw it. The other one is designing for personality. Certain shapes are very emotive, and if you can design with that in mind, and on top of that put in color and details, it’ll be more successful. The way your brain works is that you’ll see the shape first which will tell you right away, ‘Is it friendly or bad, and what it’s supposed to do.’ In the end, the details don’t inform the design but can make it better.”

Conceptualizing and constructing the Star Wars: Galaxy’s Edge theme parks situated in Disneyland and Disney World has been a whole other level of design for Chiang. “Film sets exist for weeks and we tear them down while theme parks exist for years, so the materials have to be real and there is no cheating. On top of that, you layer in the whole health and safety component because you’re going to have people wandering through these environments unsupervised. We’re trying to layer in what makes Star Wars special.


For instance, the whole idea of no handrails is a real thing in Star Wars. You obviously can’t do that. The aesthetic part was just as challenging because we wanted to create a design that fits seamlessly with our films. One of the things that I didn’t realize is the sunlight quality in Florida is different than in Anaheim. You have to tune it to take into account the cooler light in Florida. On top of that, there are hurricanes in Florida whereas you have earthquakes in Anaheim. The WDI engineering team are amazing artists in their own right.

“There was a period of time where films became CG spectacles because the audience hadn’t seen anything like that before,” observes Chiang. “What I find interesting now is that got boring because it became too much sugar. Now I see the pendulum swinging back to where, what is the best technique for the film? One of the great things about working with Jon Favreau on The Mandalorian is precisely that. I don’t know what the younger generation will think because they have grown up with video games where they’re completely immersed in digital spectacle.”

Virtual production works best with filmmakers who know exactly what they want, Chiang explains. “What’s different about what we’re doing now with StageCraft and the volume is we’re bringing a good percentage of post-production work upfront so it involves all of the department heads collaborating to create a seamless process. Virtual production is transforming filmmaking into a fluid process, which is in some ways what computer graphics did when it first became available as a tool. The world building has to be complete before we actually start photographing the film. But it’s not unlike what I was doing with George. When I think back to what George, Robert Zemeckis and Jon Favreau were doing, they’re of the same mind. We were all trying to create a technique to better tell stories and to make the most efficient process to create visual spectacle onscreen.”

CLOCKWISE: The Mandalorian duo of Din Djarin and Grogu aka Baby Yoda through the eyes of Doug Chiang. For Chiang, the key to creating effective concepts, such as for the Monopod, is choosing shapes that can be quickly understood by the audience. A Ravinak that inhabits the frigid planet of Maldo Kreis attacks the Razor Crest. Chiang examines the Pod Hanger miniature constructed for Star Wars: Episode I The Phantom Menace.


ANIMATION

CREATIVE CO-EXISTENCE AT THE INTERSECTION OF ANIMATION AND LIVE-ACTION By TREVOR HOGG

TOP: Part of the comedic appeal of Monty Python’s Flying Circus was the surreal cut-out animation segments created by Terry Gilliam. (Image courtesy of Python (Monty) Pictures Limited) OPPOSITE TOP: Originally, The Drowned Giant was meant to be a live-action short by Tim Miller, before landing on Netflix. (Image courtesy of Netflix) OPPOSITE BOTTOM LEFT: Robert Legato used live-action camera techniques when making The Jungle Book with Jon Favreau. (Image courtesy of Walt Disney Pictures) OPPOSITE BOTTOM RIGHT: Tim Miller was conscious of poses and silhouettes when composing shots for Terminator: Dark Fate. (Image courtesy of Paramount Pictures)

The line between animation and live-action has blurred so much, with visual effects achieving such a high fidelity of photorealism, that cinema has become a seamless hybrid of the two mediums. This is an achievement that British filmmaker Terry Gilliam (12 Monkeys) finds to be more problematic than creatively liberating. “If you’re watching Johnny Depp on the yardarm of a four-masted pirate ship sword fighting somebody else, there’s no gravity involved! Tom and Jerry and the Roadrunner understood gravity. Modern filmmaking is so artificial now. It’s not real people in real situations.” Gilliam began his career creating animated segments for Monty Python’s Flying Circus. “I was limited by the amount of time and money that I had to do what I did, so it became cut-outs.” Animation at its best is showcased in Disney classics Pinocchio and Snow White and the Seven Dwarfs, he says. “I love the animation done by the Nine Old Men [Walt Disney Productions’ core animators, some of whom later became directors] because it was such hard work and required an incredible understanding of people and animals.

“The problem is that live-action and animation are becoming one and the same,” observes Gilliam. “As George Lucas got more successful and had more money, it became more elaborate. Rather than have three spaceships flying around you could have a thousand. It becomes abstract at that point. You don’t have what you had when two people are fighting to the death.”

Live-action is where Gilliam will remain. “People keep asking me why don’t I make another animated film, but I don’t want to. I like working with actors who bring their own view of the world to the work. It’s not like working with other animators who you are directing to do this and that the way you want it. I want other people to be part of the process and take me out of my limitations and show me different ways of looking at how a scene or a character could be played.”


“Animation is, ‘Here’s the shot, let me animate within it and make that work.’ It literally only works for that angle whereas mine works for all angles, but I pick the one that looks the coolest.” —Robert Legato, ASC, Visual Effects Supervisor

Even after making Deadpool and Terminator: Dark Fate, Tim Miller and his visual effects company, Blur Studio, have continued to produce and create animated shorts, in particular for the Netflix anthology Love, Death + Robots. “When you’re in the animation business you are a student of motion. You’re always trying to carve away at the artificiality of what you’re doing to find the reality that feels natural. I do the same thing in film,” says Miller.

“I just use people instead of digital characters. I’m conscious of poses and silhouettes, the kinds of things that animators would think about. All of the time I would walk on set during Terminator: Dark Fate and say to Mackenzie Davis, ‘I need you to drop the shoulder and chin because it looks tougher. Stand a little contrapposto or three quarter because it makes a better silhouette.’ Coming from animation helps with making the transition to a film set because it’s working with groups of artists and having them not hate you.

“I rely heavily on previs and storyboards which allow me to figure out mistakes before they get expensive or impossible to correct,” explains Miller.


“The problem is that live-action and animation are becoming one in the same. As George Lucas got more successful and had more money, it became more elaborate. Rather than have three spaceships flying around you could have a thousand. It becomes abstract at that point. You don’t have what you had when two people are fighting to the death.” —Terry Gilliam, Filmmaker

TOP THREE: Progression imagery for the scene when Violet creates a forcefield to shield her family from an explosion in The Incredibles 2. (Images courtesy of Disney/Pixar) BOTTOM: Michael Eames enjoyed collaborating with filmmaker Spike Jonze on Where the Wild Things Are. (Image courtesy of Warner Bros. Pictures)

“Every filmmaker will tell you that there is that special panic that you have on the day [you shoot] because you know no matter what your budget is you’re never going back to this place where you are. In animation you have this luxury of being able to return to any location to redo any shot you want, no matter where in the process you are. That’s a hard limitation to get used to if you’re making the transition from animation to live-action.”

Miller references some things that need to be kept in mind when making the transition from live-action to animation. “You have to be able to recognize enough about your intentions, because it’s a while before you get to see the final shot. On the flipside, you’re not stuck in the continuity that you shot.” All of the high-end animation and visual effects companies have some level of customized commercial software like Houdini, Maya and 3ds Max, he points out. “Then you have Blender, which is free and open source and the whole animation community can contribute to it, and that is a game-changer.”

Collaborating with Brad Bird on both animated and live-action projects such as The Incredibles and Mission: Impossible – Ghost Protocol is Rick Sayre, Supervising Technical Director at Pixar Animation Studios. “Brad is a special case because he had some live-action commercial experience, but is also keenly aware of cinematography and production design, all of the practical production principles. The other big thing that Brad had going for him is he’s a writer-director, and that can translate quite nicely into the live-action side.” Something that a live-action filmmaker has to get used to when transitioning to animation is that everything has to be intentionally planned out, says Sayre. “The happy accident or natural interplay on set, you have to create in animation. In traditional animation it is obvious what you’re producing is a frame. But in live-action you have to be conscious of that. The audience only sees what the camera sees.”

Previs is an animation process that has been adopted by live-action. “A lot of previs in live-action has become an enriched storyboard where you’re finding angles and figuring out what is the most exciting way to shoot this action,” remarks Sayre. “Almost every giant-budget genre picture now will have scenes that are entirely animated. In that sense you have these so-called live-action films that are actually animated movies, like Gravity.” The blending of two mediums dates back to Fleischer Studios having animated characters walk through live-action sets.


“Pixilation didn’t use to mean giant pixels. It meant that you were moving something a frame at a time. That might be an actor on set appearing and disappearing with a cut or doing animation with a miniature. There is such a rich history of filmmaking being a hybrid art.”

“The reason I do previs is that I can’t draw,” admits Oscar-winning Visual Effects Supervisor Robert Legato, ASC (Hugo). “On the last film that I did with Michael Bay, I ended up doing car accident simulations. I don’t simulate for what would actually happen, but for what I want it to do. Then I figure out a clever way of shooting it as opposed to animate and plan every moment. After you piece the scene together, the shortest bridge is what you force the animation into being. People who are fluent in animation would rather animate the whole thing. I find it doesn’t work well for me because I’m steeped in live-action work. Even if the animation is probably correct, I question it because it wasn’t done with science, gravity, weight and mass, all of the various things that I factor in when I set up a gag or shoot. Animation is, ‘Here’s the shot, let me animate within it and make that work.’ It literally only works for that angle whereas mine works for all angles, but I pick the one that looks the coolest.

“What we did which I liked in The Lion King, and what Andy Jones [Animation Supervisor] did, was only make the animals do what they can do within the confines of the scene,” notes Legato. “The animals can jump and leap but couldn’t do it 50 times more than what was actually possible. You couldn’t stretch and squash the animation. You had to work within what was possible: if you could train an animal to do that, it would do that. If you get them to move their mouth with the same cadence of speech you could almost shoot it live. That was the sensibility behind it. Sometimes we wanted to invent our own version of the movie as opposed to being a direct homage. When we ran into some problems, we went back to the old one, and it turned out that they had the same problem and solved it.”

The most widely known animation principles are the 12 devised by Disney’s Nine Old Men during the 1930s and later published in Disney Animation: The Illusion of Life in 1981. Focusing on five in particular that apply to live-action is Ross Burgess, Head of Animation, Episodic at MPC. “Within feature animation, the timing of a character is easier to manage as it’s fully CG and you can re-time your camera to fit the performance. In live-action, often you must counter the animation to the timing of the plate that has been pre-shot. It’s easier to exaggerate the character’s emotions or movements in feature animation. You are not bound to the ‘reality’ of a real-life environment. In visual effects, we use exaggeration slightly differently in the way that we animate our characters or anthropomorphize our animals. It’s all about the subtlety of a character and knowing when you have broken the ‘reality.’

“Drawing has become as important in live-action features as it is in CG animated features,” continues Burgess. “We use custom software that allows the animator to draw over work in dailies to make sure that a character or creature remains ‘on model.’ Animation principles [arcs, squash and stretch] are as important in live-action, and sometimes it’s easier and quicker to draw over the work and show the animator than to trial and error with Maya. A good pose is as important in visual effects.

TOP: Where the Wild Things Are is a hybrid of practical and digital effects when it comes to the facial performances of the animals. (Image courtesy of Warner Bros. Pictures) MIDDLE AND BOTTOM: Filmmaker Brad Bird has the ability to seamlessly shift from animation to live-action, whether it be The Incredibles 2 or Mission: Impossible – Ghost Protocol. (Images courtesy of Disney/Pixar and Paramount Pictures)


“When you’re in the animation business you are a student of motion. You’re always trying to carve away at the artificiality of what you’re doing to find the reality that feels natural. I do the same thing in film -- I just use people instead of digital characters. I’m conscious of poses and silhouettes, the kinds of things that animators would think about. ... Coming from animation helps with making the transition to a film set because it’s working with groups of artists and having them not hate you.” —Tim Miller, Visual Effects Supervisor/ Owner, Blur Studio

TOP AND MIDDLE: From Where the Wild Things Are. Michael Eames views CG animation in live-action as part of the evolution of filmmaking that seeks to tell stories in the best possible way. (Images courtesy of Warner Bros. Pictures) BOTTOM: The Netflix series The Crown relies on invisible effects work produced by visual effects vendors like Framestore shown here. (Image courtesy of Netflix)

At the end of the day, we are all trying to tell the story in the easiest and clearest way possible. Clarity of pose is the most important ingredient when you’re trying to tell a story in under two hours. Appeal is everything, isn’t it? How much you can identify with or love a character is steeped in how appealing the character is.”

“A lot of the principles of animation and filmmaking boil down to having an aesthetic sensibility and a clear idea on how to best tell a story using sound and vision,” believes Michael Eames, Global Director of Animation at Framestore. “Animators often use exaggerated poses or actions in a performance to help put a particular story point across. Whether the look is stylized and perhaps extreme, or realistic and more subtle, the principle is the same – just more a matter of degree as to how you apply it. Thinking about stylized animation working in the context of live action, we recently did Tom and Jerry, which is a hybrid. We wanted to find a balance between a typically stylized, almost flat-looking classic 2D cartoon and an environment that is real. One technique we used to help marry the two worlds was to apply a directional rim light that connected to the direction of light in the real plate.”

For Eames, one of the best collaborative experiences with a filmmaker occurred during the making of Where the Wild Things Are with Spike Jonze. “They started out with a group of actors voicing the parts in the space that represented the environment. A given scene would play out in the film. Spike then selected those tracks and played them on set to the costumed performers, to drive their physical performance,” explains Eames. “The bit that didn’t work was the facial performance because it needed to be visceral and delicate. Animation came in and listened to voice recordings, watched the actors in suit, and added to the facial performance by replacing the entire face in CG. That was a perfect combination of three different things that led to one final result. When people complain about CG animation stealing away their thunder, it’s not that. It’s the evolution of filmmaking. Whatever it takes, as long as we’re serving the story in the best possible way.”


PROFILE

LENNY LIPTON: PROJECTING HISTORY THROUGH THE MAGIC LANTERN By TREVOR HOGG

Images courtesy of Lenny Lipton, except where noted. TOP LEFT: Lenny Lipton, Author/ Filmmaker/Inventor TOP RIGHT: The cover of The Cinema in Flux: The Evolution of Motion Picture Technology from the Magic Lantern to the Digital Era by Lenny Lipton (Springer). BOTTOM: Louis and Auguste Lumière invented the film camera known as the Cinématographe that also functioned as a photo developer and projector.

A poem left in the typewriter belonging to Cornell University classmate Peter Yarrow, who would in turn compose accompanying music, had a huge impact on Lenny Lipton, as the song royalties for “Puff the Magic Dragon” have provided a lifetime of financial security. “It’s a weird story but true!” notes Lipton, who embarked on a decade-long odyssey of researching and writing his homage to cinematic innovators, titled The Cinema in Flux: The Evolution of Motion Picture Technology from the Magic Lantern to the Digital Era. Lipton is a kindred spirit, being a pioneer of stereography and founding the StereoGraphics Corporation in 1980, which two decades later was acquired by Real D Cinema. The physics graduate developed the first electronic stereoscopic visualization eyewear known as CrystalEyes, which has been used for molecular modeling, aerial mapping, and to remotely drive the Mars Rovers. “I saw 3D movies and comic books as a kid in the early 1950s, which got me interested in the stereographic medium.

“When I approached Springer, I thought I had died and gone to heaven,” remarks Lipton. “I wanted a low-cost book that was one volume and had a lot of color. They sent me samples of their books that were beautiful. I found a great editor in Sam Harrison and we worked on the book together. I thought I was done, but you see the book differently when you’re starting to lay it out. No matter how smart you are there are things that you can’t envision. I kept finding things that I could fix and illustrations to be improved.


“The Cinema in Flux calls upon my years of doing what I did. I became a self-taught filmmaker and an entrepreneur. I raised a lot of money for my company; I ran it, registered numerous patents, and had a lot of experience in product development. It was almost like I was training to write this. I do empathize with the inventors and their struggles. It is a book about inventors. I don’t know how anybody becomes an inventor. You’re probably wired that way at birth.” —Lenny Lipton, Author of The Cinema in Flux

During the 18 months of working with Sam and the production department, I made 9,000 changes to the manuscript that were big and little. I’m a much better copy editor than a proofreader!”

Appropriately, a Hollywood icon known for developing and producing technology to improve the theatrical experience wrote the foreword. “I’m closer to Douglas Trumbull than most people in the world in terms of our careers. I wish that he lived in Los Angeles, but we do see each other often. We are a lot alike.” There is a personal connection to the subject matter. “That’s the beautiful thing about it. The Cinema in Flux calls upon my years of doing what I did. I became a self-taught filmmaker and an entrepreneur. I raised a lot of money for my company; I ran it, registered numerous patents, and had a lot of experience in product development. It was almost like I was training to write this. I do empathize with the inventors and their struggles. It is a book about inventors. I don’t know how anybody becomes an inventor. You’re probably wired that way at birth.”

Part of the motive to write The Cinema in Flux for Lipton was to correct a misconception about the history of cinema. “There has been a tendency in the past for film scholars to think that everything before Thomas Edison’s camera and the Lumières’ Cinématographe is prehistory. But I didn’t view it that way. The real start of what today we consider to be movies occurred in 1659 with [Dutch physicist] Christiaan Huygens’ invention of the magic lantern.

TOP: Considered to be the inventor of special effects in movies, Georges Méliès is pictured in his studio located in Montreuil, France circa 1897. BOTTOM TWO: A color frame from A Trip to the Moon by Georges Méliès is compared to a hand-colored Magic Lantern slide.


“There has been a tendency in the past for film scholars to think that everything before Thomas Edison’s camera and the Lumières’ Cinématographe is prehistory. But I didn’t view it that way. The real start of what today we consider to be movies occurred in 1659 with [Dutch physicist] Christiaan Huygens’ invention of the magic lantern. Very rapidly people learned how to produce movies and do shows.” —Lenny Lipton, Author of The Cinema in Flux

TOP TO BOTTOM: By using his trolley camera for locomotion studies Étienne-Jules Marey pioneered the technique of performance capture. Even the Lumière brothers were experimenting with 3D by creating viewing eyewear for their 35mm horizontal-traveling stereo format. A Magic Lantern created by Ernest Plank circa 1990.

Very rapidly people learned how to produce movies and do shows. I would define movies as the projection of moving images on a screen.”

The book originated during the week of Christmas in 2009. “I got invited to do a talk on stereoscopic cinema at La Cinémathèque française in Paris and was lucky enough to arrive at a moment where they were having an exhibit in their museum about the magic lantern and had demonstrations of the technology. At that time, I didn’t know I was going to write about it.

“I had no idea what the outcome would be,” admits Lipton. “The text is about 400,000 words and the book is 800 pages. I didn’t have an outline. I just headed down the road.” The historical narrative is divided into three eras: Glass Cinema, Celluloid Cinema, and Television and the Digital Cinema. “It didn’t occur to me to use that classification system until I had been working on the book for five years. I had a single Word file that was gigantic. It became a good way to write a book like this because I was able to easily search the whole file to avoid redundancies and put things in a proper order. I had one slim notebook with notes in it. I bought about 400 books. I can’t say that I understood the subject well. I needed time to digest it. I had to keep making mistakes and correcting them.


I went to the Margaret Herrick Library, which is part of the Academy of Motion Picture Arts and Sciences, and was fortunate to have access to the Society of Motion Picture and Television Engineers digital library, in particular the journals that they began publishing in 1916. It’s a primary source, and every source has a bibliography where you get to more sources. The Library of Congress has about 50 to 60 years of motion picture periodicals. I also read about glass and celluloid.

“I understand this technology and tried to explain it in my own words to the reader,” notes Lipton. “The vast majority of quotes in the book have a historical importance that gives some insight into the inventor’s process and state of mind. Even when the inventor was lying, I thought that was interesting too.” The project provided an opportunity to recognize underappreciated contributions. “For the most part my account celebrates the best-known inventors who were by prior scholars attributed correctly.” There are certain controversial figures like Thomas Edison. “The problem with Edison is that by inventing the phonograph, an electrical distribution system, a very good lightbulb, and the first motion picture camera, he invented modern entertainment, and there is no one who can take it away from him. The guy who is most overlooked is Theodore Case, who essentially invented optical sound on film as it was used for almost a century. His technology was licensed to Western Electric, which took the lion’s share of the credit.”

Some of the significant inventions were not intended to be applied to cinema. “The guys who were working on celluloid for film were thinking about photography and snapshots,” notes Lipton. “The most interesting example of a serendipitous invention that profoundly influenced cinema is Joseph Plateau and the phenakistoscope. It is like a zoetrope. It is a contraption that you can spin and has slits. You look into a slit, then into a mirror and see a moving image. That was invented in about 1832, and it had nothing to do with cinema and the projection of images. But for the next 50 years, inventors applied that discovery of apparent motion and the phenakistoscope technology to the magic lantern. Celluloid cinema, which was with us for most of the recent history of cinema, is a hybrid of the phenakistoscope and magic lantern. You can produce the illusion of moving images by properly projecting or presenting frames that are incrementally different. Another curious aspect was that Joseph Plateau was going blind when he made the discovery.”

The Cinema in Flux is an overview of the technological advancements in cinema. “It’s a sprawling subject,” observes Lipton. “Possibly any one of those chapters could have been turned into a book of this length, but I couldn’t do that. What we called cinema from the get-go included the projection of motion with sound and color. By 1790, reasonably bright and decent-sized painted color slides were being projected that had narrators, musicians and sound-effects people. Without getting into esoteric disciplines like costume design, makeup and visual effects, I thought that the broad characteristics of cinema that existed from inception were motion, color and sound. Therefore, I needed to explore the evolution of those technologies through 350 years.” The advances in technology have caused the cinematic language to evolve.

TOP TO BOTTOM: The ground floor of the West Orange, New Jersey lab belonging to renowned inventor Thomas Edison. An overlooked contributor to the technical advancements of motion pictures is Theodore Case who invented optical sound on film. Inventor Erich Kästner with his prototype of the ARRIFLEX 35 in 1937. Different ways to present movies to audiences, such as utilizing a geodesic dome rather than a traditional flat theatrical screen. Stanley Kubrick and Garrett Brown shooting The Shining with a Steadicam. Lenny Lipton has made his own contributions to stereographic projection with the ZScreen. IMAX co-founder Graeme Ferguson operating a 15 perf 70mm camera.


“The vast majority of quotes in the book have a historical importance that gives some insight into the inventor’s process and state of mind. Even when the inventor was lying, I thought that was interesting too. For the most part my account celebrates the best-known inventors who were by prior scholars attributed correctly.” —Lenny Lipton, Author of The Cinema in Flux

TOP TO BOTTOM: A phenakistoscope disc was used to create the illusion of motion from still images. A 3ality stereoscopic rig was used to shoot Dawn of the Planet of the Apes. Picture editing literally went from being a cut-and-paste process to footage being digitally shifted around inside of a computer courtesy of software programs such as Avid Media Composer.

“The most dramatic thing that I can think of is the advances in digital cinema, which has had a profound effect on motion picture production and exhibition.” The next major stage for cinema may well be the shift from audience members being passive to active participants, much like video games and virtual reality. “In the long run [that] technology will become feasible. In the short run I don’t know.”

As to whether inventors and their inventions are a product of the period in which they live, Lipton responds, “There is a current that runs through these inventors and maybe all inventors. There is an ornery tenaciousness to many inventors. Often when people invent something, their vision of how it will be deployed is different from the rest of the world’s. Edison gets credit for inventing the research lab, which is maybe one of his greatest inventions. The damnedest thing is after the research lab became part of corporate entities, they started calling inventors ‘engineers.’ In some cases, if a guy was highly qualified, he’d be called a scientist. Corporations would never call an employee an inventor, and there is a good reason for that. The term inventor implies ownership. Corporations don’t want inventors to have that. The idea of science as a separate discipline doesn’t occur until the mid-19th century. Most of the people I write about in the early days, which is to say until relatively recently, I describe as autodidacts, people who taught themselves, or polymaths, or both. Polymaths are masters of many disciplines. The idea of a speciality is a new idea. There aren’t many maverick independent inventors in the world.

“I have to tell you that if I wasn’t emotional and didn’t have strong feelings [about the subject matter], I couldn’t have written this book,” reflects Lipton. “I had to focus on the chemistry and the science because if you don’t have any of it then you don’t have movies. A lot of us in the motion picture industry don’t know the history of the technology, so I hope that the book will be a good read for them. It is a small contribution to humanity and human intelligence to be able to provide a book like this because I’m thankful for the people who wrote the books, patents and articles; I believe in that tradition. I can only thank God that I had the wherewithal to do it. If we don’t support and help eccentric people who are working on strange projects then we will be less human.”

A restored autochrome portrait of Charlie Chaplin in Hollywood, circa 1917-1918. (Image courtesy of George Eastman House/Charles Zoller/The Image Works)


FILM

GAME ON: FREE GUY VFX ANCHORS BOTH ACTUAL AND SIMULACRUM REALITIES By KEVIN H. MARTIN

Images courtesy of 20th Century Studios and Walt Disney Studios Motion Pictures. TOP: Video-game character Guy (Ryan Reynolds) squares off against an array of other gaming denizens. Guy’s growing awareness of his own nature, revealed by programmer Millie (Jodie Comer), fuels his desire to save his Free City pseudo-world from the game’s publisher (Taika Waititi). OPPOSITE TOP: Live action was accomplished in Boston between May and August of 2019. Direct Dimensions LiDAR-scanned most locations and sets during the live-action shoot, which featured extensive practical work by Special Effects Supervisor Dan Sudick.

The hero who wakes up to realize his existence is limited to a virtual or controlled-from-outside reality has become something of a staple in genre filmmaking, popularized by The Matrix and The Truman Show, to name just two such entries. While Free Guy treads similar ground, its protagonist Guy (Ryan Reynolds) is no world-class hacker or unwitting reality-show star, but just a bank teller, cheerfully plodding through a ritualized daily routine that involves his Free City surroundings – the setting for a video game of the same name – suffering massive devastation and violence. Once awakened to the situation, Guy must step up his own game to stop the program’s publisher from shutting down Free City permanently.

Director Shawn Levy has a long history with VFX-heavy projects ranging from the first two Night at the Museum installments to Real Steel and several episodes of the Stranger Things series; he has a remake of Starman in the works and intends to re-team with Reynolds on a time-travel project for Netflix. To look after the VFX end of things, he chose Visual Effects Supervisor Swen Gillberg, whose past experience spans both real-world (Flags of Our Fathers, A Beautiful Mind) and fantasy environs (Jack the Giant Slayer and a trio of recent MCU features). Gillberg joined Free Guy’s production team in December 2018, after the arrival of Production Designer Ethan Tobman. “Ethan had already done some design work on Free City,” Gillberg recalls. “But there was real evolution to things and, design-wise, it was a group effort going well into the next year to develop how things looked in this world. When our DP George Richmond came on, he had a great idea: shoot the real world on 4-perf anamorphic, but handle the gaming section with ARRI 65 and big, sharp spherical lenses to give them a very different look.”


“There’s ‘Reality,’ which is present-day, supposedly set in Seattle. Then, inside Free City, you see our gaming world – a photoreal emulation of a real-world game – which is where most of the movie and most of the VFX take place. But there are also views of a lower-res Free City, as seen on monitor screens when real-world people, including the villain [Taika Waititi], who works in a gaming company, play Free City.” —Swen Gillberg, Visual Effects Supervisor

Live-action was accomplished in Boston between May and August of 2019, with Gillberg’s old friend Ron Underdahl acting as lead for the bevy of data wranglers. “A company that I’ve used many times in the past, Direct Dimensions, LiDARed pretty much everything,” he notes. “There were HDRs for every setup and lots and lots of panels to be done, so still photography was a big aspect throughout. Special Effects Supervisor Dan Sudick was handling floor effects, so the in-camera work was very solid.” Gillberg elected to work with a total of 11 vendors: Scanline VFX, Digital Domain and ILM Singapore were primaries, aided by Halon (which shared previs duties with DD), BOT VFX, capital T, Lola Visual Effects, Mammal Studios and Raynault VFX.

Unlike Tron and most other films exploring this dual-worlds concept, Free Guy actually required three levels of reality. “There’s ‘Reality,’ which is present-day, supposedly set in Seattle,” says Gillberg. “Then, inside Free City, you see our gaming world – a photoreal emulation of a real-world game – which is where most of the movie and most of the VFX take place. But there are also views of a lower-res Free City, as seen on monitor screens when real-world people, including the villain [Taika Waititi], who works in a gaming company, play Free City.

“There was a ton of work involved in making that video game version of the gaming world,” Gillberg relates, “and it had to really hold up when the camera pushes in past the real-world characters to the screen where we see versions of our Free City characters, before transitioning into Ryan and the other actors. It was a fine line for us to walk. While the game versions of their characters did have to be immediately recognizable and compelling, we didn’t want the video game versions of Ryan and the others to look too real, which meant maintaining some contrast between the video game versions and the characters themselves. So, we started that process by figuring out what the video game itself should look like. Did we want something looking more like Warcraft, or other games that are even more realistic?”

Levy’s input was often sought throughout look development. “Shawn wanted the game versions to be recognizable enough that the audience could empathize, even when we’re only seeing them on this gaming screen,” reports Gillberg. “I showed Shawn a range of video games, and based on that we settled on a look that was kind of a cinematic version of Grand Theft Auto.


“There was a ton of work involved in making that video game version of the gaming world, and it had to really hold up when the camera pushes in past the real-world characters to the screen where we see versions of our Free City characters, before transitioning into Ryan and the other actors. It was a fine line for us to walk. While the game versions of their characters did have to be immediately recognizable and compelling, we didn’t want the video game versions of Ryan and the others to look too real, which meant maintaining some contrast between the video game versions and the characters themselves.” —Swen Gillberg, Visual Effects Supervisor

TOP: Jodie Comer (Molotov Girl) and Ryan Reynolds in Free Guy. BOTTOM FOUR: Guy’s unvarying daily routine as a bank teller is disrupted when, after donning a special pair of glasses, the entirety of the 'Bank Robbery' gaming world set in Free City is revealed to him.

All of the movement of the characters would be at a full mocap level rather than canned animation. This look brought with it more definition in the emoting, talking faces. At first we were going to limit the talking to a simple hinged jaw, but that reduced emotional engagement. To keep a more compelling look, we enhanced the face rigs on the game. There isn’t much talking going on, so instead of head cams we did witness cams. The facial rigs were essentially keyframed, while the hand and body motion was motion-captured.”

In addition to handling all shots of the video game version of Free City, Digital Domain also took on some ‘actual’ Free City scenes. “VFX Supervisor Nikos Kalaitzidis oversaw all that,” notes Gillberg, “which included the opening oner plus a construction site. The real-world game programmers keep reprogramming the Free City environment to obstruct Guy’s passage, so he has to deal with obstacles like stairs moving around.”

Another impediment faced by Guy as he attempts to save the day was handled by Scanline VFX Supervisor Bryan Grill. “The villain decides to stop Guy’s orange car by squeezing the street in on both sides to crush the vehicle between buildings,” says Gillberg. Scanline was also responsible for all of the building destruction. “We went through a development stage to determine the right look for destruction in the video game world, which meant figuring out whether these collapsing buildings were just shells, or was there an armature? It took tons of artwork and iterations before finally winding up with something that has all the gravitas and real emotional impact of a genuine building collapse, but still kept a video game look to things. Pieces fall in a very naturalistic way, but then as they fall, they rotate around their own center of gravity and then kind of suck into themselves. That look for the debris, which also included a wire-frame kind of blue glow, kept it well away from 9/11 photorealism, but there was still enough weight to it that audiences would care when seeing this world being destructed.”


One early design concept that stuck postulated a ‘box’ outside of the gaming world. “This box was all blue skies and white clouds,” Gillberg notes, “so when the buildings blow up, you can see this blue sky behind them – even when the camera is up high looking down at the ground!”

“After we finished shooting, there were multiple huge pushes through the end of 2019 to produce really good-looking temps,” Gillberg says. “Shawn really wanted the test screenings to have decent visual effects in them, so we completed the screening VFX at a higher-[than-usual] level, but then would abandon those and go back to another approach for the real finals. DD was originally assigned to do previs for the construction site sequence, but at that time lacked bandwidth, so they rendered all of their previs through the Unreal game engine. This let us quickly evaluate the effects and how they impacted the story, so the screening process definitely led to a mix of lost and gained shots between the tests and when we completed our finals.”

Since post had not fully ramped up when COVID hit, there was a scarcity of finals as of February 2020. “That meant the bulk of the work had to get done during lockdown,” admits Gillberg. “But there were a couple of weeks when it looked like everybody would be going home, as there was a fair amount of pressure to shut the whole thing down. I lobbied heavily against that, feeling there was a good body of talented individuals already on board and that it would be a shame to lose them, so we didn’t ever completely stop. First off, it was a matter of coming up with different protocols in order to work from home. That involved many VPN tunnels into our server on the FOX lot, to establish encrypted links. We set up a screening room at Shawn’s house, which I could drive remotely. Being able to work straight through the heart of the lockdown pleased me greatly, as it let us all stay focused while still having a lot of fun, trying out lots of new ideas.”

While the live-action cinematography had relied upon a single real-world LUT and another pair for the virtual gaming world, the looks were refined significantly during digital grading. “We’d re-balance sequences while retaining the general tone,” says Gillberg. “In our final grade, I worked with George and colorist Skip Kimball, and we let the real world go toward more of a cool-blue feel while pushing the gaming world look to become even more colorful.”

Being free from the constraints of traditional reality for the Free City scenes proved to be a strange attraction for Gillberg and his collaborators. “We had this premise of being in a video game world where anything goes,” observes Gillberg, “and that was a ton of fun for us to play with. I’ll give you an example: we’d be doing reviews with Shawn and somebody would ask, ‘What does this scene need?’ And somebody else might say, ‘This scene needs a dinosaur!’ And the thing here is, we can add that without worrying about the character not reacting to it, because for this clueless bank teller, it is normal to see that crazy stuff happen constantly in this violent world. We got to put in all kinds of video game vehicles and characters from other real games for people to recognize.” That creative rush did have limits, however. “We pushed until I added a TIE fighter, at which point I got told, ‘Stop!’”

TOP: Taika Waititi (Antoine), Utkarsh Ambudkar (Mouser) and Joe Keery (Keys) in Free Guy. MIDDLE: Live action in the real world was differentiated visually by DP George Richmond, who shot those scenes in 4-perf anamorphic, while Free City sequences were captured with spherical lenses using the ARRI 65. To create more of a distinction between Free City and the world outside, Gillberg worked with Richmond and colorist Skip Kimball in the DI to create a more cool-blue monochromatic feel in the latter while pushing the gaming world in a more colorful direction. BOTTOM TWO: Scanline’s work in destroying various buildings had to strike a balance between real-world impact and a video-game feel, with objects falling at a naturalistic speed while tumbling and vanishing in a showier way. The faux city’s design reflected the idea of a blue-sky-and-white-clouds box surrounding all, so when Free City is damaged, the ‘sky’ shows through beneath, even in views of the ground.


VIRTUAL PRODUCTION

PERSPECTIVES: CHASING THE EVOLVING INDUSTRY NORM FOR VIRTUAL PRODUCTION By TREVOR HOGG

TOP: A virtual production system designed by Lux Machina Consulting simulates hyperspace during the making of Solo: A Star Wars Story. (Image courtesy of Lux Machina Consulting)

OPPOSITE TOP: Lux Machina Consulting partnered with Possible Productions and Riot Games to orchestrate a real-time XR experience for League of Legends Worlds 2020 esports tournament in Shanghai. (Image courtesy of Lux Machina Consulting)

What are the prospects for virtual production in the post-pandemic world when global travel and working in closer proximity to one another will become acceptable once again? Will it fade away or become an integral part of the filmmaking toolkit? There is no guarantee when it comes to predictions, especially when trying to envision what the technological landscape is going to look like five years from now. The only absolute is that what we are using today will become antiquated. In order to get a sense of what the industry norm will be, a variety of professionals involved with different aspects of virtual production share their unique perspectives.

Geoffrey Kater, Owner/Designer/Director, S4 Studios “My estimate is that LED walls have a good run for the next five years, but after that AR is going to take over. AI will paint light on people and things using light field technology and different types of tech. Virtual production is here to stay but it’s going to evolve with AI being a big part of that. We decided to go after car processing work and up the ante by using Unreal Engine, which gives you great control over the lighting, art direction and real-time rendering, to create our own system that would have the same amount of adjustability. We have the creative work which is building all of the different cityscapes and cars, software development, and working with a stage and LED company to build out what the volume would look like.


If you’re going to have a city, you have to figure out stuff like traffic and pedestrians walking around. What we ended up generating was a city that never sleeps. Typically, car driving footage is two minutes long. We figured out how the computer could build a road in front of us that we never see being built, but now you can drive infinitely on the day so directors and DPs don’t have to cut.”

James Knight, Virtual Production Director, AMD “Our CPUs are what everybody is using to power these LED walls, whether it be Ryzen or Threadripper, because of the setup time. I don’t think that anyone has totally figured out virtual production. It is interesting when you develop hardware because there are unintended use cases. You get college students right out of film school who aren’t jaded like the rest of us and don’t know that you shouldn’t do a certain thing, and in that beautiful messiness they end up discovering something that we didn’t realize was possible. There is still a huge element of discovery and that’s the beautiful nature of filmmaking. The media entertainment business is responsible for almost all innovation in CG that trickles to real-time medical imaging and architecture. Imagine a groundbreaking ceremony where you shovel in the dirt, but the media that has come can look through a virtual camera and see the buildings there in photorealistic CG in real-time.

If you make the CPU beefy and fast enough for the filmmaking business, it will be able to handle anything that fashion and real-time medical imaging will throw at it.”

Philip Galler, Co-CEO, Lux Machina Consulting “People will always want to tell big stories and in many cases big stories need big sets that require big spaces to put them in and light. I don’t think that the studio stage is going to go away. But what we’re going to see is that smaller studios will be able to find more affordable approaches to these technologies, whether it be pro versions of Vive tracking solutions that are in the $10,000 range, instead of $50,000 to $150,000 for most camera tracking, or more affordable and seamless OLED displays that people can buy at the consumer level to bring home. As more of these virtual production studios come online, there will be more competition in the market on price for the services and the price will drop at the low end of the market in terms of feature set and usability. One of the things that people overlook all the time is the human resources that are needed. It’s not just spending the money on the LED. It’s being able to afford the training and putting the right team together.”


“My estimate is that LED walls have a good run for the next five years, but after that AR is going to take over. AI will paint light on people and things using light field technology and different types of tech. Virtual production is here to stay but it’s going to evolve with AI being a big part of that.” —Geoffrey Kater, Owner/Designer/Director, S4 Studios

TOP: Lux Machina Consulting is a key member of the virtual production team behind The Mandalorian. (Image courtesy of Lux Machina Consulting) BOTTOM: The CPUs made by AMD, whether it be Ryzen or Threadripper, have become essential for powering the LED walls needed for virtual production. (Image courtesy of AMD)

Jeroen Hallaert, Vice President, Production Services, PRG “The design of an XR Stage is driven by what the content needs to be and where it’s going to end up. Sports broadcasting has been using camera tracking for 10 years, where you saw people standing on the pitch and players and scoreboards popping up behind them. Then you have the live-music industry that was using media servers and software like Notch to make sure that the content became interactive with movement and music onstage. Bringing those together in early 2018 made it possible for us to do the first steps of virtual production. Within the year, things are going to be shot within an LED volume and the camera will be looking at the LED as the final pixel image.

“Because PRG was active in corporate events, live music and Broadway, we have a lot of skills available, and it’s more of a matter of retraining people than starting from scratch. We now have people running these stages who were previously directing a Metallica show. We’ve got the video engineer from U2 sitting downstairs in the control room. Our gear is set up in a way that video engineers, whether they have been working in broadcasting or live music, can serve these stages as part of hybrid teams. In virtual production there are jobs and job titles that haven’t existed before, but that’s the glue between all of the different elements.”

Adam Valdez, Visual Effects Supervisor, MPC “I think of virtual production as simply bringing CG and physical production steps together.


We used to think of motion capture as one thing, making previs another, and on-set Simulcam yet another thing. Now it’s clearly a continuum, and the 3D content flows between all the steps. On the large-scale shows MPC works on, it usually means visualization for live-action filmmakers. Our mission is to take what we know about final-quality computer graphics, and apply it to early design and visualization steps via real-time technologies.

“I have been using virtual production extensively for the last year from my home office. I use our toolset to scout environments in VR, block out where action should happen, and finally shoot all the coverage – all in game engine and using an iPad. With regard to technology, it’s all about greasing the gears. The metaverse trend and standards from the technology companies that now hold the reins is a big piece. On the industry side, we need live-action filmmakers, showrunners and companies like Technicolor to keep a strong educational effort going so we all understand what’s realistic, possible and affordable. If those two ends keep working toward the middle, the virtual production ecosystem should thrive, and then we all win.”

Tim Webber, CCO, Framestore “We view the VP toolkit as an expansion of what we do as a company, not a standalone item. Everything is about finding solutions for clients and creatives, and the pandemic has basically ratcheted up the need for VP and changed the way we work with filmmakers. In just two short years we’ve seen Framestore Pre-production Services [FPS] establish itself across every facet of our creative offer, with clients from commercials through to feature films and into AAA gaming and theme park rides all wanting to discuss how it can benefit their projects.

“There’s a lot that we can do in terms of explaining to industry just what the techniques and tools are, and where they can best be applied. Virtual production extends far beyond just the LED volume and encompasses virtual scouting and virtual camera. Unreal Engine is used in previs and on-set visualisation, along with ICVFX. The beauty of this is being able to offer these as standalones, in combination or as a full suite of creative services that include final visual effects. It is something that we’re employing for our FUSE R&D project.”

Christian Kaestner, Visual Effects Supervisor, Framestore “Each project will be unique in its artistic requirements, which usually inform which virtual production elements are required and the wider technical setup for the project. Virtual production only makes sense if you can effectively offset or reduce some of your location or visual effects costs, and while it gives you more flexibility in some areas, it can limit you in others. Ultimately, only strategic planning and a clear idea of the required assets will help you reduce costs.

“With the advancements in real-time technology, machine learning, tracking systems and LED panel technology, soon we will be able to explore more and more creativity without boundaries. Our industry has been pushing these boundaries for decades in photorealistic rendering, and soon we will see those results in real-time.


TOP: The control room responsible for executing the XR experience at League of Legends Worlds 2020. (Image courtesy of Lux Machina Consulting) BOTTOM: From left: Jon Favreau, Director; Caleb Deschanel, Cinematographer; James Chinlund, Production Designer; Andy Jones, Animation Supervisor; and Robert Legato, Visual Effects Supervisor, have a virtual production session for The Lion King. (Image courtesy of Walt Disney Pictures)




Our industry has been pushing these boundaries in photorealistic rendering for decades, and soon we will see those results in real-time. Technology will become less of a limitation and more a vehicle to unleash new ways of creative storytelling. These are truly exciting times, and it is fantastic to see our industry evolving from ‘let's fix it in post’ to ‘let's go see the VP team’ to discuss a new creative solution. We are on a great path to becoming even stronger creative partners in the filmmaking industry.”

TOP: Framestore is providing previs, virtual production and visual effects for the Netflix series 1899, which is using a brand-new LED volume at Studio Babelsberg in Germany. (Image courtesy of Netflix) BOTTOM: The Light Box used by Framestore during the making of Gravity. (Image courtesy of Framestore)

Robert Legato, ASC, Visual Effects Supervisor
“Right now, people ascribe more magic to virtual production than there is. You still need to know what you're doing. What it really is... is prep. There is now an expression, ‘Fix it in prep.’ Yeah. That's homework. That's write a good script. Vet the locations. Pick the right actors. Pick the right costumes. When you show up on the shoot day it's the execution phase, because you have vetted all of these things. Working out an action sequence in previs means that I'm vetting it in prep. I'm seeing it work or not work, or stretching something I haven't seen before. I need to lay down the foundation editorially to see if that elicits a response that I want. George Lucas did it when he got dogfight footage from World War II and said, ‘Make it look like that.’ That was previs. Previs is fast iterative work.
“Because I had never shot a plane crash before, I had to practice at it so I didn't embarrass myself on my first Martin Scorsese movie. So, I used the same pan and tilt wheels, animated in MotionBuilder, shot it live because that's my way of working, and edited it all together. Before I spent money, I shot it 25 times, re-jigged the pieces, found an editorial flow that worked, and that is now the script. I went out and executed that script in seven days. That's the way to work, because when I'm done and reassemble it, as good as the previs was, it's going to be 50 times better because the foundation and vocabulary are already there.”



Johnson Thomasson, Real-Time Developer, The Third Floor
“Visualization can have multiple functions in a virtual production workflow. It can be used to plan the work that will be attempted, and it can be used as part of execution when actual filmed elements are captured. The Third Floor has been leveraging visualization in multiple modes – including previs, virtual scouting, motion control techvis, Vcam and on-set visualization – on quite a number of virtual productions to date. With LED wall production, the trend is toward more visual effects content being finaled during production. That means a huge shift in decision-making on design, blocking and camera coverage.
“Traditionally in visual effects, the turnaround between notes and the next iteration is measured in days if not weeks. Real-time is immediate. That opens the door to exploration and creativity. It empowers the key creatives to riff and allows good ideas to snowball. In terms of visualization, the increase in rendering fidelity allows those ideas to be captured in an accurate visual form. This often results in a cost savings because down the line there's less room for misinterpretation. Virtual scouting in VR also has significant cost implications because it can replace costly trips to real-world locations for large crews, or physical set builds. The keys can walk through the virtual set prior to construction, which, again, gives a chance to make changes that would be much more costly later.”

Nic Hatch, Co-Founder & CEO, Ncam
“There are two sides to Ncam: real-time visual realization, and data collection and reuse. The real-time side is not our technology; it's akin to existing platforms such as Unreal Engine. It allows our end users and the visual effects vendors to create better-looking images in real-time, which it has to be if you want to finish in camera. The data side is hugely important, and I feel that's going to be a game-changer. At the moment, data collection on set is minimal. To some extent machine learning will help. It's not going to be one technology on its own. It's going to be everything together. Ncam's technology is based on computer vision with a bit of deep learning.
“If you look at the quality of real-time gaming over the past five to 10 years, it has advanced in leaps and bounds in terms of high fidelity and realism – that's only going to get better.

TOP TO BOTTOM: Framestore was responsible for producing the cartoon characters that inhabit the real locations captured in Tom and Jerry. (Image courtesy of Warner Bros. Pictures) NVIZ provided visualization, virtual production and visual effects for the Netflix series The Irregulars. (Image courtesy of NVIZ) NVIZ enabled the production for The Irregulars to have live remote sessions to create the previs. (Image courtesy of NVIZ)




“The more that we can do in real-time, the more visual effects we can do through the lens. There will be all kinds of technology coming out over the next few years that will help us to visualize things better. From reading and calculating light in real-time, and the depth analysis that we require, to deep compositing in real-time – all of this will be coming, and this is just the start. I've never seen the industry embrace it as much as they have now. Ultimately, there was always change coming.”

Hugh Macdonald, Chief Technology Innovation Officer, NVIZ
“Up until a couple of years ago all of our previs was done in Maya. We wanted to push toward doing it in Unreal Engine for various reasons, including being able to do virtual camera sessions as part of previs, but also for on-set Simulcam and more virtual production setups. It involved bringing a lot of new technologies into this world, and Epic Games has been fantastic. An interesting question to come out of this is, ‘Should previs be the department creating the assets for on set?’ A lot of productions these days have virtual art departments. If I could snap my fingers and determine how it would be, I'd have previs focus on designing the shot rather than generating full-quality assets, get the visual effects companies involved in building those assets from pre-production, and allow them to take things to final quality, which is what they're good at. Twenty or 30 years ago, visual effects was a bit of a luxury for a few big films, and now it is in every film. We'll see the same with virtual production.”

TOP TO BOTTOM: The Third Floor contributed a range of previs and virtual production for the Netflix movie Jingle Jangle: A Christmas Story. (Image courtesy of The Third Floor) The production team responsible for the second season of The Mandalorian had the ability to preview CG assets in context with real locations and sets on mobile devices by using The Third Floor’s AR app Cyclops. (Image courtesy of The Third Floor) From top: The Mandalorian (Pedro Pascal), Bo-Katan Kryze (Katee Sackhoff), Axe Woves (Simon Kassianides), Koska Reeves (Mercedes Varnado) and The Child sit atop the Razor Crest set piece inside Industrial Light & Magic’s StageCraft virtual production volume. (Image courtesy of 2020 Lucasfilm Ltd.)

Kris Wright, CEO, NVIZ
“What is exciting about virtual production is being able to turn up on the day [you shoot], because being camera-ready has made visual effects part of the filmmaking process. Often Janek Lender [Head of Visualization] and Hugh Macdonald are not just working in a visual effects capacity, but with stunts, production designers and DPs. It is great to see how this tool is able to work across many disciplines and act as a hub. Virtual production is making it a lot more collaborative. On Solo: A Star Wars Story we wrote a rendering tool which enabled everything that they shot with a Simulcam to go in as a low-footprint postvis, and in certain cases that stayed in their cut for quite a long time. This has become a downflow from virtual production into post and editorial.
“What is exciting is this idea that you can have tools that help to accelerate the process, whether that is final pixel, which is the Holy Grail, or building quick edits from dailies captured through a Simulcam process. If we can keep in step with how filmmaking works and not make it feel like we're reinventing the wheel, and keep things low footprint but accessible, that's where it's successful. The big advancement has been the real-time engine, but in a lot of ways we're still using the same methodologies for filmmaking.”





VIRTUAL PRODUCTION

LED WALLS AND STAGES ON THE RISE IN THE U.S. AND EUROPE By CHRIS McGOWAN

With the growth of virtual production, there has been an increased demand for LED walls and stages. Production houses and VFX studios in the U.S. and Europe have built dedicated LED walls, stages and workflows, with more on the way. Some offer bespoke LED wall setups. Here is a sampling that highlights studios with LED walls and stages in a fixed location and those firms that set up bespoke/pop-up LED Volumes. This is not intended to be a complete listing.

FIXED-LOCATION LED WALLS

TOP: The Child and the Mandalorian (Pedro Pascal) in the cockpit of the Razor Crest inside Industrial Light & Magic’s StageCraft virtual production Volume during shooting of Season 2 of The Mandalorian. (Image courtesy of Lucasfilm Ltd.) OPPOSITE TOP: An image from a promotional video for Resort World in Las Vegas, on the NantStudios LED wall. (Image courtesy of Resort World and NantStudios)

Studio Lab Fixed Location: Derry, New Hampshire. Size of LED stage area: Approximately 2,000 sq. ft. Size of LED wall: 14 ft. high x 52 ft. long. Software driving the stage: Unreal Engine. First operational date: April 2020. Bespoke/pop-up walls availability: “Yes. In addition to consulting, installing and integrating Volumes, we are often asked to bring LED Volumes directly to a client’s location,” according to Benjamin Davis, Studio Lab Director. Comments: “While we are software agnostic, we primarily use Unreal in conjunction with the Disguise platform.”



NantStudios Fixed Location: El Segundo, California. Size of LED stage area: 24,000 sq. ft. Size of LED wall: 145 ft. long x 20 ft. high (LED volume is 52 ft. wide by 65 ft. long in a 270-degree oval). Software driving the stage: Unreal Engine. First operational date: February 2021. Bespoke/pop-up walls availability: “We are building Volumes as needed!” says Keaton Heinrichs, Director of Business Development/Operations. Comments: Plans to launch two more fixed-location LED Volumes soon: one in Q4 2021 and one in Q2 2022, according to Gary Marshall, Director of Virtual Production. The NantStudios campus in El Segundo also has an Epic “Innovation Lab” LED stage for demos, education and R&D.

TRICK 3D Fixed Location: Atlanta, Georgia. Size of LED stage area: (35 ft. x 45 ft.) 1,575 sq. ft. Size of LED wall: 13.12 ft. high x 42.71 ft. long. (The wall is on a curve, so the size measurement is on a straight line.) Software driving the stage: Unreal Engine. First operational date: April 7, 2021. Bespoke/pop-up walls availability: “Yes, our location is fixed but we can also move the setup to pop-up as needed,” according to Stacy Shade, TRICK 3D Head of Studio and Executive Producer.

Comments: “The wall is 26 LED panels across the curve,” adds Shade. TRICK 3D and Music Matters Productions joined forces to launch the LED-wall-powered XR stage located in Northeast Atlanta.

Pixomondo Fixed Location: Toronto, Canada. Size of LED stage area: 24,030 sq. ft. Size of LED wall: 208 ft. long x 24 ft. high (LED volume internal stage: 72 ft. in diameter, 90 ft. in length, 24 ft. high). Software driving the stage: Unreal Engine. First operational date: January 15, 2021. Bespoke/pop-up walls availability: “While we have done multiple pop-up set extensions, we try to focus on larger, more permanent installations as that is more cost-effective for our clients,” says Josh Kerekes, Head of Virtual Production. Major wall productions: Star Trek: Discovery Season 4 and the franchise's newest series, Star Trek: Strange New Worlds, Season 1. Comments: One more LED stage is coming in Toronto and one in Vancouver.

80six Fixed Location: Slough, U.K. Size of LED stage area: 3,500 sq. ft. Size of LED wall: 18m x 4.5m high (approximately 59 ft. x 15 ft.).




Software driving the stage: Unreal Engine. First operational date: March 2021. Bespoke/pop-up walls availability: “Yes, we can advise, design and build custom-made LED Volumes at the client’s location. We pride ourselves on our flexibility and can adapt our current volume to fit the unique requirements of incoming film productions. Our extensive ROE Visual LED inventory, one of the largest in Europe, permits us to build LED walls at scale as per clients’ requirements,” comments Aura Popa, 80six Marketing Manager. Comments: “The Virtual Production Stage at our studios in Slough features a ready-to-shoot and pre-calibrated curved LED Volume, specifically designed for film and TV drama,” says Popa. “Over the past couple of years, we’ve worked alongside industry leaders including Epic Games, DNEG, ROE Visual, Brompton Technology and ARRI on various R&D tests for camera and LED walls. The recent considerable interest globally in real-time live-action filmmaking with LED walls and remote workflows increased the demand for a specialized studio space with an adaptable pre-configured LED Stage. Set in an area of 3,500 sq ft, the Virtual Production Stage features a large format (18m x 4.5m) curved Mandalorian-style LED volume. The studio’s technical infrastructure supports HDR video by combining some of the most advanced VP technologies such as Unreal Engine, Disguise Media servers, Mo-Sys camera tracking, ROE Visual LED panels, and Brompton Technology LED processing, to name a few.” Projects: DNEG, Wilder Films and Dimension Studios chose 80six’s studios to shoot the short film Fireworks, according to Popa. Fireworks was directed by DNEG Creative Director Paul Franklin, and also supported by Epic Games, Lipsync Post Production and StoryFutures Academy.

TOP TO BOTTOM: Pixomondo’s name wraps around the LED wall in Toronto that became operational in January 2021 and where two Star Trek series have utilized the stage. (Image courtesy of Pixomondo) The Pixomondo LED stage in Toronto, now fully operational, during construction. (Image courtesy of Pixomondo) A ceiling is readied in front of a spectacular forested backdrop on the Vū Studio LED stage. (Image courtesy of Diamond View Studios) Dramatic scenery on the LED wall and ceiling at the Vū virtual production studio in Tampa. (Image courtesy of Diamond View Studios)

ILM Fixed Locations: two in Manhattan Beach, California, and one in London, England. Size of LED stage area: Manhattan Beach Studios, Stage 1: 25,000 sq. ft.; Manhattan Beach Studios, Stage 2: 18,000 sq. ft.; Pinewood Studios: 24,742 sq. ft. Size of LED wall: Manhattan Beach Studios, Stage 1: StageCraft LED volume is 75 ft. wide x 90 ft. long x 23 ft. tall, 270-degree oval with LED doors. Manhattan Beach Studios, Stage 2: StageCraft LED volume is 75 ft. wide x 80 ft. long x 23 ft. tall, 270-degree oval with LED doors. Pinewood Studios: StageCraft LED volume is 75 ft. wide x 90 ft. long x 23 ft. tall, 270-degree oval with LED doors. Software driving the stage: StageCraft 2.0 runs ILM's Helios. First operational dates: Manhattan Beach Studios, Stage 1: October 2019; Manhattan Beach Studios, Stage 2: January 2021; Pinewood Studios: February 2021. Bespoke/pop-up walls availability: “Yes. ILM continues to offer bespoke custom Volumes for projects and has done so on projects such as The Midnight Sky (2020), Thor: Love and Thunder (2022) and The Batman (2022),” says Chris Bannister, Executive Producer, Virtual Production – ILM StageCraft. Comments: “StageCraft 2.0 running ILM's Helios is our primary engine, offering our most dynamic set of production-tested tools. We also support Unreal Engine and can ingest assets from a wide variety of 3D content creation tools based on production needs,” says Bannister.



“Helios is Industrial Light & Magic's first cinematic render engine designed for real-time visual effects. It was engineered from the ground up with film and television production in mind. Helios was designed from the start to work seamlessly with ILM StageCraft and is compatible with all the major content creation tools, including Unreal Engine.” History: “Along with The Mandalorian, ILM utilized LED screens for accurate real-time lighting, reflections and backgrounds on films such as Rogue One: A Star Wars Story and Solo: A Star Wars Story,” continues Bannister. ILM StageCraft Volumes are currently in use on The Book of Boba Fett, Obi-Wan Kenobi and Ant-Man and the Wasp: Quantumania. Coming: “We are evaluating continued expansion in a number of key creative production centers including Vancouver, Atlanta, Sydney and London,” says Bannister.

Vū Studio (Diamond View Studios subsidiary) Fixed Location: Tampa, Florida. Size of LED stage area: 20,000 sq. ft. Size of LED wall: 100 ft. wide x 18 ft. high. Software driving the stage: Unreal Engine. First operational date: November 2020. Bespoke/pop-up walls availability: “Yes, back in April we announced turnkey LED Volume solutions for companies/studios interested in creating their own LED Volumes,” comments Jonathan Davila, President & Co-founder of Vū. Comments: “Vū is now one of the world's largest LED Volumes, but also a provider of proprietary Vū LED Panels for other studios looking to build their own LED Volumes. We produce our own panels called Vū Panels. We also engineered the first-ever in-panel tracking system in partnership with Mo-Sys, making the Vū ceiling panels the first of their kind,” says Davila.

Dark Bay Fixed Location: Potsdam-Babelsberg, Brandenburg, Germany. Size of LED stage area: 17,890 sq. ft. Size of LED wall: 180.5 ft. long x 23 ft. high (or: 75.5 ft. volume diameter x 23 ft. volume height; 4,306 sq. ft. total shooting area). LED wall: 1,470 ROE Ruby panels. LED ceiling and ARRI SkyPanels: 4,090 sq. ft. Software driving the stage: Unreal Engine. First operational date: May 3, 2021. Bespoke/pop-up walls availability: “If productions request a pop-up volume, we're ready to create and operate it,” says Türkiz Talay, Dark Bay Public Relations and Marketing Director. Projects: The Netflix series 1899.

TOP TO BOTTOM: The TRICK 3D LED stage in Atlanta, Georgia has had TV and music video productions filmed there since it launched in April 2021 in collaboration with Music Matters Productions. (Image courtesy of TRICK 3D) Semi-permanent pop-up-style open cube LED Volume built for testing. (Image courtesy of 80six) Studio Lab and TEDx Cambridge teams preparing for an XR shoot in a virtual museum space designed by Neoscape. (Image courtesy of Studio Lab)




MELS Studios and Postproduction Fixed Location: Montreal, Canada. Size of LED stage area: 15,000 sq. ft. Size of LED wall: 17 ft. high x 35 ft. long, with a ceiling. Software driving the stage: Unreal Engine. First operational date: October 2020. Bespoke/pop-up walls availability: Yes. Projects: The Moodys, Arlette.

BESPOKE/CUSTOM VOLUMES

Weta Digital Bespoke/pop-up walls availability: Yes. “Rather than maintaining a fixed Volume, Weta Digital has partnerships with companies like Streamliner Productions and Avalon Studios here in Wellington, New Zealand. This allows us to tailor solutions that are the right fit for productions shooting here. The technology has many useful applications, and not every project needs a fully immersive Volume. We’ve also been working with technology providers in other production centers around the world to produce content for Volume stages,” says Dave Gougé, Weta Digital Head of Marketing & Publicity. “LED panels can be strung together to create any shape or form needed for production. Planning is important to build the wall necessary for a film. Having a fully dedicated virtual production stage can be a fantastic tool and can be utilized for many projects. Other projects will require customization and will need to be built on location. Often times, a large LED wall isn’t necessary, and it is more about placement and fitting the LED into a set in smart and methodical ways. This means working closely with production designers, directors of photography and set builders to plan and execute,” says Gougé. Projects: Avatar sequels, KIMI, Black Adam.

TOP TO BOTTOM: NantStudios has a fixed-location LED wall and Volume in El Segundo, near Los Angeles. Epic also has an LED demo stage on the same campus. (Image courtesy of Resort World and NantStudios) MELS Studios and Postproduction opened its LED stage in Montreal in October 2020. (Image courtesy of MELS Studios) A view of the American Southwest comes to life on the LED wall of Vū Studio in Tampa, Florida. (Image courtesy of Diamond View Studios) OPPOSITE TOP: Vū’s LED wall has 16.4 million pixels and 8K resolution, according to the firm. (Image courtesy of Diamond View Studios)

Framestore Bespoke/pop-up walls availability: Yes. Tim Webber, Framestore Chief Creative Officer, says, “One of our core aims as an industry is to make the impossible possible, and LED technology is another tool by which we can achieve this. It has clear standalone benefits – allowing cinematographers, directors and production designers to see their ideas evolving in real-time; assisting with perennially tricky issues like matching lighting; helping cast to better visualize the CG world around them; and allowing directors and camera crew to see ‘non-existent’ backgrounds through the camera. But perhaps its biggest benefit is when it is part of a wider creative-workflow shift that begins with concepting and previs and extends all the way through to final VFX and post, all enabled by real-time technology. Virtual scouting, ICVFX, postvis and the rapidly-developing application of engine technology are all helping streamline, hone and speed up the filmmaking process, and ultimately offer limitless storytelling opportunities to directors and showrunners.” Adds Webber, “Framestore is adopting an agile, flexible approach to the way we work with LED. There’s understandable appetite and excitement from clients working across film, episodic,



advertising and immersive, but we've never been a company to adopt a one-size-fits-all approach to creativity, and we'd never shoehorn clients into solutions that aren't tailor-made for them. At present we have a small (32.75 ft. x 11.5 ft.) LED setup in our LA studio, which we mainly use for tests, demos and small shoots. At the opposite end of the spectrum, we're used to working with third-party suppliers for feature film work, and for the upcoming Netflix title 1899 we are helping to design and set up Europe's largest LED stage, Dark Bay.”

Lux Machina Bespoke/pop-up walls availability: Yes. “Lux Machina doesn't own any stages presently, but we do design, install and operate stages on behalf of studio and media groups,” says Lauren Paul, Lux Machina Director of Business Development. “We worked with NantStudios to design and install their stage and will supply supplemental production support when required. We will be installing four other stages this year, with more to come in 2022, and have consulted on at least half a dozen other permanent stages. Lux Machina has had a hand in developing and improving LED, rendering and camera technology product lines to support their use in virtual production.” On Lux Machina's joint venture with U.K.-based Bild Studios: “Our partnership with Bild Studios is intended to support our upcoming projects in both the U.S. and the U.K. Having just recently opened our office in the U.K., and with many of our projects taking place in the U.K., we felt it's important to work with Bild Studios so that our clients will have access to the world's best talent and most cutting-edge technologies and workflows,” says Paul. “We use a variety of tools depending on the goals of production, but our workflows and proprietary software tools predominantly center around Unreal Engine and Disguise.” Coming: “We will be deploying fixed stages in Los Angeles, Atlanta, Seoul and the U.K. Sizes of the stages range from 40 ft. [to] 60 ft. and 90 ft. in diameter,” says Paul. The firm expects to increase that number in 2022 and beyond.

Dimension Studio Bespoke/pop-up walls availability: Yes. Naomi Roberts, Dimension Studio Head of Marketing, comments, “At Dimension, we set up a new bespoke custom build for each virtual production. Our live projects on feature films are under wraps at the moment, but we wrapped on the Fireworks short film with DNEG earlier this year. Fireworks began in January 2021 with an LED stage 18m long x 4.5m high with a 2.5-degree curve. In terms of software, scenes are running in Unreal Engine. The Unreal scenes run through Brompton servers and nDisplay to the LED wall. We used EZTrack for camera tracking.”

DNEG Bespoke/pop-up walls availability: Yes. (DNEG sets up LED walls bespoke, as required.) Paul Franklin, Creative Director at DNEG, comments, “LED technologies have made an enormous impact on how we make films. For First Man, we used a large LED backing to create the spectacular views of the upper atmosphere and space seen by the U.S. astronauts in their quest to be the first to land on the Moon. The brilliance and resolution of the screens gave us ‘final pixel’ results, often without the need for additional post-production. A few years later we took LED techniques to the next stage for Fireworks, creating real-time interactive content that tracked the movement of the camera to produce a full 3D effect completely in-camera. I have no doubt that LED Volumes will become the standard way of doing shots that previously would have required months of post-production to complete – it's a game-changer.” Fireworks began in January 2021 in collaboration with DNEG and 80six in Slough.

DEMO STAGES

Epic Games Fixed Location: London, U.K. (Innovation Lab, a demo space). Size of LED stage area: 6m x 8m (approximately 20 ft. x 26 ft.). Size of LED wall: 9m x 4m (approximately 29.5 ft. x 13 ft. high) curved wall, with a 6m x 4m (approximately 20 ft. x 13 ft.) ceiling. Software driving the stage: Unreal Engine. First operational date: September 2020 (of LED volume). Bespoke/pop-up walls availability: Not at this time. History: Based in the Soho district of central London, Epic's Innovation Lab is situated in the epicenter of London's VFX and production district and serves as a hub for the creative community. The space is designed to serve primarily as a demo and testing environment for the industry, rather than an active production studio. However, Epic used the Lab to film the host sessions for Epic's “Unreal Build: Automotive” virtual event, which attracted participants from around the world. “Additionally, Epic MegaGrant recipient Imagination recently shot a test film at the Lab with excellent results,” says Alistair Thompson, Director, Epic Games London Innovation Lab. Comment: The NantStudios campus in El Segundo also has an Epic “Innovation Lab” LED stage for demos, education and R&D. Coming: Plans for demo stages in other locations.



PROFILE

ENGINEERING MEMORABLE ACTION SCENES WITH SFX SUPERVISOR HAYLEY WILLIAMS By TREVOR HOGG

Images courtesy of Hayley Williams, except where noted. TOP: Hayley Williams OPPOSITE TOP: A fully working mechanical clock was constructed on D stage at Pinewood Studios for Hugo. (Image courtesy of Paramount Pictures) OPPOSITE BOTTOM LEFT: Hayley Williams with her father and mentor, Joss Williams. OPPOSITE BOTTOM RIGHT: Williams takes part in making Dark Shadows for frequent collaborator Tim Burton.

Movie and television productions are a way of life for Hayley J. Williams, who used to work with her Oscar-winning father, Joss Williams (Hugo), before establishing her own special effects company. “One of the biggest recollections I have is going to visit the set of Judge Dredd in 1994,” states Williams. “Dad would show us his tests, things exploding, and the famous ABC robot he built standing in the corner of the workshop, which was a huge deal when you're a child. When I was a teenager, I got some work experience with him on Sleepy Hollow and Mortal Kombat. That's when I knew I would follow in his footsteps one day.”

Born in Taplow, England, Williams spent her adolescent and teenage years in Worcestershire and Maidenhead, where her separated parents lived. “I grew up in the countryside where I enjoyed horse riding and being outdoors a lot of the time.” Family businesses are the norm in the special effects industry. “People grow up around it and come along with their parents and learn the trade,” observes Williams. “There's an artistic mindset in my family. We've been in the film industry for many years. My father, my uncle and my father's uncle were all successful in their own right before me, so I naturally gravitated towards it.”

As a child, Williams was intrigued by how bridges and buildings were constructed and remained standing. “I was fascinated by how they were designed and who designed them, so I took an interest in civil engineering and studied the science behind design and construction – knowledge that is still very useful to me in special effects today.” The high school student decided to pursue a career in the film industry, but her father had other plans. “My dad insisted I come into it with a skillset first. He said, ‘You're not just leaving school and getting a job. I want you to go to college first.’ I chose to do mechanical engineering at Worcestershire College of Technology. Once I finished my full-time course, the offer was there for a job with my dad but, at that time, the college also offered me the chance to further my education with an apprenticeship. I took the apprenticeship with an investment foundry called Doncasters Precision Castings, who make airplane engine and gas turbine blades. I became a project engineer with them. After five years I felt like I knew the process inside out and wanted to get started in the film industry.

“Even though there are explosions, fire, rain and other atmospherics in special effects, there is also a great demand for mechanical engineering,” remarks Williams. “The experience I'd gained in my education gave me a good foundation to train and build my way up to a special effects technician.” A lot was learned from her father. “My dad taught me everything he knew and saw the potential in me. He didn't treat me differently just because I was his daughter. He pushed me hard and taught me well, with discipline and respect for special effects. A huge thing I learned from my dad was how important testing and safety were but, most importantly, how to stand your ground when it comes to that.” Another mentor was Oscar-winner Neil Corbould, VES (Gravity). “Neil was good to me and helped me along my journey the same way my dad did. They're both huge special effects supervisors, and I learned from the best. It's wonderful to have their support and input.”



Filmmaker Tim Burton has played a big role in the careers of father and daughter. “When Dad retired, I supervised for Tim because he knew me and my team,” states Williams. “Tim views his film crew as family and loves being surrounded by people he knows and trusts. The first film I worked on was Charlie and the Chocolate Factory. I've gotten to know how he works, what he likes and dislikes. There are certain questions that I don't need to ask because I already know the answer.” Special effects not only require technical expertise but artistry, too. “So much of what we do isn't just about how we are going to achieve it, but also about the final look.

We always test constantly before showing the director different variations, because what they're looking for and what's in my head may be slightly different. Testing allows us to get on the same page as the director to help create their original vision.” The belief that digital effects would completely replace practical elements has not proven to be true. “There was a point where visual effects felt threatening, and some special effects technicians believed it could potentially end an era for us all,” remarks Williams. “But now, having been through that period, we realized there will always be a harmony between practical and digital effects.




TOP TO BOTTOM: One of the most rewarding aspects of working on Annihilation for Williams was being able to be part of the creative decisions early on in pre-production. (Image courtesy of Paramount Pictures) Williams served as Special Effects Supervisor on Maleficent: Mistress of Evil. (Image courtesy of Walt Disney Pictures) Williams assists her father, Joss Williams, in producing the special effects for Dark Shadows.

In fact, I think it's great now that the dust has settled, as we've all realized both areas are very much needed. We all have a role to play, working in a symbiotic way, taking our live-action work and crossing over into digital. I've worked closely with visual effects supervisors over the last five years, and together we have come up with some great effects. You provide as much as you can safely and cost-effectively on set, and they can enhance it. You can work together and produce some beautiful-looking effects.” Advances in technology have made certain aspects of special effects faster. “Water cutting, laser cutting and 3D printing are essential to us meeting deadlines these days, with time-scales and schedules getting busier with more demand,” notes Williams. “For example, where once we would be in the workshop cutting and building something for ourselves, now we have a full CAD team who will design our rigs with stress-testing factored in on the computer and then send those files to fabricators, with the parts back to us within days, ready for assembly.” The Pacific was heaven for the special effects team, where Williams served as an administrator and technician for her father. “We went to Australia for 12 months to film HBO's The Pacific,” recalls Williams. “We were so busy on that project, prepping and shooting countless bullet hits, large explosions, and constructing breakaway buildings that we then destroyed. It was early on in my career, and I was on set for most of the production. It was hard work, but we had a brilliant time, and I was proud to be a part of it as special effects then went on to win two Emmys as a result.” Hugo was a good lesson on the importance of atmospherics. “Smoke haze, steam vents and water drip rigs were all added to help create the final look of the old train station,” remarks Williams.



“As well as the atmospherics, we had to construct a huge, fully working mechanical clock on D stage at Pinewood Studios. It was also extremely important to get the safety measures right with the children moving all around it. We made a number of large cogs with soft foam teeth. There were so many moving pieces to that rig, and it looked amazing. We went over by three months, which is unusual, but it was worth the extra input to get the final look, as special effects won an Oscar for it.” Sci-fi thriller Criminal by Ariel Vromen was Williams' debut as a special effects supervisor. “I had been assistant supervising for a while, and Neil Corbould gave me a break. We did lots of car explosions, bullet hits and some great flipping rigs, and I began to build my team up and get my reputation as a supervisor.” Projects have ranged from small scale to blockbusters. “My approach is the same in terms of getting the best effect for the budget that we have. I want to find out what the director wants to see, do some tests and get some feedback. We had some lovely gags to do on Holmes & Watson, but it was on a completely different scale to Maleficent: Mistress of Evil, where we were blowing up big castle wall sections and made huge air rigs to replicate downdraft on the flying Fey that were later put in as CG.

“Annihilation was a strange one for me as it wasn't just a case of what the director wanted and us going out to make it happen; this time I was more involved in the decision-making of the storyline from a special effects point of view,” states Williams. “I met with director Alex Garland and the team early on, and we sat in a room in London talking through the script. I had good input into items that involved special effects. It wasn't a huge effects film, but it was big enough.

TOP TO BOTTOM: Williams enjoyed how collaborative filmmaker Alex Garland was when making Annihilation. (Image courtesy of Paramount Pictures) Williams as a special effects technician setting up fires for Dark Shadows. Williams with special effects technician Max Brown on the set of Charlie and the Chocolate Factory.





TOP TO BOTTOM: The special effects were designed to support the comedic gags in Holmes & Watson. (Image courtesy of Columbia Pictures) Along with working on blockbusters, Williams has also been involved with productions such as Me Before You. (Image courtesy of Warner Bros. Pictures) A personal highlight for Williams was being able to follow in the footsteps of her father and be a special effects supervisor for Tim Burton on Dumbo. (Image courtesy of Walt Disney Pictures) Williams appreciates having a good professional team around her that can get the results needed for productions such as The Old Guard. (Image courtesy of Netflix)

We went out to some beautiful locations, got some great shots, and Alex loved it all. It's always rewarding when you get that feedback from the director.” Williams was part of the Netflix production of The Old Guard, which had several female heads of department as well as director Gina Prince-Bythewood. There is still a long way to go in raising the presence of women in special effects, says Williams. “I have a few female special effects technicians on my crew, and I always encourage them. It still feels like a male-dominated department, but I hope that my success does help people to see that it is possible. I always try to accommodate young girls who want to start a career in special effects and bring them into our department to learn and forward their careers.” Charlize Theron is a force to be reckoned with as an actress and producer, states Williams. “She is awesome! We did a lot of gags on The Old Guard. The nice thing about that job was that we worked quite closely with the actors, getting them amongst a lot of the practical effects. Safety being a priority, we were confident after enough rehearsing to have bullet hits flying around them and doors blowing in near them. It's always nice to have the actual cast involved with some of those shots rather than it always being a separate pass with doubles. It looks better and is more convincing in telling the story.” What is the secret to a long career in the movie industry? “It comes down to a few things,” observes Williams. “I have good working relationships with different producers and production teams. When they're happy with the job that you do and the results show on screen, then you get called back. Another secret is having a good professional team around me that gets the results. When those things come together and you've done a good job, it's a nice feeling when the phone rings for your next potential project.” Williams enjoys the escapism of watching a movie. “I don't like it to be too serious. I like the action and laughs. There is enough going on in life, so it's nice to sit and watch a movie and escape from the world. For me, that kind of feeling comes from the Indiana Jones films such as Raiders of the Lost Ark. A passion for action movies carries on when you become a special effects supervisor!”





VIRTUAL PRODUCTION

VIDEO GAME TECH AND VIRTUAL PRODUCTION: FOUR VIEWS IN REAL-TIME By TREVOR HOGG

TOP: Epic Games aims to democratize the making of content for games with the release of programs such as MetaHuman Creator. (Image courtesy of Epic Games) OPPOSITE TOP AND BOTTOM: Making photorealistic digital humans is hard and expensive to do as a bespoke pursuit, so Epic Games has attempted to make the process easier by releasing MetaHuman Creator. (Images courtesy of Epic Games)

Virtual production would not exist without real-time technology, which the video game industry has been funneling R&D dollars into for years. That in turn has benefited film and television productions. At the forefront of the innovation is Epic Games, but others have been making their own contributions to further the education of up-and-coming digital artists, as well as authoring computer software that will make work in real-time more photorealistic and efficient. Former Senior VFX Artist at Riot Games, Jason Keyser, established the online learning platform VFX Apprentice, which is focused on 2D game effects. Klemen Lozar formed Tuatara Games, which is one of the newest companies specializing in effects for real-time. Nick Seavert founded JangaFX, which has a reputation for creating real-time volumetric authoring software, and Kim Libreri is responsible for making sure that Epic Games remains an industry leader. Here they share their views on today's connections between video games, real-time technology and virtual production for film and TV.

Jason Keyser, Founder, VFX Apprentice
“In all my time working as a 2D FX and 3D real-time VFX artist, I noticed one continuing theme: the intense studio-driven demand for new FX talent far outpaces the supply of students coming out of school. After years of sitting through candidate sourcing and review meetings, I realized the trend wasn't going to reverse itself. I decided to forge out and found VFX Apprentice to meet the need.
“The fact that real-time VFX embodies such a broad range of artistic skills is appealing to many of us, giving us flexibility to be generalists always trying out new things. Others tend to specialize in their niche, going super deep on a narrow range of techniques and styles, which is also a viable career path. It's been a journey to find the common thread that binds it all together. In the end, I go back to classic art training: start with the foundational principles, then build from there. I'm not naturally a good teacher. I've just done it for so long and made so many mistakes along the way that I've learned how to teach!



“The line between real-time quality and simulation quality is frequently getting blurred. That's exciting because real-time offers a highly art-directable pipeline where changes can be made and evaluated instantaneously. It's stunning what can be done in the real-time space. Video games have been a bastion for 2D animation in the West while the trend was moving to 3D. Now we see aesthetics of 2D games migrating back into film.

A prime example of this is Spider-Man: Into the Spider-Verse, where one of our instructors, Alex Redfish, took his experience as a video game 2D FX animator and applied it to those gorgeous explosion effects you see in the film.”

Klemen Lozar, Founder, Tuatara Games
“I missed the more focused experience of working on a smaller team and feeling a stronger sense of ownership of my work. I also always valued creative autonomy.





TOP TO BOTTOM: Nick Seavert has kept in mind his experiences as a visual effects artist when developing tools for JangaFX, including real-time volumetric authoring software for smoke, explosions and clouds. EmberGen is a real-time volumetric authoring program created by JangaFX for fire, smoke and explosions. (Images courtesy of JangaFX) OPPOSITE TOP TO BOTTOM: Tuatara Games is one of the newest companies specializing in effects for real-time. (Image courtesy of Tuatara Games) Klemen Lozar spent three years developing his own video game called Let Them Come. (Image courtesy of Tuatara Games) Tuatara Games identifies a specific need or a problem area and then figures out the technology needed to provide the solution. (Image courtesy of Tuatara Games)

Founding and working at Tuatara has allowed me to satisfy both of those needs while still being at the forefront of game development. Working at large studios teaches discipline and organization, and exposes you to a wide network of like-minded peers and more seasoned developers. Another great learning experience was developing my own game called Let Them Come. This was roughly three years of very focused work in my spare time. It offered a lot of lessons in self-management and perseverance.
“Real-time software can be very demanding, especially graphically. I would say in this area the hardware is definitely not keeping pace because the ceiling is just so high. It's of course great to see steady progress and all the new affordances technological advancements can bring. There is also something to be said about having technical limitations. I personally love the creative problem solving that happens when developers need to squeeze every bit of ‘juice’ out of their target hardware.
“Whenever possible we try to work backwards [when developing software] – basically, first identifying a specific need or a problem area and then figuring out what sort of technology we might need to solve that. Sometimes R&D is also exploratory and an end in itself. Tuatara is currently hard at work producing our first video game as a team. The main motivation behind creating a new game is perhaps simply the fact that everyone on the team has a passion for games and game development in general.
“The adoption of various real-time technologies in other entertainment mediums is only increasing. A lot of it comes in the form of great workflow improvements, like making it much easier for directors and even actors to have a better understanding of what the finished shots and scenes might look like, just to mention one example. In the case of VR, video game technology is essential and arguably the main reason for its existence and resurrection in the first place.”

Nick Seavert, Founder & CEO, JangaFX
“I got my start by modding Half-Life 2, and that introduced me to the world of visual effects in games.



I eventually pivoted my life towards entrepreneurship and startups, and luckily, I was able to combine my deep passion for VFX and business with JangaFX. I saw a big need for real-time tools in the market and decided to finally launch JangaFX in early 2016, after a few failed startup attempts with other businesses. I never had any software development experience prior to this and hired the right team to ensure that we were successful in developing these tools. One thing that I did have going for me was that I knew exactly what VFX artists in the games industry needed, and we built tools to suit them.
“Real-time tools are definitely the future, but it's hard to say if hardware is keeping pace. If we want mass adoption of these technologies, and if we want higher fidelity without any hitches, we're going to need terabytes per second of memory bandwidth from GPU manufacturers – an RTX 3090 offers only 936GB/s. With this, you can only push so much sim data around the GPU, and both this and VRAM eventually become the bottlenecks for high-resolution simulations.
“I honestly got tired of waiting 12-plus hours for simulations and renders in other tools and figured that this was a problem that I could solve by putting everything onto the GPU. I felt like other companies were content with the speeds they were getting and were happy to just have market dominance. The fact that these tools were also prohibitively expensive for individual artists made me want to change the way software was sold in the industry. Our tools are easy to learn and give you results within minutes of opening the tool for the first time.
“As video game technologies increase in fidelity [e.g. Unreal Engine 5], we will see them used more often in film and TV. VR of course is already using game technology. The days of having massive post-production crews for visual effects may eventually simmer down, and we may see more on-set artists during filming. I'd love to see production times for visual effects dramatically reduced for film. Time is money, and unfortunately a lot of VFX studios in film close down due to under-bidding on projects. Pipelines composed of real-time solutions will give these studios a better chance at becoming stable enterprises.”


Kim Libreri, CTO, Epic Games
“The way that we work is to create technology that allows creatives to tell a story in different ways. Photogrammetry has become more important, but making great photogrammetry assets is still quite hard. We got to know the Quixel guys, and their vision is similar to ours. It's like, ‘Let's scan the world and take the drudgery out of making photorealistic components.’ It seemed like a natural fit [to make them part of the Epic Games family]. The same goes for acquiring 3Lateral and Cubic Motion. Making awesome digital humans is hard and expensive to do as a bespoke pursuit. It's all an effort to try to democratize making content for games and to avoid having our customers put a lot of effort into the stuff that they shouldn't have to be experts in.
“We have some new tech in Unreal called Nanite, which is a microgeometry system that allows you to get movie-quality-level assets running in real-time. Even though it looks like you're rendering billions and billions of triangles, the engine is doing a clever switch between different resolution assets.




“You’re going to see quite the revolution in terms of how believable dinosaurs, creatures and muscles can be in video games by using a deep learning approach to surface deformations as opposed to trying to do the full simulations. Deep learning is going to play a big part of helping to bridge the gap between pure computer graphics and photography.” —Kim Libreri, CTO, Epic Games

TOP TO BOTTOM: An example of 2D effects work created by Jason Keyser. (Image courtesy of VFX Apprentice) Heartseeker Varus skin created by Jason Keyser for League of Legends. (Image courtesy of VFX Apprentice) The final still of “Fire and Water,” which is part of the branding for VFX Apprentice. (Image courtesy of VFX Apprentice)

flexible about what we’re rendering in any particular frame. We have a light baking system called Lightmass that runs on a GPU, and making it as close to a production-quality ray tracer as you can have is important to us, and being able to preview that within the volume. People should have the right to change the sun’s angle if they want to from shot to shot. We’re investing heavily in fast previews of your lighting so that you can get the best possible result with a minimum result of turnaround time, so that you can live in the moment and not have to go back to your art department and ask them to make changes. “If you want to make a digital human with a fully articulate muscle, flesh and fat system with skin tension, that is computationally a horrendously complex and expensive thing to do. But deep learning gives us the ability to train a computer to be able to fake it by looking at enough input data of, ‘Here is the low-resolution and high-resolution version of this.’ Deep learning can start to work out how to add detail. You’re going to see quite the revolution in terms of how believable dinosaurs, creatures and muscles can be in video games by using a deep learning approach to surface deformations as opposed to trying to do the full simulations. Deep learning is going to play a big part of helping to bridge the gap between pure computer graphics and photography. “Right now, we’re happy with the first release of MetaHuman Creator, which will continue to evolve. We want to get cloth and flesh simulations in there as well as have more of a variety in the faces that you can make. At some point we’ll have it so you can make likenesses of yourself or whoever you want to be able to do in a more intuitive way. You can make an infinite amount of faces already; however, for the bodies you’ve got a bunch of pre-sets. We want to be able to represent the whole of humanity in terms of what you can make for bodies over time. “It’s gotten to the point where a real-time engine like Unreal has a lot of generic capabilities that will benefit many industries. We have this tool called Twinmotion built on top of Unreal that is compatible and can export content to Unreal Engine, but it has a user interface focused on architectural visualization. I do think that we’re heading towards a metaverse. Fortnite is this melting pot of IPS, and eventually there will be different art styles in there, not just the animated look. You’ll start seeing more photorealistic stuff over time. A lot of people hang out in Fortnite to meet with their friends, especially during COVID-19 when they couldn’t get together. What a great way to meet your friends by playing a game or experiencing a concert. “The most important thing is storytelling, and what real-time technology does is speed up your iteration time. A bad idea can stick around for a long time in visual effects. Wouldn’t it be great to know that earlier and actually have time to correct it? The LED walls make such a difference as the actors actually feel that they’re in a place now. All of this technology is helping filmmakers craft something that can tell the best story and in a way that allows them to iterate quicker and focus on the stuff that is working as opposed to being forced into accepting whatever was the first idea.”



FILM

TIME-TRAVELING FOR LOST LOVE IN THE FUTURE RISING-WATER WORLD OF REMINISCENCE By TREVOR HOGG

Images courtesy of Warner Bros. Pictures, except where noted. TOP: The reminiscence machine was affectionately referred to as ‘The Hamburger,’ with visual effects providing the top portion and cables. OPPOSITE TOP: The final look of the hologram was more like beads of light that are volumetric. OPPOSITE BOTTOM LEFT: RISE was responsible for submerging parts of Miami. OPPOSITE BOTTOM RIGHT: Half-sunken landmarks such as bridges and buildings help to convey the height of the water.

After stepping behind the camera for an episode of Westworld, which she co-created with Jonathan Nolan, Lisa Joy makes her feature directorial debut with Reminiscence, set in the near-future where war and rising water levels caused by climate change have ravaged the world. Private investigator Nick Bannister (Hugh Jackman) operates a machine that allows his clients to access lost memories, and becomes obsessed with solving the disappearance of femme fatale Mae (Rebecca Ferguson). Assisting their Westworld colleague are Cinematographer Paul Cameron (Collateral) and Production Designer Howard Cummings (Side Effects), with Visual Effects Supervisor Bruce Jones (The Italian Job) being a new addition to the team.

A hybrid of different genres, Reminiscence is set in Miami, though principal photography primarily took place in New Orleans. "There wasn't a lot to reference," notes Cameron. "The important thing is to get the cues for Lisa [Joy]. It's 2050 and the water comes up three stories. It's too hot to be outside in the daytime, so people work at night. It's not blatantly dystopian in any kind of Arthur C. Clarke or Ray Bradbury way. It's like we had gone through certain digital stages of our lives and have gone back to an analog reality."

Joy was open to collaboration. "What I found to be refreshing about Lisa was she had a definite point of view and knew what she wanted, but when we actually went to look at locations, I could ask, 'What do you think about this place?'" states Cummings. "[For example], we ended up shooting in an abandoned Six Flags amusement park called Jazzland, and that wasn't in the script. What she had written in the script was something akin to an underground tunnel. Because it was such a beautiful ruin of a place, Lisa reshaped the script to make it fit."

An emphasis was placed on practical effects, including using actual water for the submerged city scenes.



"There is a wonderful opening shot that takes us right off the ocean where we see waves crashing, starts to rise up, and we see the whole opening of this city partially submerged in water. It's a 3,000-frame shot with multiple levels of composites, and it's almost all made of whole cloth. We rebuilt Miami completely digitally and raised the water 30 feet. Then we added bridges to places. Everybody gets around in their little boats."
—Bruce Jones, Visual Effects Supervisor

"Any set that had water in it looked amazing!" enthuses Cummings. "Lisa had this vision that after becoming flooded, Miami turned into Thailand where people have adapted to living on the water. Next to the amusement park was a boardwalk, and we turned that area into the water market. We had to rewire everything and add lighting as if it was still operational. On the other side of the park was what used to be a recreation of Bourbon Street. I had to take what was meant to be specifically New Orleans architecture, and all of the ironwork and balconies, and turn it into Art Deco South Beach, which is

completely different. Our special effects department, run by the father-son duo of Pete and Peter Chesney [Vice], figured out a way to build dam walls so we could flood this entire street. We added bridges and floating docks. Parts of the movie were shot in Miami to tie it to things that we were shooting in New Orleans.” In order to have Hugh Jackman interact with the reminiscence projection in a believable manner, Cameron used a past experience to come up with a solution. “There is a material called Hologauze used for rock concerts, theatrical events and industrial presentations that you project onto it. Once we had the set design for where


TOP: Colleagues Watts (Thandiwe Newton) and Nick Bannister (Hugh Jackman) help people relive their memories for a cost. BOTTOM: A proud moment for Paul Cameron occurred when an electrician took the initiative to light a disused Ferris wheel that provided great dramatic effect. (Photo: Ben Rothstein) OPPOSITE TOP TO BOTTOM: Facial capture was done of Rebecca Ferguson in anticipation that she would have to leave for another project before principal photography was completed. Even though the story is set in Miami, principal photography primarily took place in New Orleans. (Photo: Ben Rothstein) Rebecca Ferguson and Hugh Jackman talk to Lisa Joy, who made her feature directorial debut on Reminiscence. (Photo: Ben Rothstein)

the machine and tank would go, I had to go through and figure out roughly where Hugh Jackman or Thandiwe Newton would be on any given scene and what their relationship would be to the projection during principal photography. Not only did I have to shoot the scenes on location, but I also had to shoot the projection angles for projection in Nick Bannister's office, which happened towards the end of the schedule. You've got to know your lens, distance, height and, of course, the machine is off the ground a couple of feet. It's a combination of doing the math and saying, 'This feels right.' There were three 20K projectors rigged on set."

The requirements of the storytelling led to the creation of approximately 600 visual effects shots. "The reminiscence machine was Scanline VFX, the water was Rise FX, and Hollywood VFX did simple composites and paint-outs," remarks Jones. "More important than anything for me is key art. We had great key art that gave us a visual palette to work with. Then we did a fair amount of previs and techvis because some of our shots were complicated."

A lot of conceptualizing went into devising the look of the reminiscence machine, Jones recounts. "Howard built the bottom portion. We had to build the top part and the cables. With all of these things, we want to make sense of it. How does the machine work exactly? Are there electrons being positively charged by somebody's memory? There was a lot of research that went into it that you don't actually see on the screen." The reminiscence machine was affectionately referred to as 'The Hamburger,' he adds. "It has a cyberpunk sci-fi look and an hourglass shape. A 24-foot diameter round pedestal sits up on the floor. Then there is another disk that sits on top, and the two grow out. The final look of the hologram was more like beads of light that are volumetric.

"There is a wonderful opening shot that takes us right off the



ocean where we see waves crashing, and starts to rise up, and we see the whole opening of this city partially submerged in water,” states Jones. “It’s a 3,000-frame shot with multiple levels of composites, and it’s almost all made of whole cloth. We rebuilt Miami completely digitally and raised the water 30 feet. Then we added bridges to places. Everybody gets around in their little boats.” Digital set extensions appear in a number of shots. Observes Jones, “A lot of being able to extend a shot and making it feel like it goes for miles involves studying atmosphere and how black levels of buildings become milkier and bluer as they get further off into the distance. We distressed all of the buildings, we added vines and busted out holes in roofs. The villain lives in an old industrial plant that is half underwater. There are rope bridges between things. Life goes on. People just figure it out. “Our hero takes the train from Miami to New Orleans, and we wanted to use that as an opportunity to show the water has risen so much that even a bridge is now about a foot underwater,” remarks Jones. “The CG train was modeled on the 1940s style of heavy metal engines with more of a modern twist because it’s a parallel reality that has occurred. We looked at Spirited Away, which is a classic Japanese animation film that has those types of visuals. Howard built a train set for us so we could put our actors inside of the car with bluescreen out of the back for the CG water world. Broken and half-sunken windmills allowed us to cut between interiors and exteriors that boom up to show the train going off in the distance.” Originally, the plan was to have more shots with digital doubles. “For the most part we were able to get away with just the stunt work and the angles,” remarks Jones. “This was a balance of a film noir detective sci-fi love story. There were a number of action


“Our special effects department, run by the father-son duo of Pete and Peter Chesney, figured out a way to build dam walls so we could flood this entire street. We added bridges and floating docks. Parts of the movie were shot in Miami to tie it to things that we were shooting in New Orleans.” —Howard Cummings, Production Designer

TOP TWO: Scanline VFX combines plate elements to make it appear that the holographic image of Mae (Rebecca Ferguson) is lifelike and shares the same space with Nick Bannister (Hugh Jackman). (Images courtesy of Scanline VFX and Warner Bros. Pictures) BOTTOM TWO: Concept art of Nick Bannister (Hugh Jackman) interacting with the holographic image of Mae (Rebecca Ferguson) and the final cinematic version. (Images courtesy of Scanline VFX and Warner Bros. Pictures)

sequences, and Hugh is such a great action actor. He just handled it. Our villain jumps from a four-story building literally 20 to 35 feet down to a lower building. We had Hugh on wires on an eight-foot piece of steel deck with bluescreen. At one point in the air, because it was one continuous shot, we took that over as a digital double. We had avatars of all of our key actors, but we didn't have to blend to them often."

Eels were recreated in CG, he adds. "Hugh confronts a drug lord who has these fish tanks filled with eels. The real eels never left the bottom of the tank because of being timid, so we made some hybrid futuristic eels that stick to the face of Hugh. He had some cues and we reverse engineered the animation."

Not everything happens above water. "Hugh knocks down the top of the piano lid and it gets somehow caught on Cliff Curtis' arm," remarks Jones. "The floor breaks free and we do a cut to a mostly CG shot of the piano going down in the distance in this dark, gloomy and even bigger auditorium. Hugh can't let Cliff go because he'll never uncover the mystery of what happened to Rebecca Ferguson. We literally only had six feet or so to show them sinking underwater from below. We would hook up our stunt players, or in this case Hugh and Cliff, to a cable that they would hold onto with their arm and we would pull them across a couple of feet below the surface back and forth in the pool. Then we rotated that in post and added a CG piano to make it look like they were sinking 30 feet in this gloom. Rebecca is revealed in the light like an angel and Hugh starts to swim towards her, gets back to the surface and is saved. We had Rebecca do a series of poses, faces and mouthing some of the lines to Hugh in anticipation that she would have to move on to another show. The CG scan of her face tracked perfectly onto the double who was underwater and backlit."

Jones has particular fondness for the train shot. "That was fun! Whenever we had a chance, we wanted to show off our world without being overly didactic and keeping it in the story. The whole holographic world that we built is something I would hope that effects nerds can enjoy. It feels like you're interacting and yet it's not real. The underwater fight is a neat sequence. We did it in such a way that it tells the action of the story, you feel the tension, and it is open enough that you believe that they really did fall into this giant 40-foot auditorium where things are rotted and seaweeds are growing."

Cameron has a certain philosophy. "Once onboard," explains Cameron, "I establish a relationship with the production designer and visual effects supervisor immediately, because as you're conceiving a film with a director you want to make sure that everyone is on the same page with the look and feel. What are going to be the big CG shots? What is that going to feel like? How much of a set do we need to build? How will the color palette translate to wardrobe? We started getting some concept drawings done of what Miami might look like in 2050, and when Lisa said, 'That's great,' we all knew what to do."

Cummings enjoyed the collaboration that went into making Reminiscence. "I remember standing on set when Hugh Jackman stepped out on the bridge for the first time and said, 'Wow. This is one of the most beautiful sets I've ever seen.' That was most gratifying. It was such a team effort for everybody."
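Jones's earlier note about black levels becoming "milkier and bluer" with distance is classic aerial perspective, and it reduces to a simple blend toward a haze color driven by depth. The sketch below is only an illustration of that general idea – the haze color, density and sample values are assumptions, not RISE's actual setup.

```python
# Minimal aerial-perspective sketch: lift colors toward a bluish haze with distance.
# Haze color, density and sample values are illustrative assumptions only.
import numpy as np

HAZE_COLOR = np.array([0.62, 0.70, 0.80])   # assumed milky-blue haze
DENSITY = 0.004                              # assumed extinction per meter

def apply_haze(rgb, distance_m):
    """Beer-Lambert style falloff: far objects drift toward the haze color."""
    transmittance = np.exp(-DENSITY * distance_m)
    return rgb * transmittance + HAZE_COLOR * (1.0 - transmittance)

dark_building = np.array([0.05, 0.05, 0.06])
for d in (50, 500, 2000):
    print(d, apply_haze(dark_building, d))
# The farther the building, the more its blacks are raised and shifted blue,
# which is what sells a digital set extension as stretching for miles.
```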





VR/AR/MR TRENDS

LIVE VR THEATER: NEXT-LEVEL INTERACTIVITY MAKES VR EVEN MORE IMMERSIVE – AND MORE FUN By CHRIS McGOWAN

TOP: Concept art of the hawker, Jack’s mother’s house, and the swamp where Jack’s mother has thrown the seemingly worthless magic beans in Jack: Part One. (Image courtesy of Baobab Studios) OPPOSITE TOP: In the lair of the giants, Jack (you) must avoid becoming a giant’s supper as you search for the golden harp in Jack: Part One. (Image courtesy of Baobab Studios)

In Baobab Studios’ VR theater experience Jack: Part One, you don a headset and step into the shoes of the titular possessor of the magic beans. A physical stage transforms into a fantastical world above which looms a formidable giantess (voiced by Lupita Nyong’o). You roam about, grasp objects, and talk to a character played by an actor. You are part of the story. You bring the beans back home to your family, and your mother becomes furious and tosses them out a window. At night, a giant beanstalk tears the roof off the house, and you see a night sky and feel a cool breeze. You fly above the clouds on the wooden floor and enter the realm of the giants – you must evade them and bring back a certain magical golden harp. With tracking markers, motion capture and other modern tricks, the 1734 English fairy tale Jack and the Beanstalk has sprouted into something unprecedented. So too has virtual reality. Just as VR has added immersive qualities to both films and video games, it’s also doing so for theater in various interactive experiences. Baobab’s Jack: Part One, DVgroup’s Alice: The Virtual Reality Play, Adventure Lab’s Dr. Crumb’s School for Disobedient Pets and Double Eye Studios’ Finding Pandora X are expanding the boundaries of storytelling by combining virtual reality with immersive theater and elements of escape rooms and video games. All the above experiences include live performances by real actors. “It feels like a mix of some things that we know and love and then there’s something else that feels more immersive. It’s like being inside the play,” says Kiira Benzing, Executive Creative Director for Double Eye Studios, about Finding Pandora X. There is much more interest now in immersive experiences, according to Adventure Lab COO and Co-founder Kimberly Adams.



"The more immersive and interactive the better," she states. "Before the pandemic, we saw a huge rise in escape rooms, immersive theaters and location-based VR companies like The Void, Sandbox and Dreamscape. Millennials and Gen Z were also hanging out digitally on platforms like Fortnite, Minecraft and Roblox. Now, digital immersive has gained a lot of ground. During the pandemic, there was a massive and almost instantaneous shift in cultural behavior – we all learned how to live, work and play on digital platforms like Zoom. Next Gen has a growing desire to interact with their entertainment and have it interact back. That's why we are seeing platforms like Twitch, Discord and Clubhouse take off – because of that feeling of intimacy and interactivity. We see a huge opportunity for a digital entertainment platform that creates connection through social play, performance and magical technology."

One path for live VR theater is to go fully remote – both you and the actor(s) can log in from anywhere, but you have to register for a show with a set time. Another path is for the experience to be a type of location-based VR with mapped physical sets that enable the audience to move about a stage and interact with an actor(s) and physical objects. In some cases, sensory stimuli further enhance the reality of the experience. Mathias Chelebourg, who directed Jack: Part One, has so far taken the second path for VR theater. "If, like me, you think of VR as a parallel world where your body is fully [immersed] in the narration, then you'll look for every trick available to address the senses. To date, fixed, crafted set design with embedded haptic gears, where you can simulate smell, touch and even taste, is definitely the optimal approach because it's the most organic way

to make you feel the environment. Physicality in the context of live performance is to me the most efficient way to trick the viewer into believing he belongs to the fictional VR world."

Baobab Studios, an interactive animation studio that has won six Emmys, produced Jack. Mathias Chelebourg also directed Alice: The Virtual Reality Play (co-created by Marie Jourdren) and the VR experiences Doctor Who: The Runaway (from the BBC) and most recently Baba Yaga, also from Baobab. Chelebourg is based in Paris where he has his studio, Atelier Daruma.

Both Jack and Alice engage the senses. In Alice, participants are even handed a meringue to eat. "Immersion is a tricky equilibrium. I think that's the main difference between directing for traditional shows and directing for immersive content," says Chelebourg. In VR one needs to take extra care about the amount of sensorial stimuli you give the viewer at any given time. "Too much haptic feedback and you drive them away from the story, not enough and the magic will start to fade. But let us be honest, the most immersive part of Alice is the twisted play with the live performer. This connection is worth all the tech you can think of."

Yet, tech is fundamental. "It takes a lot of tech to make you forget there's any!" says Chelebourg. He used Unreal Engine and Unity "extensively" to support his shows. "Real-time engines are now at the core of all my creative process, with the companionship of a galaxy of handmade tools and custom software to merge the magic together."

One example of practical tech in Jack was the set, which is "pretty massive. A fully movable pneumatic plate was built to create the ground rumble. I think the most immersive aspect of it was how far we pushed wireless motion capture to not only


TOP: Deep in Dr. Crumb’s lab in Dr. Crumb’s School for Disobedient Pets, which combines features of immersive performance, an escape room and a role-playing game. (Image courtesy of Adventure Lab) BOTTOM: An immersive environment in Dr. Crumb’s School for Disobedient Pets, which requires a VR headset to explore. (Image courtesy of Adventure Lab)

track the actors and the viewers, but also every single prop and set element that you can then lift, move around and play with, in the context of the narration. We also designed custom smells with a nose expert, heat and cold cannons, huge fans for wind blowing, etc.” Up until today, he comments, “Jack really is one of the most complete and insane multi-sensorial VR experiments [ever] attempted.” Chelebourg thinks there is also a place for live VR theater where audiences participate remotely and do not share the same physical space as the play’s actors. “It’s here to stay! And it has proven powerful, especially [last] year when we spent half our time interacting remotely inside VR worlds for festivals, events and performances. We also held Baba Yaga’s premiere in VRChat [the social VR platform] with cast and press. I’m curious about it, I love to experiment with it [live remote VR theater].” Live VR theater can be experienced at home or it can be on a set. Either way, at the moment it requires a great deal of flexibility on the part of the cast. Actors played multiple roles in both Jack and Alice. “It’s the beauty of live mocap performance,” says Chelebourg. “It’s at the same time part puppeteering, part traditional presence on stage. In Alice the actor plays three different roles, and in Jack the actress embodies two drastically different characters. I was very meticulous in casting performers with voice-shifting talents and mime-playing notions because to me that was key to make it work.” Improvisation is a key element in this new field of live VR theater. In Jack, a character would engage the audience in conversation. “What surprised me was the sheer variety of reactions!” Chelebourg comments. “Jack was, like Alice, based on a pretty



linear backbone script, and I am very careful how much room I give to improvisation. I want the viewers to feel free while they are following the invisible path designed for the story. But even with all the control you think you have, so many beautiful things happen, I had viewers dancing, singing, battling over Shakespeare poems with actors. And in the end the only conclusion I could draw is that it is complete madness to think you can anticipate all the possible behaviors in such a narrative device!” He adds improvisation “is not necessary, but it’s a powerful tool to tailor the experience for each viewer. I love to write around this constraint and work with actors on their side script and characters so they’re geared up to face most situations in there. The way I usually work is by starting to write a strong linear script the way I would for a traditional play. I then identify the best [moments] for controlled improvisation to happen. Then life finds a way!” Adventure Lab’s Dr. Crumb’s School for Disobedient Pets, with its mixture of immersive theater and other elements, also includes a good amount of improvisation and is built around “bringing the escape room experience into your living room” with the help of VR and elements of role-playing games. It is experienced at home and features a live actor. Adventure Lab CEO and Co-founder Maxwell Planck adds, “Currently, our performers spin up their shows from the cloud at showtime and host their guests in VR from their homes. Guests can be anywhere in the world. We have run shows for over 1,000 people in 18 different countries, and we can run many, many shows simultaneously.” So far, the remote approach with actors makes larger audiences possible, as compared to fixed-location VR theaters with actors. Dr. Crumb’s requires signing up for a showtime. Notes Planck,

TOP: Paw pugilism in Dr. Crumb’s School for Disobedient Pets. Characters come to life thanks to a live improv performer. (Image courtesy of Adventure Lab) BOTTOM: Dr. Crumb looms large in Dr. Crumb’s School for Disobedient Pets, which is played by two to eight players in VR anywhere in the world at a scheduled time. (Image courtesy of Adventure Lab)


TOP: Cory, your guide, will take you to this light forest where each pillar of light is dedicated to a particular Greek god in the VR theater experience Finding Pandora X. (Image courtesy of Double Eye Studios) BOTTOM: Guests consult with the Oracle during their VR experience. To create the imaginative world of Finding Pandora X, Unity and VRChat were both fundamental. (Image courtesy of Double Eye Studios) OPPOSITE TOP: The Temple is part of this virtual world based on Greek mythology. Finding Pandora X won the award for Best VR Immersive User Experience at the 2020 Venice International Film Festival. (Image courtesy of Double Eye Studios)

who worked for 10 years as a technical director at Pixar and was a co-founder of Oculus Story Studio, "When you launch our app and enter the world of Dr. Crumb's School for Disobedient Pets, you are greeted by an in-game character performed by a real person. Their job is to be your host, your guide, your ally and your nemesis. You can see how we're combining all of the elements of an escape room clue master, D&D dungeon master and improv performance. In the end, it's not about whether the players or the host wins. The host is incentivized to just make sure everyone has a blast."

Adds Adams, a VFX producer and a producer at Pixar Animation Studios prior to Adventure Lab, "After a lot of trial and error, we landed on a good balance of scripted and improv interactions with the performer; enough narrative structure so folks know what the world is, who they are in relation to it and what is expected of them, and enough improv to make it feel special and unique. Ultimately, people really want to feel seen and heard."

When the performer has too much dialog and explains something for too long without any back-and-forth with the audience, the latter goes quiet. "Growing up," Adams explains, "we learn that when an actor is performing, we are expected to sit back and be quiet in order to be a good audience member. That's the opposite of what we want for them in our experience! They need to feel safe, comfortable and be good verbal communicators as they interact with the performer and their team. If we have done our job right, within five to 10 minutes they should feel confident and like they have a license to play."

"It's getting easier and easier for VR developers to make content," Planck says. "We use Unity, Maya, Blender and Substance. For our cloud infrastructure, the space where the master client is running, we're leveraging several AWS services



including S3, EC2, SQS, Lambda and API Gateway. For our web app tools that allow our hosts to schedule and operate their own shows, we're using AWS's rich Amplify tools, building on ReactJS, DynamoDB, Cognito and GraphQL."

For Finding Pandora X, "Unity is definitely the backbone. Unity, VRChat, and originally we also worked with this platform LiveLab for the live-streaming aspect," says Double Eye Studios' Kiira Benzing. The immersive live VR theater experience Finding Pandora X puts the audience in the role of the Greek Chorus, and they interact with real actors as all seek to find Pandora and her box of hope. Finding Pandora X won the award for Best VR Immersive User Experience at the 2020 Venice International Film Festival.

Hewlett Packard helped the Finding Pandora X production with VR headsets and workstations, and technical and financial support, according to Joanna Popper, HP Global Head of Virtual Reality for Location Based Entertainment, who served as executive producer. She notes, "HP is proud to be working with Kiira Benzing and Double Eye Studios on Finding Pandora X. Their award-winning, groundbreaking storytelling has proven to captivate and connect global audiences."

Benzing emphasizes that the Pandora VR theatrical experience is quite different from traditional theater. "When I am rehearsing my actors in their avatars, and I am in an avatar, there are elements that feel the same [as traditional theater], and then there are elements that feel utterly new, such as flying to our 'places' at the top of the show." In Finding Pandora X, one moves throughout the production, teleporting through a vast landscape as the story progresses. When the story branches, one chooses which quest one will embark on. Both quests have puzzles, but one quest is built

on more logic-based puzzles and the other on more action-based puzzles. Each player is considered "a member of the story," explains Benzing. "In this production, the Greek Chorus reveals important information to the main characters. Another key theme in our work is the principle of collaboration. The members of our Greek Chorus have to "collaborate with each other to solve tricky situations during the quest part of the storyline. With that role comes responsibilities, from speaking at key moments, to solving puzzles, to aiding the main characters. It's quite immersive but also different than immersive theater." For example, one can defy the limitations of physical reality, such as by flying; another is that you can interact with narrative objects.

Finding Pandora X has hosts in the "Cloud Lobby" to guide users after they arrive, Benzing explains. "Two of our characters, Hermes and Iris, are played by actors from our cast, and they interact with you to help you familiarize yourself with menus and controls. But they do this while improvising with you and peppering in commentary about the storyline and the main characters. It has been a delight to get our actors so technically savvy that they can help to onboard the audience and also give them story clues."

"As we keep working on the narrative and structure of these shows," observes Benzing, "I feel like it keeps revealing new things to me. We are taking risks in the narrative. I am sure some people don't understand what we are doing. But by trying something new it also feels like we are getting somewhere – we are on our way to developing XR narrative and XR storytelling. I believe [the] unique blend of interactive, immersive theater and VR could be considered a new art form. It feels daring to say so, but it also feels regularly, mind-bendingly different."
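The stack Planck describes – Lambda functions behind API Gateway, queues, and a "master client" spun up in the cloud at showtime – follows a familiar pattern. The sketch below is only an assumed illustration of how a showtime booking might be queued for such a runner; the queue URL and message fields are invented, and this is not Adventure Lab's actual code.

```python
# Hypothetical sketch of queuing a show session for a cloud-hosted show runner.
# Queue URL and payload fields are invented for illustration only.
import json
import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/show-requests"  # placeholder

def handler(event, context):
    """Lambda-style handler: accept a booking request and enqueue it."""
    booking = json.loads(event["body"])
    sqs.send_message(
        QueueUrl=QUEUE_URL,
        MessageBody=json.dumps({
            "show_id": booking["show_id"],
            "start_time": booking["start_time"],
            "players": booking["players"],   # e.g. two to eight remote headsets
        }),
    )
    return {"statusCode": 202, "body": json.dumps({"queued": True})}
```

In a setup like this, a worker process (the "master client") would poll the queue and launch the shared world shortly before curtain, with the live performer connecting from home.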



TV/STREAMING

EFFECTS TEAM ADDS THE BIZARRE AND SURREAL TO SUPER-CRAZY DOOM PATROL By TREVOR HOGG

TOP: Robotman (Brendan Fraser), Crazy Jane (Diane Guerrero), Cyborg (Joivan Wade) and Elasti-Girl (April Bowlby) spend some quality time with one another. (Photo: Bob Mahoney/HBO Max) OPPOSITE TOP: Elasti-Girl (April Bowlby) extends her fingers to dramatically increase her reach. (Image courtesy of HBO Max)

No strangers to the DC Universe on the small screen are Encore Hollywood Creative Director & Visual Effects Supervisor Armen Kevorkian and Encore Hollywood Visual Effects Coordinator Gregory Pierce, who have worked together on Supergirl, DC's Legends of Tomorrow, The Flash, Titans and now the third season of Doom Patrol. A group of misfits who have acquired superpowers through tragic circumstances are brought together by Dr. Niles Caulder (Timothy Dalton) to form a dysfunctional family that battles the forces of evil with unconventional methods. Each of the 10 episodes has approximately 100 to 200 visual effects shots, created by Encore VFX and divided among facilities in Burbank, Vancouver and Atlanta based on scenes and expertise. The second season was shortened to nine episodes because of the coronavirus pandemic, which had a ripple effect as the narrative carried over into Season 3, along with the need to work remotely. Kevorkian notes, "It's a different world for how we shoot things, but it was nice to get back out there."

Outrageous storylines and characters are a trademark of the action-adventure-comedy. "One of the things that make Doom Patrol different is that it's creative in terms of how it's written, how the characters are developed and the situations that they find themselves in," remarks Pierce. "All of the effects work that we have to do has to be photoreal because this is a photoreal show, but at the same time there is a lot of surrealness to it. A lot of times with our visual effects work, we don't know exactly what this will look like because it never happened in real life. But we still have to ground it within the scene. Some of the scenes are clearly written with an idea in mind as to how they should look, and those are related to us during the prep meetings before shooting. Other things we have more freedom to experiment with and try new things."



"All of the effects work that we have to do has to be photoreal because this is a photoreal show, but at the same time there is a lot of surrealness to it. A lot of times with our visual effects work, we don't know exactly what this will look like because it never happened in real life. But we still have to ground it within the scene. Some of the scenes are clearly written with an idea in mind as to how they should look, and those are related to us during the prep meetings before shooting. Other things we have more freedom to experiment with and try new things."
—Gregory Pierce, Visual Effects Coordinator, Encore Hollywood

Ideas from the comic books are reimagined to fit into the real world. "The writers come up with the kookiest ideas where you go, 'Holy shit, we're going to do that!'" laughs Kevorkian. "Jeremy Carver [Showrunner] and Chris Dingess [Executive Producer] are great to collaborate with to bring those things to life."

Over the three seasons, there are not many recurring visual effects, reveals Kevorkian. "One of the things that has remained the same since Season 1 is the negative spirit, the entity that lives inside of Larry Trainor [Matt Bomer] and comes out, because it tells the story that it needs to, and looks good visually," states Kevorkian. "We do have the shield that Cyborg [Joivan Wade] brings up, and the arm cannon when it reveals itself and goes away. Once you do something like that it's the same methodology but shot in a different way. There are digital doubles for all of the characters. When we can't do something practically for Robotman [Riley Shanahan] then we'll do a CG version."

Encore VFX is currently using a pipeline that utilizes Houdini and RenderMan. "We've landed on different and new technologies, especially for our CG characters," notes Pierce. "We're doing a lot more simulations on them. We're trying to push the boundaries for what we can do to add some dynamics to these characters. During the end of last season and into this season we've been doing more motion capture, in particular for [new character] Monsieur Mallah as it helped to give him more of a presence and keep him grounded in a scene."

April Bowlby portrays actress Rita Farr, otherwise known as Elasti-Girl. "We still do face drooping when she feels nervous and is not in control of her power," remarks Kevorkian. "We've done a few gags where she is able to stretch out to confront someone or grab something from a distance. The general idea of what is happening to Rita is the same – it's just putting her into different situations. Some of the stuff that she will be doing this season like manifesting into her blob self is a completely different methodology. It was challenging to make it seem seamless, but was also fun to do something different with that character.


“In Season 1, we did butt monsters, which are butts with arms, and they’re making a comeback this year in a bigger sequence. I’m excited to bring them to life because that’s not something you get to do every day. The butt monsters speak to the comedic factor of the show along with being terrifying as well. It’s the most fun that we had animating and coming up with different things that they’ll be doing.” —Armen Kevorkian, Creative Director & Visual Effects Supervisor, Encore Hollywood

TOP: When it comes to Cyborg (Joivan Wade) and Robotman (Brendan Fraser) there is a heavier reliance on practical effects for their characters. (Photo: Bob Mahoney/HBO Max) BOTTOM THREE: The plate photography, animation and final lighting of the Moonscape encounter in Episode 301 between Dorothy Spinner (Abigail Monterey) and the Candlemaker (Lex Lang). (Images courtesy of HBO Max and Encore VFX)

None of it is procedural. If she turns into a blob in a container, she comes out and forms into a person that takes a lot of blend shapes and morphing. It's figuring out how to go from CG cloth simulations to real cloth. The biggest challenge has always been Rita doing something that we have to reimagine how we're going to do it. If I send back shots at all they're usually Rita shots."

"Dorothy Spinner [Abigail Shapiro], the daughter of Dr. Niles Caulder, has these imaginary characters that she manifests which have to exist in the real world, such as the Candlemaker [Lex Lang]," explains Kevorkian. "He premiered last season and there is a little bit of him in Season 3. In Season 1, we did butt monsters, which are butts with arms, and they're making a comeback this year in a bigger sequence. I'm excited to bring them to life because that's not something you get to do every day. The butt monsters speak to the comedic factor of the show along with being terrifying as well. It's the most fun that we had animating and coming up with different things that they'll be doing. There was one moment where the Doom Patrol interacts with one or two of them. We sent our model to props, which did a 3D print of one so that the actors could touch it."

Introduced is the Sisterhood of Dada, which consists of five bizarre supervillains with the ability to act as chaotic as the Dadaism art movement. "There is a lot that we're creating for that storyline, for example, Dada birds," states Kevorkian. "This season we also have a gorilla character from the comics called Monsieur Mallah that wears a beret and has a machine gun. There are a few images out there of him. He actually speaks and appears in a few episodes. We started from scratch to avoid any similarities with Grodd [from The Flash]. We built a model and whole new muscle and fur systems for him. The final ADR determines the animation. The facial anatomy doesn't translate one-to-one with a human, so we have to cheat certain things to make it visually correct. I was never worried about Monsieur Mallah – he looks great."

Research is guided by the scripts. "If something gets mentioned in a script, like a character, I will do research to see if it's something that really exists," remarks Kevorkian. "I will always use that as my starting point. I will go ahead and conceptualize something and send it over to Jeremy, who knows exactly what he wants and is very specific.



"We've landed on different and new technologies, especially for our CG characters. We're doing a lot more simulations on them. We're trying to push the boundaries for what we can do to add some dynamics to these characters. During the end of last season and into this season we've been doing more motion capture, in particular for Monsieur Mallah as it helped to give him more of a presence and keep him grounded in a scene."
—Gregory Pierce, Visual Effects Coordinator, Encore Hollywood

It makes our process easier knowing that Jeremy has a vision of what we're creating. If we're in sync, sometimes we get first-pass approvals. There are times when he chimes in and says, 'You need to adjust this to tell the story.' We'll have one or two rounds of notes and get there fairly quickly. That adds to the quality, because we're not going back and forth on things that slow you down."

A majority of the storyboards, previs and postvis are created by Encore VFX. "They do have storyboard artists in Atlanta to work with directors on sequences," states Kevorkian. "Encore VFX does all necessary previs and concept art of creatures. At times, some of the concepts that we have to match come from the art department because it's something they're going to build practically. Postvis is done with stunts."

An emphasis is placed upon location shooting. "The greenscreen is used for environments that do not exist and need to be created," adds Kevorkian. "Most of the world building involves digital augmentation. For example, last year we did a scene where Dorothy is on a lunar surface. The art department built part of a set for the lunar surface that had crystals which popped out of the ground. We built our own version based on that for the extensions that we did."

Visual effects have a close partnership with stunts and special effects. "Stunt Co-ordinator Thom Khoury Williams always has great ideas and works with our ideas," remarks Kevorkian. "All of those sequences usually come out the way that we prepped them so there are no surprises when we get into editorial."

Doom Patrol definitely has its own visual style, he observes. "Maybe it is a bit retro with some of the equipment that you see on set. The look of Robotman is from the comic books. He's not your shiny 21st century robot, but something you could have built in your garage. The show definitely has an older feel, so we try to stay within that same world."

Kevorkian found everything was challenging in an exciting way. "I'm excited for people to see Monsieur Mallah and the butt monsters that come back in Episode 304. Each episode is unique, so it will be different things for different people when they watch this season. Doom Patrol is not repetitive. Every season is its own thing."
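The blend shapes and morphing Kevorkian mentions for Rita's blob-to-person transformations boil down to a base mesh plus weighted sums of sculpted target offsets. The generic sketch below illustrates that evaluation with toy data; it is not Encore VFX's rig.

```python
# Generic blend-shape evaluation: base mesh plus weighted target deltas.
# Toy four-vertex mesh; the targets and weights are illustrative only.
import numpy as np

base = np.zeros((4, 3))                        # rest-pose vertex positions
targets = {
    "blob": np.random.default_rng(1).normal(size=(4, 3)),
    "droop": np.random.default_rng(2).normal(size=(4, 3)),
}

def evaluate(weights):
    """Blend toward each sculpted target by its weight (0 = rest, 1 = full shape)."""
    out = base.copy()
    for name, w in weights.items():
        out += w * (targets[name] - base)
    return out

# Animating the weights over time morphs the mesh from one shape toward another.
print(evaluate({"blob": 0.25, "droop": 0.0}))
```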

TOP: Cyborg (Joivan Wade) watches Elasti-Girl (April Bowlby) working on controlling her powers. (Image courtesy of HBO Max) BOTTOM FOUR: Ideas from the comic books are re-imagined to fit into the real world, such as the Candlemaker. (Images courtesy of HBO Max and Encore VFX)



[ VES SECTION SPOTLIGHT: LONDON ]

London Revisited: Thinking Differently to Meet the Moment By NAOMI GOLDMAN

TOP TO BOTTOM: VES London members partake in the Wellness Series, including a cooking class, essential oils tutorial and whiskey tasting.

The VES' international presence gets stronger every year, and so much of that is because of its regional VFX communities and all that they do to advance the Society and bring people together. The London Section exemplifies 'the big pivot' compelled by the global pandemic, as they developed interactive online experiences that continue to deliver education and entertainment and foster a strong sense of community among their 265-plus members.

"We were on a tremendous upswing as a Section in 2018/2019, delivering hands-on demonstrations and trainings, starting our collaboration with CAVE Academy, and hosting screenings and sponsored pub nights that attracted hundreds of people," said Gregory Keech, Co-Chair of the VES London Section. "We had a lot of momentum and buzz in the community and were poised with a roster of amazing events that we expected would draw new members and build out our Section. Then COVID hit. Lucky for us, we have exceptional Section officers, including CAVE Academy founder Jahirul Amin, who helped us adapt quickly to create a calendar of dynamic webinars."

During COVID, VES London held more than 40 webinars (and counting) with CAVE and other partners, launched as the U.K. went into lockdown in March 2020. The series has covered topics ranging from how lenses work, digital sculpting, drawing, real-time technology and on-set data acquisition to panel conversations on mental health, the state of the industry, and diversity in VFX (co-hosted with ACM SIGGRAPH London, ILM and DNEG). The Section brought in more than 50 guest speakers, including VFX supervisors, producers, department heads, directors, actors, R&D executives and FACS experts. VES London also joined in multi-Section events, such as online Speed Networking, hosted by VES Los Angeles with the European Sections (London, France and Germany), to network with people they might not have the chance to meet in person.

Some of the standout events hosted with CAVE Academy have included "Unleashing Unity for Media & Entertainment" featuring VFX training guru Ben Radcliffe; "Women in VFX" featuring senior industry professionals Debra Coleman, Harriet Edge-Partington, Marieke Franzen, May Leung, Lauren McCallum and Brenda Ximena Roldan Romerol; and "Sculpting Realistic Portraits" with Vimal Kerketta, which had 300-plus attendees – the most popular program to date. For all of these collaborations with CAVE, the Section secured sponsors and contributed to artist fees for the talks, and VES London board member Amin organized and hosted the events and created high-visibility promotions drawing hundreds of attendees.

VES London prides itself on its creative approach to programming and takes its charge to build a vibrant VFX community seriously. "Before the pandemic, we looked at hosting screenings and panels with a different spin," said Danny Duke, VES London Treasurer. "We were moving towards a big passion project, showcasing



indie low-budget films, such as our screening of Criminal Audition, and a conversation with writer/director Luke Kaile [at the MPC theater on Wardour Street] in Soho. We find that filmmakers with indie budgets that often don't allow for VFX supervisors on crew are as interested in learning from our members and forging relationships as we are. And when we can share knowledge and glean a diversity of insights into our craft, that's a win-win for our industry and the Society."

The issue of self-care and nurturing social connections has been front and center amidst the pandemic. VES London wanted to give back to their members in an engaging way where they could talk and hang out and do something physical, and so the Wellness Series was born under the leadership of Section officer Tony Micilotta. "We wanted to create unique online experiences that were interactive by design, where people received things they could touch and feel in their own homes and feel connected to their peers doing the same," said Micilotta. "Our members-only series had a course on using essential oils, a wine tasting, a whiskey tasting and a cooking class where we all made a vegetarian meal. A special component was using our budget to send VES-branded packages to each attendee – the spirit samples, essential oils, local produce and spices to use in the classes. The events were exuberant and have been a springboard to brainstorm future programs beyond screenings and pub nights. I'm so proud we reinvested in our members and created a new way for us to come together."

"Working as a team, we've turned these challenging times into a dynamic opportunity," said Keech. "It's been a challenge the last 18 months to reinforce the value of VES as a resource for our members. Joining forces with CAVE and other partners to create exclusive educational and networking events gave us the ideal platform to reinforce the benefits of membership and set us on a new course."

"Tasting and film nights are great fun," said Duke, "but it's serving up thought leadership and diversity and education, and fulfilling our responsibility for the artists of tomorrow – that's our core. If we can play a part in that and also broaden our Section's reach, that's why we're here."

TOP LEFT: VES London members and guests at hands-on training and demos session hosted in partnership with CAVE Academy. TOP RIGHT TO BOTTOM: Members and guests enjoy festive pub nights. VES London celebrating summer in style. VES London comes out for its popular screening series. VES London setting the mood for its annual holiday party.



[ THE VES HANDBOOK ]

Shooting Elements for Compositing
BY MARK H. WEINGARTNER
Edited for this publication by Jeffrey A. Okun, VES
Abstracted from The VES Handbook of Visual Effects – 3rd Edition, edited by Jeffrey A. Okun, VES and Susan Zwerman, VES

The VFX supervisor of the 21st century has access to the most complex 3D procedural solutions in order to generate smoke, clouds, flames, water, debris, and even crowds. Sometimes, however, the fastest, easiest, cheapest, and most rewarding way to get the elements is to have a special effects technician come to the nearest parking lot or stage, set up a camera, and just shoot! The costs involved in putting together an element shoot can look daunting, but compared to the timeline involved in seeing successive versions of something being built in CG, an experienced VFX camera crew supported by a competent special effects team can experiment quickly and film many different versions of an effect in a short period of time.

Types of Elements
In discussing element photography, it is useful to categorize the types of elements in a couple of different ways. One can differentiate between full-sized elements and scaled elements, and between shot-specific elements and more generic ones. These two sets of distinctions are not mutually exclusive – either miniature or full-sized elements can be destined for a specific shot. Likewise, one can shoot a library of generic elements as scale elements, full-size elements, or a combination of both.

Author with motion control legs for flame, stream and water reflection passes. (Photo courtesy of Mark H. Weingartner)

Generic versus Shot-Specific Elements
Resizing elements in the digital world is only a mouse-click away, but even with this ease there are benefits to shooting elements in a shot-specific way. In shooting an element for a specific shot, one has the advantage of being able to choose the appropriate lens, distance, framing and lighting to match the shot. Any camera position data from the original shot can be used to aid in lineup, but with elements such as water, pyro, flames, or atmospherics that are likely to break the edges of frame, it is best to shoot a slightly wider shot, which will allow the compositor freedom in repositioning the element to best advantage without bits of the action breaking the edge of the frame.

"Resizing elements in the digital world is only a mouse-click away, but even with this ease there are benefits to shooting elements in a shot-specific way. In shooting an element for a specific shot, one has the advantage of being able to choose the appropriate lens, distance, framing and lighting to match the shot."
—Mark H. Weingartner

VFX DPs often use the full area of the film frame or digital sensor in order to allow for the most repositioning without sacrificing resolution when resizing or repositioning the image in post. When it comes to shooting actors in front of a process screen, or sky, once the correct distance and angle have been worked out to ensure proper perspective, one frequent practice is to select a longer focal length lens in order to magnify the subject. As long as the subject does not break the edge of frame, this magnification yields better edge detail, which allows for easier, cleaner matte extraction. Care should be taken to avoid the perspective difference of the foreground and background becoming apparent. The intent is still to make the finished shot appear as if shot as one.

Even though there are obvious advantages to shooting elements for specific shots, once the equipment and crew are assembled, a great return on investment may be realized by shooting a library of elements with variations in size, focal length, orientation, action, and lighting. These generic elements can be used as necessary in building many different shots, whether planned for or not.
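Weingartner's point about magnifying the subject with a longer lens once the camera distance is locked (so perspective does not change) is simple thin-lens geometry. A quick sketch under that approximation, with example numbers that are purely assumptions:

```python
# Thin-lens approximation: image size on the sensor scales with focal length
# once the camera-to-subject distance (and therefore perspective) is fixed.
def image_height_mm(subject_height_m, distance_m, focal_length_mm):
    # Valid when the subject distance is much larger than the focal length.
    return focal_length_mm * subject_height_m / distance_m

# Example (assumed numbers): a 1.8 m actor at 8 m from camera.
for f in (35, 85):
    print(f"{f}mm lens -> {image_height_mm(1.8, 8.0, f):.1f} mm on sensor")
# The 85mm renders the subject roughly 2.4x larger than the 35mm with identical
# perspective, which is what yields the cleaner edges for matte extraction.
```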





[ VES NEWS ]

New Webinar Series to Destigmatize Mental Health By NAOMI GOLDMAN

“Reignite Yourself – Instruments to Face Daily Life” is a five-part webinar series from the VES and Quebec Film and Television Council (QFTC).

In July, VES and the Quebec Film and Television Council (QFTC) launched "Reignite Yourself – Instruments to Face Daily Life," a five-part webinar series around mental health, to support the worldwide visual effects industry. The free, open-source episodes feature conversations with veteran visual effects professionals and mental health professionals, providing both personal insights and expert guidance. This video series is the first initiative of the VES' new global Health and Wellbeing Committee co-chaired by Emma Clifton Perry, VES 1st Vice Chair, and Philipp Wolf, VES Montreal Chair. It is part of the QFTC's Release Your Creativity project, which is made possible thanks to the financial support of the City of Montreal and the NAD-UQAC School, as well as partner studios Caribara, DNEG, Framestore, Method Studios, Reel FX and Technicolor (through its three brands, Mikros, MPC and MR. X).

"Reignite Yourself" covers a spectrum of issues around mental health, including handling stress and anxiety, dealing with high-pressure situations, maintaining work-life balance, empathy, handling negative feedback, and creating a growth mindset through confidence-building and motivation. The episodes feature conversations with acclaimed VFX practitioners sharing real-life experiences, including Chris White (VFX Supervisor, Weta Digital), John Dykstra (Academy Award-winning special effects pioneer), Kaitlyn Yang (VFX Supervisor, Alpha Studios), Monica Lago-Kaytis (Producer/CEO, Frogbot Films) and Mark Osborne (Director, Netflix); and mental health specialists Dr. Melanie Bilbul (Psychiatrist, CHUM), Dr. Amal Abdel-Baki (Psychiatrist, CHUM), Dr. Drea Letamendi (Clinical Psychologist, UCLA Student Resilience Center) and Camille Charbonneau (Mental Performance Consultant).

The series – available in both webinar and podcast formats on https://bit.ly/VESReignite, vfx-montreal.com, Spotify and Apple Podcasts – is accompanied by resources to contact mental health professionals for assistance.

"The VES is committed to supporting the health and welfare of our members and the VFX community at large," said Lisa Cooke, VES Board Chair. "We take our role seriously in working to create a safer, healthier and more equitable environment for VFX artists and practitioners worldwide, and I'm gratified that our Society is demonstrating its leadership through our new Health and Wellbeing Committee. Developed in collaboration with our partners in Montreal, the 'Reignite Yourself' series is a meaningful step towards enhancing our shared industry experience."

"Our mental health is something we tend to ignore and not take ownership of," said Philipp Wolf, Co-Chair, VES Health and Wellbeing Committee and producer/creator/moderator of the 'Reignite Yourself' series. "It is of great significance for us to actively change this behavior and acknowledge its importance. Only then can we truly excel in this fast-paced animation and visual effects environment. We need to create a space in which it is normal to openly speak about mental health, and I hope this web series sets the stage for many conversations."



Visual Effects Society Announces Special 2021 Honorees

TOP TO BOTTOM: Mike Chambers, Rita Cahill, Gene Kozicki, Richard Winn Taylor II, VES

VES proudly announced the Society's newest Lifetime members and this year's recipients of the VES Founders Award, recognized for meritorious service to the Society and the global industry. The honorees will be celebrated at a special event in October, alongside this year's VES Fellows, Honorary members and inductees into the VES Hall of Fame (unnamed at the time of publication). Award-winning Visual Effects Producer Mike Chambers and venerated global business and marketing consultant Rita Cahill were named recipients of the 2021 VES Founders Award. The Society honored digital production manager and leading VFX historian Gene Kozicki, acclaimed creative and cinematic director Richard Winn Taylor II, VES, Mike Chambers and Rita Cahill with Lifetime VES memberships.

"Our VES honorees represent a group of exceptional artists, innovators and professionals who have had a profound impact on the field of visual effects," said Lisa Cooke, VES Board Chair. "We are proud to recognize those who helped shape our shared legacy and continue to inspire future generations of VFX practitioners."

Mike Chambers is an award-winning freelance Visual Effects Producer and Independent VFX Consultant, specializing in large-scale feature film productions. He is currently working on an untitled project for Universal Pictures and most recently worked on Tenet, his fourth collaboration with esteemed writer-producer-director Christopher Nolan. Chambers has contributed to the visual effects efforts on numerous Academy & BAFTA award-winning films, and won three VES Awards for Best Visual Effects, on Dunkirk, Inception and The Day After Tomorrow. Chambers has been a VES member for over 20 years, serving multiple terms on the Board of Directors, including six years as Chair.

Rita Cahill is an international business and marketing/PR consultant and has worked with a number of U.S., Chinese, Canadian, U.K. and EU companies for

visual effects and animation projects. Current and former clients include companies in the feature film, television, VR and game industries, as well as government and educational entities. She is also a partner in MakeBelieve Entertainment, a film development company, and Executive Producer on a number of international projects. Previously, Cahill was the Vice President of Marketing for Cinesite and a founding Board member of the Mill Valley Film Festival/California Film Institute. Cahill served six terms as VES Secretary, and as Chair or Co-Chair of the VES Summit for eight years. Gene Kozicki, recipient of the VES Founders Award, has served as a member of the Board of Directors and L.A. Section Board of Managers and was the co-founder of the VES Festival for many years. During his tenure as chair of the VES Archives Committee, he helped secure portions of personal archives of Robert Abel, Richard Edlund, VES and others, and was instrumental in organizing the VES archives and helping to secure funding to digitize many of VES assets. As a VFX historian, Kozicki is active in the archiving of information, imagery and artifacts from the visual effects industry and regularly consults with the Academy of Motion Picture Arts and Sciences and the American Cinematheque on retrospectives and conservation. Richard Winn Taylor II, VES, is a VES Fellow and has served as a member of the Board of Directors. He has an extensive background in live-action direction, animation, production design, special effects and computer-generated images for theatrical films, television, games, themed entertainment venues and VR, and his commercial work has garnered numerous awards. Taylor supervised the special effects and storyboarding of the visual effects sequences on Star Trek: The Motion Picture and designed and supervised the building of the Starship Enterprise, and he directed the innovative special effects for TRON, one of the VES 70: The Most Influential Visual Effects Films of All Time.



[ FINAL FRAME ]

Dune – Then and Now

This year sees a new adaptation of Dune, based on the 1965 science fiction novel by Frank Herbert. The first film version, budgeted at $40 million, debuted in 1984 and was written and directed by David Lynch. Featuring Kyle MacLachlan, Patrick Stewart, Dean Stockwell, Max von Sydow, Virginia Madsen and others, the plot follows rival elite families in the far future vying for control of the harsh planet Arrakis (aka Dune). Reportedly, several hundred workers spent months hand-clearing three square miles of Mexican desert for the shoot; other scenes were filmed at studios in Mexico City. It was lensed by legendary cinematographer Freddie Francis. The production crew numbered 1,700, with 80 sets on 16 sound stages, and some special effects scenes required more than a million watts of lighting and 11,000 amps. That film did not fare well with critics or at the box office, but has gained a cult following over the years.

The new Dune, budgeted at $165 million, is helmed by Denis Villeneuve, fresh off his Blade Runner 2049 success (see article, page 20). It stars Timothée Chalamet and Rebecca Ferguson. Visual effects were handled by DNEG, Clear Angle Studios, FBFX, Lidar Guys, Territory Studio and Rodeo FX. Shot on large-format cameras with Panavision lenses, much of the location work took place in Norway and the Middle East. The new Dune is a vivid illustration of how far visual effects have traveled in 37 years – light years from the original.

Images courtesy of Universal Pictures (1984) and Warner Bros. Pictures (2021).




