VFX Voice Summer 2022

VFXVOICE.COM SUMMER 2022

THE LEGEND OF LIGHTYEAR

20TH ANNUAL VES AWARDS & WINNERS • VFX EMMY CONTENDERS • HIGH-END EPISODICS • ANIMATION RENAISSANCE • HALO • PROFILES: KRISTEN PRAHL & SHEILA WICKENS


[ EXECUTIVE NOTE ]

Welcome to the June 2022 issue of VFX Voice! This issue shines a light on the milestone 20th Annual VES Awards and our return to the in-person gala celebration. Congratulations again to all of our outstanding nominees, winners and honorees! To infinity and beyond – our cover story goes inside the new film starring animated action hero Buzz Lightyear. We also dive into TV/streaming, including the evolution of military sci-fi game Halo to the big screen, the rise of episodic TV and a showcase of this year’s potential Emmy nominees. We explore trends in the explosive growth of VR/AR and the new golden age of animation, look at virtual production in commercials and gain insights from a stellar VFX producers roundtable. We sit down with Star Trek: Discovery VFX Producer Kristen Prahl and MPC VFX Development Supervisor Sheila Wickens. As always, we serve up tech and tools, this time focused on previs and production assets. We spotlight the VES Bay Area Section and share lessons learned from our award-winning VES Handbook of Visual Effects. We also share reflections from this year’s special honorees, acclaimed filmmaker Guillermo del Toro and Lucasfilm Executive Vice President & General Manager Lynwen Brennan. Now turn the page and meet the visionaries and risk-takers who push the boundaries of what’s possible in the field of visual effects. And please continue to catch exclusive stories between issues only available at VFXVoice.com. You can also get VFX Voice and VES updates by following us on Twitter at @VFXSociety. Cheers!

Lisa Cooke, Chair, VES Board of Directors

Eric Roth, VES Executive Director


[ CONTENTS ]

FEATURES
8 THE VFX EMMY: TV/STREAMING SHOWCASE – Potential nominees reflect how far TV production quality has come.
16 TV/STREAMING: EPISODIC VFX – Episodics ascend to memorable longform cinematic experiences.
26 VFX TRENDS: VFX PRODUCERS ROUNDTABLE – VFX producers discuss the biggest challenges they currently face.
32 ANIMATION: ANIMATION RENAISSANCE – Animated projects are pushing the envelope in content and style.
38 PROFILE: KRISTEN PRAHL – Emmy winner’s star rises as VFX Producer of Star Trek: Discovery.
44 ANIMATION: LIGHTYEAR – Capturing Buzz while making a classic, contemporary sci-fi movie.
50 THE 20TH ANNUAL VES AWARDS – Celebrating the very best in visual effects.
60 VES AWARD WINNERS – Photo Gallery
66 PROFILE: SHEILA WICKENS – MPC VFX Development Supervisor delivers filmmakers’ visions.
70 TV/STREAMING: HALO – Bringing the game to film offered a multitude of VFX challenges.
76 VR/AR/MR TRENDS: EXPANDING MARKETS – New headsets, games, gear and apps are driving VR/AR growth.
80 COMMERCIALS: VIRTUAL PRODUCTION – Virtual production is rapidly gaining traction in commercials.
84 TECH & TOOLS: PREVIS – Previs studios extend their efforts in real-time with custom tools.
88 TECH & TOOLS: ASSETS – Industry pros break down how they made their most complex assets.

DEPARTMENTS
2 EXECUTIVE NOTE
92 VES SECTION SPOTLIGHT: BAY AREA
94 THE VES HANDBOOK
96 FINAL FRAME – EPISODICS

ON THE COVER: Buzz Lightyear, star of Lightyear. (Image courtesy of Disney/Pixar)


SUMMER 2022 • VOL. 6, NO. 3

WINNER OF THREE FOLIO AWARDS FOR PUBLISHING EXCELLENCE.

VFXVOICE

Visit us online at vfxvoice.com

PUBLISHER: Jim McCullaugh, publisher@vfxvoice.com
EDITOR: Ed Ochs, editor@vfxvoice.com
CREATIVE: Alpanian Design Group, alan@alpanian.com
ADVERTISING: Arlene Hansen, Arlene-VFX@outlook.com
SUPERVISOR: Nancy Ward
CONTRIBUTING WRITERS: Ian Failes, Naomi Goldman, Trevor Hogg, Kevin H. Martin, Chris McGowan
ADVISORY COMMITTEE: David Bloom, Andrew Bly, Rob Bredow, Mike Chambers, VES, Lisa Cooke, Neil Corbould, VES, Irena Cronin, Paul Debevec, VES, Debbie Denise, Karen Dufilho, Paul Franklin, David Johnson, VES, Jim Morris, VES, Dennis Muren, ASC, VES, Sam Nicholson, ASC, Lori H. Schwartz, Eric Roth

VISUAL EFFECTS SOCIETY
Eric Roth, Executive Director

VES BOARD OF DIRECTORS
OFFICERS: Lisa Cooke, Chair; Emma Clifton Perry, 1st Vice Chair; Susan O’Neal, 2nd Vice Chair; Rita Cahill, Secretary; Laurie Blavin, Treasurer
DIRECTORS: Jan Adamczyk, Neishaw Ali, Nicolas Casanova, Mike Chambers, VES, Bob Coleman, Dayne Cowan, Michael Fink, VES, Gavin Graham, Dennis Hoffman, Thomas Knop, Kim Lavery, VES, Brooke Lyndon-Stanford, Josselin Mahot, Arnon Manor, Andres Martinez, Tim McGovern, Karen Murphy, Janet Muswell Hamilton, VES, Maggie Oh, Jim Rygiel, Richard Winn Taylor II, VES, David Valentin, Bill Villarreal, Joe Weidenbach, Susan Zwerman, VES
ALTERNATES: Colin Campbell, Himanshu Gandhi, Johnny Han, Adam Howard, Robin Prybil, Dane Smith, Philipp Wolf

Visual Effects Society, 5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411
Phone: (818) 981-7861
vesglobal.org

VES STAFF: Nancy Ward, Program & Development Director; Jim Sullivan, Director of Operations; Ben Schneider, Director of Membership Services; Jeff Casper, Manager of Media & Graphics; Colleen Kelly, Office Manager; Shannon Cassidy, Global Coordinator; Shannon Carmona, Administrative Assistant; P.J. Schumacher, Controller; Naomi Goldman, Public Relations

Tom Atkin, Founder
Allen Battino, VES Logo Design

Follow us on social media

VFX Voice is published quarterly by the Visual Effects Society. Subscriptions: U.S. $50; Canada/Mexico $60; all other countries $70 a year. See vfxvoice.com. Advertising: Rate card upon request from publisher@vfxvoice.com or VFXVoiceAds@gmail.com. Comments: Write us at comments@vfxvoice.com. Postmaster: Send address change to VES, 5805 Sepulveda Blvd., Suite 620, Sherman Oaks, CA 91411. Copyright © 2022 The Visual Effects Society. Printed in the U.S.A.


EMMY CONTENDERS SHOWCASE HOW MUCH VISUAL EFFECTS HAVE EXPANDED THE VISION IN TELEVISION By TREVOR HOGG

TOP: South Korea-based Gulliver Studios created the dramatic effects for the surprise Netflix hit series Squid Game, which returns for a second season in 2024. (Image courtesy of Netflix) OPPOSITE TOP TO BOTTOM: Part of the futuristic appeal of Star Trek: Discovery is the amount of attention and detail put into creating believable UI. (Image courtesy of Paramount+) One question is whether The Book of Boba Fett can carry on the Emmy-winning ways of The Mandalorian. (Image courtesy of Lucasfilm) Nostalgia reigns supreme as Ewan McGregor and Hayden Christensen reprise their roles from the Star Wars prequels for Obi-Wan Kenobi. (Image courtesy of Lucasfilm)

With The Mandalorian taking a breather after winning two consecutive Emmy Awards for Outstanding Special Visual Effects in a Season or a Movie, it will be up to The Book of Boba Fett to continue the winning streak, as the iconic bounty-hunter-turned-crime-lord series has been described as The Mandalorian 2.5. Whether Boba Fett receives a nomination, or more, will be revealed when the Primetime Emmy Awards takes center stage on September 12, 2022 at the Microsoft Theater in Los Angeles. The other category is Outstanding Special Visual Effects in a Single Episode, which last year was awarded to Star Trek: Discovery, another contender from a storied science fiction franchise that will be trying to repeat the feat. An interesting bellwether is the 2022 Visual Effects Society Award nominations, which place Loki and Foundation at the forefront, with both being singled out for stunning environment work: Lamentis on Loki and Trantor on Foundation. “We were asked to create meteor effects from scratch,” states Digital Domain Visual Effects Supervisor Jean-Luc Dinsdale when discussing Lamentis and its moon Lamentis-1. “We went through multiple versions of providing the meteors, the impacts, and the dust and debris that flies around them. That was then tweaked and populated throughout the episode because the meteors are a constant threat, but not always the focus of the sequence.” Trantor is literally 50 different cities stacked on top of each other. “Every level was built hundreds of years before the next one, so there was a lot of concepting and architectural research

that went into how Trantor and its multilevel structure was designed,” explains DNEG Visual Effects Supervisor Chris Keller. “We created all of these interstitial elements between buildings, like bridges, platforms, megastructures spanning 1,000 meters, through the sky ceiling of a certain level into the next level. Then you’ll see hyperloop trains and, if you look carefully, flying vehicles. All of that had a certain logic.” When it comes to photorealistic CG characters, Leshy-infected Eskel and Nivellen from The Witcher and Ampersand from Y: The Last Man were also nominated for VES Awards. “There has been real growth on the monster side,” explains Andrew Laws, Production Designer for The Witcher. “We work in ZBrush from the ground up to understand the movement and how the creature is going to take shape in all dimensions. It’s a much more fluid process. Once we have established a ZBrush model that has an organic shape, we’ll do some overpainting to get the mood of the creature. When it is agreed upon how that is going to work, then the 3D model goes out to visual effects and the vendors to bring in the detail and movement.” Originally, Ampersand was going to be real but was changed to CG because Disney has a ‘no primate’ rule. “Stephen Pugh and Jesse Kawzenuk, our amazing visual effects supervisors, made it so easy for me,” recalls cinematographer Catherine Lutes. “I was constantly laughing at the puppet Amp that we had. It helped with the way that the light was falling, and that’s a good reference as well for visual effects. Stephen said that the camera shouldn’t do things that


TOP TO BOTTOM: Oscar Isaac becomes a part of the MCU for the first time, along with Ethan Hawke, in the Disney+ series Moon Knight. (Image courtesy of Disney) Raised by Wolves is seen as a better exploration of an Alien-inspired universe than the prequels directed by Ridley Scott. (Image courtesy of HBO) There is no shortage of monsters to be found in The Witcher, such as a powerful vampire known as a Bruxa. (Image courtesy of Netflix)

a monkey wouldn’t do. If the camera is a little bit stilted or doesn’t move smoothly, that’s great because that’s what would happen if you were trying to follow an actual monkey running or moving.” The Wheel of Time features a wide gamut of visual effects from creatures, magic and world-building done in a grounded fashion. “One thing that was important for me from the beginning was that this world feel authentic and real,” explains The Wheel of Time creator, executive producer and showrunner Rafe Judkins, “even for the actors and crew, trying to go to places, as much as we can put stuff in-camera, even if we end up augmenting or enhancing it later with visual effects.” The fact that the sixth season is the grand finale for The Expanse may see Emmy voters finally honor the body of work with a nomination. “The most challenging thing is wrapping your head around things that may not sound that difficult initially, like deorbiting maneuvers where you slow going forward to be able to drop,” notes Bret Culp, Senior Visual Effects Supervisor of The Expanse. “We’ve done a good job and, as a result, it has been made clear to us that we are favorites with a lot of people at NASA and have an open invitation to visit the JPL [Jet Propulsion Laboratory].” The usual suspects include Lost in Space, which has been rightly lauded for being able to turn practical locations into alien worlds and making biomechanical robotic beings that are empathetic and menacing. “The most challenging visual effects sequence in the finale of Lost in Space was creating the horde of killer alien robots and sprawling wreckage of their crashed ship,” remarks Lost in Space Visual Effects Supervisor Jabbar Raisani. “The entire episode had to be filmed on stage, and we decided to shoot against black. As both the director of the episode and the VFX Supervisor, I relied heavily on shot planning with our in-house previs team which maintained close collaboration with the production designer to maximize our efforts and bring the series to its epic conclusion.”



TOP TO BOTTOM: For those looking for major robot battles, Season 3 of Lost in Space will not disappoint. (Image courtesy of Netflix) The Battle of New York scene from 2012’s The Avengers was used as a flashback in Hawkeye, which was released as a limited series in 2021. (Image courtesy of Disney) A welcome return to the world created by Gene Roddenberry is Patrick Stewart reprising his signature role of Jean-Luc Picard in Star Trek: Picard. (Image courtesy of Paramount+)

Returning for sophomore seasons are Star Trek: Picard and Raised by Wolves, with the former mining the fan adoration for the Starfleet officer portrayed by Patrick Stewart and the latter infusing Alien mythology into the android survival tale produced by legendary filmmaker Ridley Scott. “The hardest sequence to design, create and execute for Raised by Wolves was the outer space sequence between Mother and the Necro serpent in Episode 208,” reveals Raised by Wolves Visual Effects Supervisor Raymond McIntyre Jr. “The flying Necro serpent is lured away from killing Campion by Mother, who leads the serpent into outer space in order to attempt to kill it. This scene was added deep into postproduction, and visual effects was tasked with designing an entire sequence from scratch as no live-action footage existed. Visual effects designs included the flying serpent, lighting design in outer space, nebulas, the planet Kepler 22B seen from this viewpoint, Mother’s new kill scream and a visualization of the failure of the EMF dome protecting this area of the planet. Execution involved creating realistic camera motion for each shot, and beauty lighting with sun flares, allowing for dirt on the lens to show up during flares, all while rendering fully CG shots.” Making their debuts are Obi-Wan Kenobi, which has Ewan McGregor reprising his role as the legendary Jedi Master from the Star Wars prequel trilogy, and Star Trek: Strange New Worlds, an exploration of life on the USS Enterprise under the command of Captain Christopher Pike; both of them serve as prologues to the original movie and television series and have the best chances to get nominations for their respective franchises, especially if a proper balance is struck between nostalgia and canon expansion. Then there is the matter of art imitating life that will resonate with some while being too close to the bone for others, where the viral mayhem portrayed is even more devastating and required extensive invisible effects to paint out modern-day life. In Sweet Tooth, a pandemic causes hybrid babies that are part human and animal, with the adolescent protagonist being half deer, while Station Eleven focuses on humanity trying to rebuild society after a virus has decimated the population, and See envisions a future where blindness has reached epidemic proportions. Serving as dark social commentary on the growing financial divide is Squid Game, which combines elements of The Most Dangerous Game, childhood games and Dickensian debt into a ratings sensation for Netflix, and is a strong contender to upset the voting establishment. “The game spaces in Squid Game were unique and something we had never experienced before,” states


TOP TO BOTTOM: A favorite to win at the Emmys is Foundation, which features stellar environments throughout the Apple TV+ series. (Image courtesy of Apple TV+) A planet gets destroyed amongst the purple haze in the Disney+ series Loki. (Image courtesy of Marvel Studios) A surreal situation for the cast and crew of Station Eleven was shooting a story about a pandemic during one. (Image courtesy of HBO)

Cheong Jai-hoon, Visual Effects Supervisor of Squid Game. “What we wanted to achieve from the settings of Squid Game was a fabricated yet realistic look, and it was quite challenging to balance the two conflicting characteristics. Especially in Episode 107, characters play the game of Glass Stepping Stones from high above the ground, and we had to create an environment that would immerse the viewers in the fear and tension. We put the most effort into deciding the depth from the stepping stones to the ground and the overall scale of the whole setting. We could have easily exaggerated, but we strived to find the right balance between what seemed fake and realistic, as it was more difficult than we thought.” Also present is the author outdone only by the Bard himself when it comes to the number of film and television adaptations of his works. Lisey’s Story, conceived by prolific horror maestro Stephen King, has supernatural unrest intersecting with personal trauma. Comic book adaptations are not in short supply. A superhero who has a sharp wit and archery skills is paired with a like-minded protégé in Hawkeye, which channels Shane Black’s penchant for Christmas, action sequences and odd-ball comedic pairings. For those wanting an irreverent take on the genre, James Gunn helms the small-screen adaptation of Peacemaker, where an extremist murderer embarks on a quest for peace. Moon Knight introduces the Marvel Studios equivalent of Batman, but with an Egyptian god reincarnation twist that raises questions about the sanity of the main character. Superman & Lois reimagines The Daily Planet colleagues as a married couple trying to balance domestic life and a rogues’ gallery of high-flying adversaries. “If Superman is fighting someone in the air where they would both be horizontal, it was much more time efficient and easier on the actors if they can be vertical,” states cinematographer Stephen Maier, who added a physical camera shake for the sake of realism. “The stunt team will often go away to design or rehearse something, do their previs that they film on their iPhones, cut it together and show it to us. We have a close collaboration with special effects in regards to atmospheric smoke and haze. The gags that they come up with help to exemplify the strength of Superman, such as him lifting a car.” Considering the growing demand for content and the acceptance of visual effects as a primary work tool, the potential nominees reflect how far the production quality of television and streaming shows has come in being able to expand the scope for creatives with a theatrical sensibility. It is because of this that the Primetime Emmy Awards has become as fascinating to watch as the Academy Awards, as both showcase the very best of what can be achieved when talented digital artists get to contribute to the storytelling. Undoubtedly, the eventual winner will encapsulate the highest level of creative and technical ingenuity achievable under current circumstances and will serve as a building block for what is to follow.

Animal/human hybrids populate the world of Sweet Tooth because of a deadly virus. (Image courtesy of Netflix)


TV/STREAMING

THE RISE AND FUTURE OF HIGH-END EPISODIC VFX By TREVOR HOGG

TOP: MPC Episodic created a post-apocalyptic environment for The Witcher. (Image courtesy of MPC and Netflix) OPPOSITE TOP TO BOTTOM: Image Engine, which contributed to The Mandalorian, was originally seen as a television visual effects studio, making it difficult to garner film work, but that paradigm no longer exists. (Image courtesy of Image Engine and Lucasfilm) ILM had fun dealing with the Loki variants, including an alligator, for the Marvel Studios and Disney+ series Loki. (Image courtesy of ILM and Marvel Studios) Serving as a bridge between Seasons 2 and 3 of The Mandalorian is The Book of Boba Fett. (Image courtesy of ILM and Disney)

When it comes to witnessing what is achievable with visual effects, no longer does one have to go to a theater, as high-end episodic has essentially become a long-form cinematic experience that can be enjoyed by turning on a television or mobile device. This is not going to change with streamers spending billions of dollars to create content to stand apart from their like-minded competitors. The result is an impressive array of shows that are not lacking in storytelling ambition, whether it be The Wheel of Time, The Witcher, Foundation or The Book of Boba Fett. Virtual production has become synonymous with The Mandalorian, but this innovative methodology is only an aspect of the visual effects landscape which continues to evolve technologically. What does the future look like for the visual effects industry and episodic productions in the pandemic and post-pandemic era? This is a question that we try to answer by consulting the players responsible for producing the wealth of content that is available for viewers to watch. Robin Hackl, Visual Effects Supervisor & Co-founder, Image Engine “The requirements of television work are identical to feature film work in many ways. But back then there was much less resolution involved with the final output. Interestingly, we became known as a television visual effects house, and that precluded us from actually doing feature film work. It came with a stigma back in those days and was a large barrier that we had to break through. District 9 was a tipping point of recognition of us being able to execute on large-scale work. “Shawn Walsh [General Manager and Executive Producer, Image Engine] has done a good job of holding the line. Placing the value on what we deliver to the client and making them understand what that value is and why it is of value. The shortened timelines

have been the long-term progression ever since I could remember. Coupled on top of that are the demands. Now the expectations are far greater than what they were. Where is that breaking point? It is up to us to hold the line as best as we can and inform our clients what our capabilities and capacities are in order to avoid that.” Drew Jones, Chief Business Development Officer, Cinesite “You’re fine-tuning the teams of people attached to particular projects, ensuring that you have the right personalities dealing with the right style of work so that you can shortcut the processes and still deliver the quality threshold that it needs to be. You haven’t got the luxury of time to develop an idea across many months for the most part. “Vendors having concept artists and art departments in-house are definitely of use for a quick, more cost-effective process to get closer to an answer within the visual effects post-production environment. We will often use conceptual artists to build imagery quickly to present an option to a production rather than go through a long gestation period of a CG build and compositing to get an idea across. “There is more exploration into ideas through streamers. The projects, scripts and series are often filled with quite fantastical ideas that may have never seen the light of day on the big screen. The content I don’t think has changed. I don’t feel like we’re


doing anything outrageously different. All visual effects have a complexity component to them, and at the end of the day it comes down to how far the directors want to push their thoughts and ideas.”

TOP: Final graded image by DNEG that was shot against greenscreen for Star Trek: Discovery. (Image courtesy of DNEG and Paramount+) BOTTOM: A massive tarantula was created by Image Engine for the reimagining of The Twilight Zone. (Image courtesy of Image Engine and Paramount+)

Janet Muswell Hamilton, Senior Vice President, Visual Effects, HBO “Right now, you have a lot of executives and post executives who have been doing a good job of producing the visual effects, who needed additional help because their slates were busy. The industry has just exploded. Being able to take work off of their plates to help them find heads of departments, facilities, and getting their heads around budgets –- that was the first thing I did. But what I needed was processes in place in order to make it easier for me so I wasn’t working in 10 different ways, because HBO has been a bespoke studio. I am a fan of tools and processes that help us with the creative process. “House of the Dragon is utilizing the LED screens at Warner Bros. Studios Leavesden for a whole bunch of sequences. It’s 360 [degrees] and has a retractable roof. The ability to shoot magic hour for about a week is incredible. Yes, you need to do it upfront. Yes, you need a director who is willing to go that way. Ultimately, when you start to see the results, how beautiful things look and the stories you can tell that you couldn’t tell before because you

couldn’t go there or afford it – it’s going to revolutionize how we do things. It’s a technology that is here to stay. It was an unexpected benefit of the lockdown. My biggest desire is to never ever shoot another greenscreen driving shot!” Alex Hope, Co-CEO, beloFX “The biggest barrier to growth for many visual effects companies for many years has been finding talented artists. There is a finite global talent pool, but it is one we are all working hard to build. We’re all making huge efforts to train and develop visual effects talent at every stage whether that’s in college, entry level into the industry or once people have gotten into the industry. Many visual effects companies are getting behind career development for artists. In the U.K. we are helped by organizations like ScreenSkills, who standardize training at various levels, and to ensure that the industry is working to support the education sector to bring new talent into the industry. “Visual effects is perhaps the fastest-growing component of the film and television industry. It’s fantastic that we’ve seen an explosion in content creation of all types, and we’ve seen a consequent growth in demand for visual effects, so certainly the money spent on content creation is coming through to all parts of the industry, including visual effects. As we see more localized production for streamers, it’s going to be really interesting to see what opportunities this provides for partnerships between local visual effects companies and those companies in more established centers, like the U.K. and Canada, and that’s very exciting and interesting to us at beloFX.” Lucy Ainsworth-Taylor, CEO & Co-Founder, BlueBolt “With the global demand and need to get shows finished, work is being spread everywhere, often disregarding the rebate. We still cannot compete with the Indian prices, but the flip side is that the

TOP TO BOTTOM: Vision starts to disintegrate courtesy of Digital Domain for the Marvel Studios and Disney+ series WandaVision. (Image courtesy of Digital Domain and Disney) Bubbles were digitally created by Milk VFX for the outer space suffocation scene in Intergalactic. (Image courtesy of Milk VFX and Sky One) The robot battle scene in Season 3 of Lost in Space. (Image courtesy of Digital Domain and Netflix)


talent coming out of India now adds to the international remote marketplace. Netflix purchasing Scanline VFX is not a gamechanger at all. Studios have purchased facilities before, and as long as they can keep feeding the work into the facility, it will work. With the content Netflix is making at the moment, it makes sense, but I would assume they should probably buy many more facilities for the amount of work they require! Scanline is a well-respected visual effects house; does this mean they will now only work on Netflix shows?”

TOP: An actual helicopter was used to create the impression of a spacecraft landing on the water in a scene from Foundation. (Images courtesy of Important Looking Pirates and Apple TV+) BOTTOM: The big screen gets adapted for the small screen with Image Engine bringing the world of Snowpiercer to life. (Image courtesy of Image Engine and TNT) OPPOSITE TOP: Rising Sun Pictures was part of the visual effects team on the live-action remake of Cowboy Bebop for Netflix. (Image courtesy of Rising Sun Pictures and Netflix)

John Fragomeni, Global President, Digital Domain “Working on award-winning projects like WandaVision, Lost in Space, Carnival Row and Loki was basically like making six to eight mini-films. We use the same tools on episodics that we use on features, and often the same team of artists. That has helped to accelerate our development on some of the tools we use, giving us the ability to handle the volume of work while still delivering quality. “Some builds tend to lend themselves to features. For instance, the Free City game world we made for Free Guy or the 2.5 miles of New York City that we recreated for Spider-Man: No Way Home. But that doesn’t mean that you couldn’t do that for an episodic, given enough time and budget. “One thing we are seeing more and more of on the episodic side is that the productions are coming to us with a detailed vision of what they want for the entire season. This helps us forecast schedules more finitely and identify breaking points when it comes to

tight deadlines. From that, we can determine with the production where we can best serve the visuals, then coordinate with any other visual effects vendors the production may bring in. One of the more interesting by-products of the rise of elevated quality effects in episodics is that studios that used to compete for the same projects are now partners. As the demands for effects grow, we’ll probably see more groups involved.” Michelle Martin, Chief Business Development Officer, Milk VFX “The speed with which streaming content has grown globally has given VFX houses the opportunity to raise the bar in VFX, to create high-end content for tentpole series and feature-length projects, in turn giving a wider range of artists the opportunity to work on interesting projects. Standards have certainly been raised. “The networks and studios are engaging us earlier and are keen to discuss capacity with us, as well as share more information regarding their up-and-coming slates. There’s a keenness to share information and artists are being block-booked ahead of productions starting, which is where we should be to help develop and visualize the storytelling. We are seeing a very different landscape to where we were 10 years ago.” Christopher Gray, Global Managing Director, Episodic, MPC “In the short term, we’re already seeing the wider application of these techniques [virtual production, real-time, machine learning]. Every show we are working on in episodic employs at

least one of these toolsets in some capacity, but I think the greatest opportunity, as silicon begins to catch up, is the ability to iterate more quickly, particularly in animation. In the next five years, we’ll see more widespread adoption of real-time and near real-time GPU rendering for final pixel. The technology is close, but it’s the development of existing workflows and the continued widening of the knowledge base that needs to expand to capitalize on this moment. “We’re seeing the rise of great new prospects for counter-programming, film and episodic projects that would struggle to find an audience five years ago, and we’re doing so more and more now thanks to strategic work in this space by Amazon and Apple leading the charge, and Netflix particularly so, with its commitment to international and local-language film and series and limited theatrical releasing. The exciting aspect is that as studio operations become more integrated and these two mediums converge, film production can benefit greatly from these efficiency gains, and episodic production can benefit from a knowledge base carved at the highest level.” David Conley, Executive VFX Producer, Weta FX “The bidding process has changed due to sheer demand, and streamers have a completely different greenlight process than the traditional studio system. Where once we had the luxury of bidding over several weeks against a schedule that was fairly developed, and you could bid down to the crew weeks, we are now being asked to turn around more bids in less time, days even. To



TOP: Fantasy has become a prominent genre on the streaming services, with DNEG taking part in Shadow & Bone for Netflix. (Image courtesy of DNEG and Netflix). BOTTOM: Making use of extensive virtual production is the HBO prequel House of the Dragon, which stars Emma D’Arcy as Princess Rhaenyra Targaryen and Matt Smith as Prince Daemon Targaryen. (Image courtesy of HBO)

drive confidence in our bidding system, we’re relying on a more robust set of analytics to help drive the bidding process. That said, this means we really rely on the perception and skills of our bidding team because no project is similar, and analytics and performance metrics can never replicate the creative process. We rely on our bidding team to have great creative skills when reading and breaking down a script before applying metrics based on analytics. “I would say there’s greater narrative risks being taken in episodic, but more technical and creative risks in film. That said, the bar is raised across the board for both episodic and theatrical. There’s more being spent in episodic [than previously] but at a lower price point per shot, with the expectation that results are feature-level quality. To date, features are still where, as an industry, we are being asked to produce groundbreaking visual effects. However, streaming services mean that mid-range projects now have direct access to wide audiences, so our challenge becomes leveraging our emerging technologies from the feature side to help produce high-end-caliber visual effects across multiple episodes, within the schedule demands of episodic, at a viable price point that works for our industry.” Måns Björklund, Executive Producer, Important Looking Pirates “Virtual production, real-time and machine learning are becoming more common, or even standards nowadays. With moving more work into prep instead of post, visual effects becomes even more involved before anything has been shot. For sure, there is more work around than ever. However, finding artists is harder than ever, and costs have also risen. We spend a lot of time finding artists and developing them in-house. Instead of just having a few clients doing high-end work, nowadays the demand is almost doing 10 mid-to-high-end features per season of episodic work. Due to shorter turnarounds, there is more of a ‘going direct to the goal instead of trying all possible versions.’ The room for experimentation depends mostly on when you get involved with a project and how long a schedule you have. I feel there is more room to rebid and not have ‘fixed’ bids. It’s a constant discussion and collaboration with clients to get the best results within the budget and time. Important Looking Pirates don’t have facilities around the world. We are trying to do our thing and focus on the quality of our work.”



machine learning, and how those things will eventually come to be utilized within our industry. There’s definitely an interest there and a buzz around it, and we’ve seen clients wanting to understand how it can be utilized sensibly without doing it just for its own sake, using the technology for a real creative impact. “The area that’s getting the most attention at the moment, it seems, is facial replacement work, with articles and papers going in-depth about how AI and computational analysis are making those kinds of computer-generated content far more photographic than before. It’s definitely an area that could lead to some very different approaches as to how visual effects are fundamentally implemented.” Stefan Drury, Executive Producer, ILM TV “It’s been the busiest I’ve seen the industry in the 24 years I’ve been in it! More importantly, it’s also consistent, with a steady stream of projects in development and post at almost all times. It certainly feels like we’ve been able to better forward a plan for our crew, and the ‘feast-or-famine’ nature of the visual effects industry has somewhat dissipated to allow for visual effects facilities to have a more stable financial footing and thus provide more stability for their employees. “There is still room for experimentation, especially through close collaboration with writers/director/showrunners in the episodic format. We’ve been involved with several episodic shows in which we’ve been part of the development, pitch and greenlight process, which have involved working closely with the clients to find creative methodologies to make projects possible. That said, this experimentation generally has to happen early in the production, agreed upon by all and adhered to, as the waterfall nature of episodic delivery and the sheer volume of material to be reviewed means post schedules leave little room for misdirection.”

TOP TO BOTTOM: What was originally meant to be practical became a CG Ampersand created by ILM for Y: The Last Man. (Image courtesy of Hulu and ILM) Expanding upon the Vikings franchise for Netflix is Vikings: Valhalla, with visual effects produced by MPC Episodic. (Image courtesy of MPC and Netflix) ILM was recruited to produce Nivellen, which was a combination of practical and digital effects, for The Witcher. (Image courtesy of ILM and Netflix)

Meredith Meyer-Nichols, Head of Production, Rising Sun Pictures “In 2017, 100% of RSP’s work was on theatrical releases, and in 2021 we were 50/50 streaming to theatrical. As we move into 2022, this trend continues. Streaming projects are predominately series in nature, at about 35%. From our perspective, on the projects that we’re working on, they have large-scale budgets and demand the same kind of quality that RSP is known for. They’re essentially gigantic movies with thousands of visual effects shots and hours of content. RSP, in partnership with the University of South Australia, are delivering accredited courses in visual effects. In classrooms set up to mirror real-world production environments and with instructors who are working professionals, RSP rigorously trains students in the technologies and techniques they’ll need to succeed in an expanding global film industry. We have done a remarkable job of turning out job-ready graduates and have found that our graduates are in high demand with most major visual effects studios.”


VFX PRODUCERS ROUNDTABLE: ‘MY TOUGHEST CHALLENGE’ By IAN FAILES

TOP: A final shot from Mulan, on which Diana Giorgiutti was Visual Effects Producer. (Image courtesy of Walt Disney Pictures) OPPOSITE TOP TO BOTTOM: A greenscreen stuffy version of the baby elephant on the film Dumbo. Hal Couzens was VFX Producer. (Image courtesy of Walt Disney Pictures) On Blade Runner 2049, Murphy-Mundell VFX-produced effects ranging from holograms to vehicles and digital humans. (Image courtesy of Warner Bros. Pictures) A greenscreen plate of actor Damon Wayans in the TV series Lethal Weapon, on which Mark Spatny was Visual Effects Supervisor. (Image courtesy of Mark Spatny)

It’s been a couple of highly unusual and disrupted years in the visual effects industry. Among the many weathering the storm have been VFX producers, those responsible for managing projects, undertaking and reviewing bids, tracking VFX shot delivery and so many other aspects of the visual effects process. Here, several visual effects producers – some operating for film studios, or as independent contractors, or working at VFX studios, and some who do both VFX supervision and producing – discuss the biggest challenges they’re currently facing. Our roundtable of producers includes: Diana Giorgiutti, currently on Dungeons & Dragons, after having worked on Mulan; Terron Pratt, who recently finished three seasons of Lost in Space before moving to post on Season 4 of Stranger Things; Hal Couzens, in post on Beast, with past credits including F9 and Dumbo; Karen Murphy-Mundell, whose recent films include Blade Runner 2049 and Gemini Man, in post on Black Adam; Mark Spatny, experienced both in VFX supervision (the Lethal Weapon and Station 19 series) and as a VFX producer (currently on The Peripheral); Scott Coulter, a VFX supervisor and producer for independent features, most recently Reagan; Annie Normandin, a VFX producer at Rodeo FX on Jungle Cruise, Shang-Chi and the Legend of the Ten Rings and Season 5 of Better Call Saul; and Anouk L’heureux, Vice President of Production at Rodeo FX, with VFX producing experience at several other VFX studios. AN INTERESTING CHALLENGE: SO MUCH WORK

Diana Giorgiutti: “For me, the explosion of streaming content alongside theatrical releases has now created a situation where there is too much work and not enough crew to cover everything. This in turn leads to a lot of crew being thrown into positions they simply are not really experienced or qualified to do. The other key and equally important factor is that there are not enough VFX facilities to easily do all the VFX work across all the varying release timelines. You really have to be on your game to make sure you are doing deals well ahead to guarantee VFX capacity. “For my current project, we awarded the work to our vendors

well ahead of shooting, which is something I had not done before. And we awarded with only script pages as reference. There were little to no visuals at this point, so the award bids were very early and based loosely on words off the page. In turn, this has led to a lot of changes, from the award to actual shot turnovers. A lot changes from the script through prep as things are fleshed out leading up to shooting, and then shooting itself. Not to mention the many adjustments that happen getting a film greenlit.” Mark Spatny: “Hands down, the biggest problem for VFX producers right now is the global glut of work filling up every VFX facility in the world, thanks to the explosion of high-production-value streaming shows piling in on top of the normal big tentpole features. Prior to March 2020, VFX vendors were constantly knocking on doors looking for work and undercutting each other to get it. In today’s climate, I’ve had AAA multi-national vendors and small boutiques alike turn down $2 million of work with a relatively easy eight-month schedule, simply because there aren’t enough artists and in-house supervisors to get the work done. Even simple roto and paint work that previously would have been completed in a week by an outsource vendor has to be planned months in advance. And prices have soared accordingly.” Terron Pratt: “Most recently, one of the biggest challenges we’re facing is limited artist resources. With so much content being created right now, vendors all over the world are booked for months, even years out. On Season 3 of Lost in Space, thankfully, we were one of the first shows to get back to production, which gave us a slight edge for booking talent when it came time for post. We could see it coming and built a season-wide plan for


distributing the work as early as we could. Even with that, we still felt the pinch toward the back half of post, forcing us to spread the work a bit more than was originally planned.” PANDEMIC VFX, THE RISE AND RISE OF WFH, AND NEW TECH

TOP: A final visual effects shot by Rodeo FX for Season 2 of The Witcher. (Image courtesy of Netflix) BOTTOM: A final visual effects shot on Season 3 of Lost in Space. (Image courtesy of Netflix)

Scott Coulter: “Dealing with COVID has been the biggest challenge for me, namely staffing. I have had to rethink every aspect of working in film. From set work to post, we really have taken the idea of remote work to heart. The primary lesson for me is that specialization is not as valuable as it was before. Now you have to wear many hats simply because there are fewer people on set.” Annie Normandin: “Now that remote work is becoming a new reality, we’ve had to rethink the ways we approach projects, and stay together and cohesive as a team while continuing to offer the clients new possibilities. I mean, let’s take Shang-Chi, for example. When we were tasked to do the scaffolding fight scene in Macau, the client had planned for it to be a matter of sending a team on location, getting footage, and then we would composite it in the background. But with the travel bans, the production had to change the approach, and they asked us to digitally recreate the entire city. This was a completely new process and required a lot of collaboration with the client to get the best possible outcome. Adaptation is key.” Anouk L’heureux: “I would add that establishing a good partnership with the different clients is key to the success of our shows. To listen to their needs, manage expectations and work

with them to find creative solutions. I also think that today one of the other main challenges is how to keep that sense of community and togetherness as a team when everybody is alone at home. As a producer, we also have to deal with the human aspects of our work.” Hal Couzens: “During the pandemic, a lot of films went to tell stories in locations requiring few, if any, extras. My most recent [project] had us in South Africa on the borders of Zimbabwe and Namibia in a tented camp. Naturally, the pressure of this landed mainly on the unit and transport departments. However, being so remote there was almost no cellphone connectivity or internet, and base camp and set were far apart. As a unit, we had 40Mbps for the entire operation. No, not 40Mbps per person or department. What internet there was, we needed for production to run the operation, including getting the rushes back to the editors in the U.K. “It was back to the days of runners delivering messages as even sending WhatsApp messages was a significant challenge, let alone running a previs operation back in the U.K. Given our distance and the need to remain extremely tight during a pandemic, we worked without an actual on-set server for the first seven weeks of a data-intensive shoot. This created a number of expected and unexpected issues. One doesn’t appreciate multiple users operating on the same system together until one can’t! Herculean efforts in data management with rather long hours from our coordinators got us through. Not an experiment I intend to repeat soon!” Karen Murphy-Mundell: “Technology-wise, for me, the latest


TOP: Scott Coulter was Visual Effects Producer on this Layton’s “Mystery Journey” commercial featuring a CG hamster. (Image courtesy of Scott Coulter) BOTTOM: A scene from Black Sails. Terron Pratt worked on the show as Visual Effects Producer. (Image courtesy of Starz)


challenge is weighing the pros and cons of the latest LED walls and on-set virtual camera system technologies. We have to determine the benefits in quality of the final shots as well as the cost of using the technology compared to old school/traditional methods. Putting up an LED wall right now is an expensive venture. There is real pressure in determining what you can save in post and quantifying the value of being able to provide temp shots quicker and screen a more complete film earlier.” WHAT YOU MIGHT NOT KNOW ABOUT VFX PRODUCING

TOP: Gray and chrome balls and a Macbeth chart are captured on the set of Season 3 of Lost in Space. (Image courtesy of Netflix) BOTTOM: The entire Macau surrounds were synthetic in this Rodeo FX sequence from Shang-Chi and the Legend of the Ten Rings. (Image courtesy of Walt Disney Pictures) OPPOSITE TOP: On the set of Station 19. (Image courtesy of Mark Spatny)

Spatny: “The VFX producer is an equal partner to the VFX supervisor on a project and is every bit as responsible for its success – or failure – as the supervisor. At a facility, many artists just think of producers as the timecard police who pinch pennies on every shot. They aren’t aware that every project is an intricate, constantly moving puzzle that the producer has to solve, with a million variables and moving parts all over the world. It’s frankly embarrassing that one of our major industry awards doesn’t recognize and include the VFX producer. I’m glad the Emmys and the VES Awards are more progressive in that way, and I’m proud to say I had a hand in making that happen for both organizations.” L’heureux: “I think people might see a producer as someone who takes care of business and the admin part. But when working with a VFX supervisor, you need to be as creative as they are so that you can help them, provide the tools they need and understand what the client has in mind. And through that creativity,

a symbiosis between the VFX supervisor and the VFX producer occurs.” Normandin: “The VFX supervisor and the VFX producer are a real duo. We really have to work in sync with a great team around us to bring a sequence to life, so you need to know your partner and trust them.” Coulter: “What people don’t always know about VFX producing is that people come from all sorts of areas. My background originally came from practical effects. Over time, this has proven to be a fantastic training ground for visual effects. Even in today’s productions, I am always offering a practical solution, such as using a translight instead of a greenscreen. On a large project like Automata, I proposed full-scale practical puppets instead of the planned pure CG robots. This saved production countless dollars and provided a superior result.” Murphy-Mundell: “I think that people outside film communities aren’t aware that the job as a VFX producer involves evaluating the daily changes in all departments while in film production and how it affects the VFX plan. Decisions made on set in stunts, art department, costumes, construction and script can have big consequences for VFX costs and methodology many months into post.” Giorgiutti: “As a studio-side VFX producer, most people probably don’t realize that we are on from the very beginning of a film to the very end. A lot of the time the VFX producer starts even ahead of the VFX supervisor. Other than the director and producers, VFX is the only department that is on a film all the way. Bearing this in mind, I often say that my job as a VFX producer is

50% budget and management of filmmakers and our VFX teams, with the other 50% being counseling. This counseling is totally tied into the plethora of politics we have to navigate to keep a good balance with the filmmakers, studios, VFX crew and VFX facilities.” Couzens: “Indeed, often the VFX team and the VFX producer are the longest-running crew members on a production, aside from director, producers and occasionally an accountant. The result of this is that the VFX producer provides a continuity that runs the ‘full length of the counter’ and can thus provide support in making choices, creative and otherwise, to all sides of filmmaker/ studio/editorial/finance and facility equations. Not only does one have a diplomatic role to play, straddling a few fences to ensure the right info is given at the right time and in the right way, one also gets to see all stages of the project from development, prep, shoot, post and wrap, occasionally resulting in a lovely moment – being ignored – on a red carpet.” Pratt: “VFX producing doesn’t start in post. I think it’s important that the VFX team is involved as early as possible, even at the script stage, including the VFX producer. The conversations early on are not solely creative but also involve securing resources [production team, on-set team, previs artists, budgeting and scheduling], understanding which vendors are going to be available when and engaging them early. The job is as much about developing relationships and building a team as it is about hitting a deadline. The only way to hit that deadline is with the right teammates.”


ANIMATION RENAISSANCE FUELED BY TECH ADVANCES, NEW STORYTELLERS AND LOCAL CONTENT By CHRIS McGOWAN

TOP: Pixar’s Turning Red is a coming-of-age fantasy/comedy featuring 3D animation with anime influences, directed by Domee Shi. (Image courtesy of Pixar/Disney) OPPOSITE TOP TO BOTTOM: Twenty-seven years after Toy Story, Pixar used computer animation tools in Lightyear that were “vastly superior in terms of scale and complexity” to what was possible in 1995. (Image courtesy of Pixar/Disney) Emiko (Kylie Kuioka) in Paramount’s Blazing Samurai, animated by Cinesite. (Image courtesy of Paramount Pictures) Matt Groening (The Simpsons) created and co-developed Disenchantment, which was produced by Rough Draft Studios and the ULULU Company. (Images courtesy of Netflix)

Animated features and series of all types and stripes are launching this year, thanks to growing global demand, an anime gold rush and the impact of the streamers. The diverse array of titles includes both high-grossing sequels of franchises that are mostly for children and fare for teens and/or adults that push the animation envelope in terms of content and style.

Animated movies have already established themselves as a significant part of the movie business, having achieved formidable box office grosses. Four animation titles in the top 25 films released in the past four years have grossed over $1 billion worldwide, led by Disney's Incredibles 2 (2018), The Lion King (2019), Toy Story 4 (2019) and Frozen II (2019), according to Box Office Mojo. More than 50 animated titles have topped the $500 million mark. Animation has also established a strong streaming presence.

"There has been an explosion of animation over the last few years in all areas and styles," says David Prescott, Senior Vice President, Creative Production of DNEG Animation, which co-produced the Disney/20th Century Studios hit Ron's Gone Wrong in 2021 and worked on Paramount's Under the Boardwalk for this year and Alcon/Sony's Garfield project for 2024. Kane Lee, Head of Content for Baobab Studios, comments that "there has been a real sea change in supply and demand for animation content, both in general and across genres." According to Ingrid Johnston, Animal Logic Head of Production, "Right now, we have a great opportunity to see a wide range of animation styles and stories told in animation. Flee [the animated Danish docudrama] being nominated for this year's Oscars is a great example of this. The success of films like [Sony's] Spider-Man: Into the Spider-Verse is showing that audiences are engaged in different styles of animation. We have already seen an increase in the amount of animated content for adults, such as Love, Death + Robots, and filmmakers are seeing animation as a way of



telling more diverse stories." Animal Logic co-produced Sony's Peter Rabbit 2: The Runaway (2021) and is working with Netflix Animation on The Magician's Elephant and Warner on Toto, the latter two due in 2023 and 2024, respectively.

High-profile 2022 titles include Paramount's Blazing Samurai, Universal/Illumination's Minions: The Rise of Gru, Universal/DreamWorks' The Bad Guys and Puss in Boots: The Last Wish, Sony's Spider-Man: Across the Spider-Verse (Part One), Warner's DC League of Super-Pets, Disney's The Ice Age Adventures of Buck Wild and Strange World, Disney/20th Century's The Bob's Burgers Movie and Disney/Pixar's Turning Red and Lightyear. Plus, there are animated series bowing in 2022 that will join several dozen already available. New arrivals include Dan Harmon's Krapopolis for Fox, Amazon's The Legend of Vox Machina and The Boys Presents: Diabolical, and the Disney sequel The Proud Family: Louder and Prouder.

The streamers have jumped into animation in a big way, either as producers or distributors. Netflix has had the biggest footprint, acquiring or producing numerous animated films. Several 2022 Netflix releases have auteurs at the helm: Guillermo del Toro's Pinocchio, Henry Selick's Wendell & Wild, Richard Linklater's Apollo 10 1/2: A Space Age Childhood, Nora Twomey's My Father's Dragon and the stop-motion horror anthology The House, written by Enda Walsh. Netflix is also distributing Rise of the Teenage Mutant Ninja Turtles: The Movie, The Sea Beast and Riverdance: The Animated Adventure this year. Meanwhile, Sony Pictures Animation's Hotel Transylvania: Transformania is distributed by Amazon Studios and Paramount/Skydance Animation's Luck by Apple TV+.

Lee comments, "As we move into streaming and other new platforms, the playing field is more level, and we have more ready access to global content than ever before. So, the audience's perception of what animation is, can be and who it's for – especially here in the U.S. – is changing." Lee's firm, Baobab Studios, makes both animated films and interactive animation, often releasing titles on multiple platforms.

WIDE VARIETY, DAZZLING DIVERSITY

VFX firms WetaFX and RISE Visual Effects Studios have expanded into animated film production, joining the likes of Animal Logic and Cinesite, who are well established in producing animated features. WetaFX CEO Prem Akkaraju comments, "Weta Animated has been something that has been discussed for years within Weta. We have such a wealth of storytelling talent within the company that creating a business structure around them to help generate original content really felt like the logical next step. Weta has also developed a robust pipeline of tools over the years that give artists and directors a broad palette to work from in creating a style that best suits their creative project. Now is the perfect time for us to make this move."

Johnston notes, "Factors such as an increase in animation studios, with traditional VFX studios starting to make animated films, and audience demand for content have expanded the types of animated films being created. Also, storytellers and filmmakers are seeing that not only can you tell stories beyond traditional family films, but that there's also an audience who want to see them. The idea of traditional animated content is really being challenged, and we're also seeing how it can work alongside other forms of content to tell rich, complex stories."

"There has been an explosion of stylistic exploration in computer feature animation in recent years which I find super exciting," says David Ryu, Vice President, Tools for Pixar Animation Studios. "Projects big and small are experimenting with looks, and I love seeing the range of looks projects are finding. There's borrowing from so many influences: 2D hand-drawn animation, stop-motion, live-action film, so many lineages of 2D traditional art. And the ways we see the principles of all these things being put together to make something new is exciting and inspiring. This is going in so many directions, and I'm excited to see what looks arise over the next few years and what that means in terms of the technologies and pipelines we use to make them."

Cinesite Head of Animation Eamonn Butler points to many titles with NPR (Non-Photorealistic Rendering) styles, for example The Mitchells vs. the Machines (produced by Sony and distributed by Netflix), Sony's Spider-Man: Into the Spider-Verse and Arcane (produced by Riot Games and Fortiche and distributed by Netflix), which utilize 3D-manipulated renders combined with hand-drawn 2D techniques, painterly lighting and clever experimentation with surfacing and form. Cinesite has itself created a painterly NPR look for Hitpig, an Aniventure film from author Berkeley Breathed. Butler says that it's exciting to experiment "with design, animation style and lighting" to create unique and appealing looks for Cinesite's movies. Cinesite also teamed with Aniventure on Blazing Samurai and Riverdance: The Animated Adventure.

TOP TO BOTTOM: Sony Pictures Imageworks contributed to the visual effects on Hotel Transylvania: Transformania. (Image courtesy of Amazon Studios) Elfo and Bean cross a bridge in Disenchantment, which was animated by Rough Draft Studios (Futurama). (Image courtesy of Netflix) Animal Logic worked on Warner's DC League of Super-Pets. (Image courtesy of Warner Bros. Pictures)

NEW STORYTELLERS TELLING DIFFERENT STORIES

Looking at animation history, WetaFX's Senior Animation Supervisor Sidney Kombo-Kintombo comments, "Animation was at first an art for the initiated only. It used to be expensive and only a very small group of people had the required expertise. But nowadays, animation is a very accessible door to producing and sharing a story. Thanks to online tutorials, student licenses for professional software and the generosity of studios such as Weta, knowledge and professional tools are being put at the disposal of whoever wants to learn, even in remote regions where the use of internet is still a luxury. Thanks to that, we have witnessed the emergence of new talents and storytellers that create content based on remote cultures, stories and legends. These changes have a refreshing and enriching effect on the entire animation industry. The art is getting richer with more diverse artists and storytellers representing a wider range of cultures, [and] the world is opening up even more to the fact that there is more than one way of animating."

Animal Logic approaches each film it works on as a new opportunity to evolve artistically and technically. "We first consider the story and then we look for the best way to represent it visually," according to Johnston. "This has allowed each of the films we've worked on to have their own distinctive look." This includes the Warner-distributed Legend of the Guardians: Owls of Ga'hoole, with richly detailed feathers and foliage, Sony's Peter Rabbit films with realistic rabbits integrated with live-action plates, and Warner's LEGO Movie franchise with their unique stop-motion style. "Even within the LEGO universe, each film explored what elements of real world or stop-motion would best suit the requirements of the film. And DC League of Super-Pets has a whole new look of its own, too."

ANIME CONTINUES TO GROW AND EXPAND

Anime accounts for a growing portion of the global animation business. Demon Slayer the Movie: Mugen Train and Spirited Away have globally earned over $503 million and $396 million at the box office, respectively, and 20-odd titles are nearing or above $100 million, while innumerable series and movie sequels contribute to large totals for anime franchises.

Netflix has invested heavily in anime acquisitions and original programming. Hulu also has a large selection. Sony owns Crunchyroll, which, as of March, had more than 40,000 episodes, or 16,000+ hours, of a wide range of anime, according to Rahul Purini, Crunchyroll Chief Operating Officer. Looking back, he observes, "Animation in the West has primarily focused on children or comedy and the growth of anime and video games has helped create a generation that is much more comfortable with adult dramatic animation." He adds, "Anime is not new, but it has grown exponentially over the last decade or so with the expansion of streaming platforms and expanded international rights and distribution. Many don't understand that anime is not a genre in itself – there are many styles within it, like fantasy, action, adventure, comedy and more. And as investment in the anime ecosystem and industry increases, you will see the storytelling growing and expanding in all directions."

NEW TECHNOLOGY UNLEASHES CREATIVITY

Pixar's Toy Story (1995) was the first feature-length computer-animated film. This year, Pixar will launch its latest spinoff, Lightyear, about which Ryu comments, "It's interesting, in terms of software, that we're actually using lots of spiritually similar stuff! We are still using RenderMan, and our animation system Presto shares DNA with our old 'Menv' system used in those days. Of course, RenderMan and Presto are light years ahead of what they were in those days. We're in a different universe in terms of the scale and complexity of what we can do. And looking at where we're at now vs. where we were, it's cool to see the sea change in terms of artist interactivity and how far we've come in terms of reducing the technical barriers to entry. Both of these are showing up on the screen in terms of the complexity and quality of what we're making."

TOP TO BOTTOM: Lighthouse Studios, which animated The Cuphead Show!, is based in Kilkenny, Ireland and specializes in 2D animation. (Image courtesy of Netflix) Cinesite animated Riverdance: The Animated Adventure, directed by Eamonn Butler and Dave Rosenbaum. (Image courtesy of Netflix) Turning Red has quickly become another top-notch addition to Pixar's growing library of classic animated films. (Image courtesy of Pixar/Disney)

Prescott comments, "New technology always has an interesting effect on any form of creative storytelling. With animation, it is allowing filmmakers the chance to tell stories they were not able to really approach before. It is also allowing each project to have a look and feel that suits that particular project. We're integrating real-time workflows and machine learning to develop new and cutting-edge interactive experiences for our artists, which is changing the animation production process, all designed to let creativity thrive."

Cinesite Chief Technology Officer Michelle Sciolette, speaking of game engines such as Unreal Engine and Unity, notes that, "In the past few years, major technical advancements in the graphics-processing unit [GPU] of computers have enabled such engines to render production-quality imagery while maintaining their real-time speed. Cinesite, along with its production partner Aniventure, is currently developing an animated feature film utilizing game engine technology."

Akkaraju adds, "The most exciting technology development is the inclusion of AI and machine learning techniques in the animation workflow. It has the potential to affect the artists' day-to-day workflows as much as the transition to having computers handle the in-betweens. There is still much of the animation workflow that is manual and repetitive – tasks that are not aiding the creative process."

ADAPTING TO THE PANDEMIC

TOP TO BOTTOM: Kranz (Zachary Levi) in Mission Control in Richard Linklater's Apollo 10 ½: A Space Age Childhood, which blends a unique combination of hand-drawn animation, live-action and CGI. (Image courtesy of Netflix) Warner Animation Group, Animal Logic, DC Entertainment and Seven Bucks Productions teamed on the production of Warner's DC League of Super-Pets. (Image courtesy of Warner Bros. Pictures) Hank (Michael Cera) and Jimbo (Samuel L. Jackson) in Blazing Samurai. (Image courtesy of Paramount Pictures)

The growth of animation was accelerated by adaptations to COVID-19. "With the arrival of the pandemic, live-action filmmaking was largely put on hold while the demand for animation accelerated – our mostly digital pipelines could adapt to remote work and such. So, in 2022 we're going to see the fruits of that labor, and it's only just the beginning," says Lee.

During the pandemic, "live-action directors and studios were able to consider animated scripts, and now we're seeing a large demand for animated films. The industry has never been busier," comments Johnston.

"Audience demand for animation has also grown substantially over the last few years and people are consuming more and more," observes Akkaraju. "One interesting by-product of this trend has been an expanding of audience perception of what animated storytelling can be. We're seeing a wider range of stories and storytellers be embraced by the mainstream, and that's great for everyone."

Concludes Johnston, "The beauty of animation is that anything you can imagine can be created, so there's a lot more freedom and scope within the world of animation."



KRISTEN PRAHL: VFX PRODUCER FINDS SUCCESS COMES WITH ‘A LITTLE FAITH, TRUST AND PIXIE DUST’ By TREVOR HOGG

TOP: Kristen Prahl, VFX Producer for Ghost VFX in Copenhagen, Denmark. (Image courtesy of Kristen Prahl)

Even though Copenhagen, Denmark, has been home for a decade, Ghost VFX Visual Effects Producer Kristen Prahl was born in the American South and raised in the Midwest, first living in Kentucky, then moving to Troy, Ohio, which is just north of Dayton. Her mother was an art major, but decided to go into teaching because she thought there would be better career opportunities with a degree in education rather than art. Childhood hobbies were shared amongst the siblings. "My older sister and I were very involved in 4-H growing up. It was more 'artsy' though and we'd do tons of craft projects. I made it to State a couple of years, otherwise my early claim to fame was at the local county fair."

Movies were also part of her adolescent life. "Hands down, the movie that has left a lasting impression on me is Jurassic Park [1993]. I was only 11 then, so naturally I was that girl in the theater that screamed and jumped out of my seat during the Velociraptor kitchen scene. Another special movie for me is the original Lion King [1994]. I made my dad take me to the theater twice. My parents got the hint and took me on a studio tour at MGM where I got to see animators at work. From that day, I swore to my parents that that was what I was going to do when I grew up."

Disappointment arose when, after high school, Prahl attended Ohio University, where she studied in the School of Art + Design with hopes of earning a major in Graphic Design. "Unfortunately, I didn't make the cut, and I told my parents I was thinking of art history as my new major. My parents gave me the option of pursuing art history or choosing any school [within reason] that I wanted, without majoring in art history. I took them up on their offer and chose Savannah College of Art and Design (SCAD). After three years, I graduated with a BFA in Visual Effects."

California soon beckoned. "With prospects looking slightly better in the early 2000s, I got to make the leap, and I owe my parents everything for that," says Prahl. "During my last year of college, I went on a school trip to Los Angeles and managed to get an internship at Zoic Studios. Initial industry highlights include being a lip double on CSI and painting ooze on a 'dead' body for a forensics scene."

An interesting early job for the aspiring artist was as a dust-busting lead at Digital Domain. "It's not far off from regular housekeeping," notes Prahl. "You essentially paint out any dust that might have gotten caught when film is scanned. Typically, you'll just clone pixels from another frame or from another part of the same frame. Most films today are shot digitally, so dust-busting isn't really needed all that much, but I had some great years at Digital Domain, which I'll always cherish." She then transitioned to be a rotoscope artist on Speed Racer and Star Trek. "I haven't been on the box as an artist for some time now," she admits, "but obviously the technology has gotten better. Nevertheless, the process of rotoscoping is still user driven, and so, until an AI learns this art form, it can still be extremely time-consuming."

A temporary move to Europe became more permanent. "My boyfriend at the time [now husband] is Danish and wanted to try life a bit closer to home as he'd been stateside for over 10 years," recalls Prahl. "I was totally onboard to give Copenhagen a try. We had only planned to stay for a year, but 10 years later and we're still here. I was quite worried about being able to find the same type of work, but I was able to get my foot in the door at Ghost in 2011 as a freelance roto/paint artist. I bounced around at a couple of other companies, but always found myself back at Ghost. The atmosphere was similar to what I knew from Digital Domain, and a giant plus was that the working language was English."



"Often, success is a perplexing combination of hard work and chance. Remember that you need both, and trust that you'll stumble upon what you need, when you need it." —Kristen Prahl, VFX Producer, Ghost VFX

Ghost VFX had humble origins. "Ghost was originally founded by a few ex-LEGO employees working out of a garage [aka trailer]," explains Prahl. "They did mostly commercial work at first, but over the years local features were added, then Hollywood blockbusters. I remember when we were awarded work on Rogue One: A Star Wars Story. It was a milestone for the company and also one of my favorite shows to have been part of. Production has been at a ludicrous speed ever since, and today high-end streaming shows make up the lion's share of what we do."

Becoming a visual effects producer was a natural transition. "I've always been a bit compulsive in terms of organizing and planning," Prahl acknowledges, "so when Ghost looked to expand their production group, I threw my name in the hat. I started out as an assistant on various commercials and local features, then moved into my first real producer role on Legendary's feature Krampus and their first season of the TV show Colony. In more recent years, I've been primarily working as Ghost's VFX Producer on Star Trek: Discovery."

TOP LEFT: VFX Supervisor Ivan Kondrup Jensen, Prahl and Creative Director Martin Gårdeler representing the Emmy-winning Star Trek: Discovery Ghost VFX team on the red carpet. (Photo courtesy of Ghost VFX) TOP RIGHT: Prahl celebrates winning the 2021 Primetime Creative Arts Emmy for the Star Trek: Discovery episode "Su'Kal." (Photo: Anna-Lene Riber. Courtesy of Kristen Prahl) BOTTOM: A milestone for Ghost VFX was being awarded work on Rogue One: A Star Wars Story. (Image courtesy of Ghost VFX and Lucasfilm)


TOP TO BOTTOM: One of the favorite projects Prahl worked on was Rogue One: A Star Wars Story. (Image courtesy of Ghost VFX and Lucasfilm) A number of the Ghost VFX artists who worked on Rogue One: A Star Wars Story grew up with the Star Wars franchise. (Image courtesy of Ghost VFX and Lucasfilm) Rogue One: A Star Wars Story was the first film in the franchise to deviate from the Skywalker family storyline. (Image courtesy of Ghost VFX and Lucasfilm)

As technology advances, the role of the visual effects producer has essentially remained the same, according to Prahl. "Obviously, shows come in many shapes and sizes, but I think that being a producer at its core is about understanding team dynamics and keeping everyone's focus on the [hopefully] shared end goal. On the client side, it is about building trust, creating transparency and clear communication. Internally, it's often about trying to predict future challenges and never assuming anything." She has endured a few tough shows over the years. "The hardest of shows also make you realize that you are a part of an amazing team of very talented artists, and that you can take on any curveball the client might throw your way. I've managed to be incredibly lucky to have so many talented people working alongside me."

Working on Rogue One: A Star Wars Story was memorable for Prahl. "Rogue One was a blast because most artists [at Ghost VFX] grew up with Star Wars, and everyone at the company wanted to help out with any small task just to be able to say to their friends or parent, 'I worked on Star Wars!' To be honest, I didn't see A New Hope until I got to college. My dad was a big Star Trek fan, and I've seen every old [and new] Star Trek movie and all of Next Generation multiple times."

Over the past six years the company's focus has been more on high-end episodic content. "The biggest difference is schedule and pace," Prahl observes. "Movie shot production can span many months, even years, depending on where in the chain you start. Here you have the ability to work on looks for months before rolling it out to your hero shots, then all shots. Episodic shows, on the other hand, always have new assets or effects for every episode. We still run through all the same steps, but much faster and often with overlapping episodes as these typically are spaced out a few weeks apart."

"[B]eing a producer at its core is about understanding team dynamics and keeping everyone's focus on the [hopefully] shared end goal. On the client side, it is about building trust, creating transparency and clear communication. Internally, it's often about trying to predict future challenges and never assuming anything." —Kristen Prahl, VFX Producer, Ghost VFX

There is a convergence occurring between visual effects for television and film. "Right now, we are finding the streaming schedules to be an excellent fit, but we always try to push for as much visual complexity and realism as we can, and so any additional time is always greatly appreciated," states Prahl. "There's a high demand right now, and for many vendors not having enough capacity is becoming a real challenge. I do think we'll see this trend continue in the years to come with heightened competition for viewership and market share between all the studios. However, as we move from growth to a more mature streaming market, we'll see the usual suspects assume their dominant role, and it will be up to us smaller shops to be lean enough to compete."

In regard to Netflix buying Scanline VFX, Prahl notes, "We've seen studios in-source in the past, but typically this has been through organic growth with varying success. If demand continues this crazy upward trend, it's likely we'll see more studios secure capacity through retainers and acquisitions."

TOP TO BOTTOM: From Lost in Space. Prahl believes that a VFX producer should understand team dynamics and keep everyone focused. (Image courtesy of Ghost VFX and Netflix) There is a convergence occurring between visual effects for television and film, as illustrated by the Netflix production of Lost in Space. (Image courtesy of Ghost VFX and Netflix) Prahl has served as the production VFX Producer for Star Trek: Discovery since Season 2. (Image courtesy of Ghost VFX and Paramount+)


“This year’s Oscar nominees, for instance, were all male. However, women are well represented in production, and at Ghost we are also starting to see an uptick of more young women coming through our doors. We still have a way to go, and it would definitely be fantastic to see more women in every discipline and at all levels.” —Kristen Prahl, VFX Producer, Ghost VFX

TOP TO BOTTOM: In recent years, Prahl has been primarily working as Ghost’s VFX Producer on Star Trek: Discovery. (Image courtesy of Ghost VFX and Paramount+) Prahl has traversed multiple galaxies in her career, from Rogue One: A Star Wars Story (2016), one of her favorite shows she has worked on, to Star Trek: Discovery, pictured here. (Image courtesy of Ghost VFX and Paramount+) Prahl is proud of what the Ghost VFX team has been able to accomplish on Star Trek: Discovery and their capacity to handle the high demand for content. (Image courtesy of Ghost VFX and Paramount+)

Overall, the bidding process has remained the same. "But we see a bit more time put into previs, which is extremely helpful," remarks Prahl. "Time zones are not necessarily important for the client, but can be a huge advantage for vendors with solid global pipelines. However, rebates have historically been a requirement for getting a seat at the initial bidding table."

Sharing shots and assets amongst vendors has become easier. "As the industry has matured, off-the-shelf software and open-source formats have come to play a central role at most facilities. As a result, this also means that most companies can now easily share geometry, textures and shader assignments. Disciplines further down the pipeline are still often too entangled in proprietary code, so things like rigs and shading still typically require a full rebuild, albeit with a turntable or similar as reference."

Inspiration can be found in the application of real-time graphics and video game technology. "Especially virtual production, both as we saw on The Lion King and with 'the volume' on The Mandalorian, has really taken the industry by storm," states Prahl. "At Ghost, we're starting to implement some of these approaches, but mainly with game engines and real-time renders as another tool in the 'traditional' visual effects pipeline. It will be exciting to follow how/if the game and visual effects industry will merge."

The visual effects industry remains male-dominated, especially on the artist side. "This year's Oscar nominees, for instance, were all male. However, women are well represented in production, and at Ghost we are also starting to see an uptick of more young women coming through our doors. We still have a way to go, and it would definitely be fantastic to see more women in every discipline and at all levels."

Prahl expresses the enjoyment she receives working in the visual effects industry. "First, I really love my job and what I do, and this is a common factor for anyone who's had a long career in any industry. I've poured my heart and soul into this industry and learned from 'Dory' to just keep swimming when faced with adversity." A particular career highlight would appeal to her father. "Nothing beats seeing your name on the big screen for the first time, but honestly, I'm still on an all-time high from our recent Emmy win for Outstanding Visual Effects in a Single Episode for Star Trek: Discovery ("Su'Kal"). I got nominated alongside Ivan Kondrup Jensen, Ghost's VFX Supervisor on the show, and I'm so proud of what our team was able to accomplish. A favorite quote comes from Peter Pan, which goes, 'All you need is a little faith, trust and pixie dust.' Often, success is a perplexing combination of hard work and chance. Remember that you need both, and trust that you'll stumble upon what you need, when you need it."



LIGHTYEAR IS ‘LIGHT YEARS’ BEYOND WHAT PIXAR HAS DONE BEFORE By TREVOR HOGG

Images courtesy of Disney/Pixar. TOP: Each time Buzz Lightyear attempts to achieve hyperspace during a test flight, he freezes in time while those around him grow older. OPPOSITE TOP TO BOTTOM: Concept art developed by Bill Zahn exploring what hyperspace travel might look like from a cosmic perspective. The lighting was complex for spaceship cockpit shots. Lightyear provides a clever twist on the signature line, ‘To infinity and beyond!’

As Buzz Lightyear experiences identity issues throughout the Toy Story franchise, the beloved animated character takes on a new persona in Lightyear, as director Angus MacLane (Finding Dory) wanted to make the movie that inspired Andy to buy the toy. But do not expect a carbon-copy interpretation since the demands and intention of the project were entirely different. "The key for us was to capture the elements of what people love about Buzz," states Galyn Susman, Producer of Lightyear. "In the Toy Story world, Buzz is more defined in relationship to Woody. We are now making a feature where Buzz is the protagonist, so obviously some of the things that make Buzz endearing as a sidekick aren't substantive enough to necessarily carry through a feature film. The thing that we came up with that we love about Buzz is that he is out of step with reality."

The theme was built into the narrative structure. "Buzz and his compatriots are stranded on a planet and need to develop a fuel that will help them to reach hyperspeed so that they can get back to Earth," explains Susman. "What they all discover is every time he goes on a test flight, because he's approaching the speed of light, time passes slowly for him. He ends up spending act one like a skipping stone through time. His disconnect with reality is that he's frozen in a time that doesn't exist anymore, and everybody else on the planet is moving on with their lives. It's much more serious than Toy Story. You can't have a sci-fi action epic adventure kind of movie if you don't feel like you have real stakes."

A major source of inspiration was the blockbuster films of the 1980s and 1990s. "It would need to be emotional, relatable and funny," remarks director MacLane. "But the core idea was how do we make this movie so that it has the excitement potential of the sci-fi movies that people of my age grew up with that gave you 'that was awesome' feeling. We achieved that with a combination of elements. Some of it is limitation. You could only afford one probe droid or one [additional prop element], but that gave a clarity and simplicity to things that I enjoyed. Creatively, I wanted to chase a look that was cartoony enough to be animated but realistic enough to be concerned about the character's safety, and allowed the characters who sit in that universe to feel like they belong there. I wanted to make a film that was making clear decisions visually about what the audience is seeing. I wanted the art direction to be something that feels clunky, substantial and manufactured."

There is a major reason why Lightyear does not look like previous Pixar movies. "Pixar has a great library called 'The Backlot,'" explains Tim Evatt, Production Designer of Lightyear. "It's just a library of pieces that have already been used in previous existing films, and I knew that in order for Lightyear to have its own language we almost need to not use anything from The Backlot. We needed to replenish and make our own backlot. The strength of having a modeling art department is that we were able to replenish our pieces and make a new movie."

“I knew that in order for Lightyear to have its own language we almost need to not use anything from The Backlot [Pixar shot library]. We needed to replenish and make our own backlot. The strength of having a modeling art department is that we were able to replenish our pieces and make a new movie.” —Tim Evatt, Production Designer


TOP TO BOTTOM: Buzz Lightyear stares at an experimental fuel cell that he uses for a number of test flights, which results in a surprising side effect. The physicality of Sox was inspired by animatronics. Certain poses such as this one pay homage to the Toy Story franchise.

The art directors were proficient with 3D modeling, which eased the transition of 2D concepts into 3D. "They were able to get the shape language into 3D as soon as possible, start building things in 3D and distribute those pieces to the other departments," adds Evatt. "We weren't having to talk about what is the shape language."

There was an extra dimension of complexity in the lighting, especially for the spaceship cockpit shots. "When Buzz is flying out in deep space, we relied on a lot of the self-illuminated buttons in his cockpit," states Ian Megibben, Cinematographer - Lighting on Lightyear. "On our past movies we have tracked whether a light is on or off all the way, from our modeling department through animation and into lighting and rendering. But it was far more complex with this because before, in the past, it was whether a car had its headlights on or off. Here Buzz has 300 different buttons in his cockpit, and they all had to be something that the animator could animate on and off, and we're going to see that reflected in his helmet."

An approach was adopted that was similar to using LED panels for The Mandalorian. Comments Megibben, "We would capture probes of our environments, and a lot of times, for the sake of optimization, the set dressing department would say, 'We're not going to dress anything behind the camera.' And I said, 'I actually need those! Because we're going to see that reflected in Buzz's helmet.' This is the first time that we leaned into something that nerdy and specific."

When it came to the animation rigs, Buzz was treated differently when wearing the Space Ranger suit. "When he's in the Space Ranger suit, his body is not in it. It's just a hard-rigged suit," notes David DeVan, Animation Supervisor of Lightyear. "When he's wearing soft goods in other scenes, that is his body underneath the fabric. We made a model of Buzz, and then made the hard-suit Buzz." The hard suit has inherent challenges. "You have to cheat everything because you can't get his arms in front of him," observes DeVan. "We had to accept the limitations. Part of it is accepting that he is in this big barrel thing, and that's part of how he moves. We wanted to incorporate that into the motion and feeling of things. The shots where Buzz dives and rolls in the fight scene were exciting and tactile because they incorporate the limitations and physicality of what's there."

The animation of the adversarial Emperor Zurg and his robots followed a 'less is more' principle. "We always talk about how Yoda doesn't have to do anything," remarks DeVan. "The harder you show them working, the less powerful they must be." The mechanical limitations of feline robot companion Sox were played to full advantage. "We went through the gamut of how cat-like is she?" DeVan says, boiling it down to what's funny. "The challenge with Sox early on was it had to have rotational joints on a complex shape and it took some time to figure out how it was going to work."

Effects like smoke plumes should be photoreal enough to be believable but also be able to sit seamlessly in a stylized animation environment. "The process for me is always, let's get as much reference as we possibly can of the real-world thing that we're looking at and figure out what it means to hit that," explains Bill Watral, FX Supervisor of Lightyear. "You start a simulation and target that. Then you start taking away levels of detail or adding stylized silhouettes to things. You also get reference of really stylized stuff like Japanese anime of space shuttles launching and put that side by side with SpaceX footage. What is the difference between these two things? It's the levels of frequency and details. Then you try to find this happy balance between the level of detail that feels right and makes you believe that this is the phenomenon you're witnessing, but not so much detail that your eye goes there and you're starting to scrutinize it in relationship to all of the other work around you."

TOP TO BOTTOM: An LED screen-style setup was utilized to get the proper reflections and refractions on the space helmet visor worn by Buzz Lightyear. Emperor Zurg (James Brolin) appears to be an invincible adversary of Buzz Lightyear (Chris Evans). Tim Evatt and Ian Megibben created concept art in an effort to develop the aesthetic of hyperspace travel.


TOP: A real-life approach was taken when re-envisioning the Space Ranger outfit from the Toy Story franchise for Lightyear. BOTTOM: Grant Alexander explores the silhouette and poses of Buzz Lightyear wearing his iconic Space Ranger outfit.

Producing a sun went faster than originally thought. "That was one of those effects I wish I had six months to work on, and we got it done in about a month," remarks Watral. "It was challenging because that's one of those [where] we could have made a realistic sun, but it just didn't fit in the world. We really stylized the heck out of that one. It lent itself to the time frame that we had to work on it, too. [We had] Enrique Vila, the effects artist, and the compositor on that. They worked tightly together and we added layers as needed. Originally, we thought that we were going to add a lot more detail into the sun to see it, but Angus embraced this idea from Sunshine where everything is really freaking blown out, which added to the sense of danger when Buzz slingshots around the sun and there is this heatshield that comes on. The way to sell that was to bloom things out. We created a library of arcs and magnetic loops. In the end, when we started to bloom it, we realized that we could back off on some of that detail because we didn't need it, and it was actually introducing too much chatter on the images, drawing our eye away. We're always dropping detail away from where we don't want you to be looking."

Atmospherics were essential in creating the various biomes found on the lunar-locked planet of T'Kani Prime. "When I first got onto the film, that was the first thing we tackled," remarks Watral. "On previous films, we had a thing called dress effects, which is basically an effect that you can dress in at lighting time in our lighting software Katana. But that was relatively limited to mostly semi-homogeneous volumes to fill the air a little bit. We knew in this film that we were going to need big plumes and big vistas with plumes dressed out all over the place and all over the planet. We took a month and went in and rewired that system to be more robust. We added all of these new simulations at a much larger scale and squirreled those away on disks. We made a handshake deal with the lighting artists where we said, 'We have these simulations that can be cached out at a regular speed, half speed, quarter speed, and a static version. You can choose any one of those versions you want on this pulldown in Katana. You pick the silhouette you want, the speed for the scale, then place it in the scene and dress it around.'"

Getting access to extra render cores was factored into the budget. "We had to store this data to begin with. Storage is not expensive, but also not cheap. The render time is a huge thing. We originally explored ways at render time to re-rasterize these grids into voxels that are larger further away while the close-up voxels are smaller. But in the end, what we found is if we let RenderMan do its thing, it was mostly okay as long as we split the layers separately and lighting had control over it. We could iterate on those independently and had enough time. Lightspeed is the optimization section of the lighting department, and they go in and turn all of the knobs and optimizations to try to get things to render the best. We're trying for somewhere between 30 and 40 hours per frame. Some of these with lots of volumes in them will be a lot heavier than that. But that's where we're at right now. We're in the thick of it."



VES AWARDS

VISUAL EFFECTS ARTISTRY IN THE SPOTLIGHT By NAOMI GOLDMAN


Captions list all members of each Award-winning team even if some members were not present or out of frame. Some winners provided a video acknowledgment. For more Show photos and a complete list of nominees and winners of the 20th Annual VES Awards visit vesglobal.org.

All photos by Danny Moloshok and Phil McCarten. 1. Eric Roth, Executive Director of the Visual Effects Society, welcomes the crowd. 2. Acclaimed filmmaker Guillermo del Toro received the VES Award for Creative Excellence. 3. Lynwen Brennan, Executive Vice President and General Manager of Lucasfilm, receives the VES Lifetime Achievement Award.

The Visual Effects Society held the 20th Annual VES Awards on March 8, recognizing outstanding visual effects artistry and innovation in film, animation, television, commercials, video games and special venues. This marks the Society's 20th VES Awards program, celebrated during the organization's milestone 25th anniversary. Industry guests gathered at The Beverly Hilton to celebrate VFX talent in 25 awards categories. Dune was named the photoreal feature winner, garnering four awards. Encanto was named top animated film, also winning four awards. Foundation (The Emperor's Peace) was named best photoreal episode. Sheba (Hope Reef) topped the commercial field with two wins.

Jim Morris, VES, President of Pixar Animation Studios and founding VES Chair, presented the VES Lifetime Achievement Award to Executive Vice President/General Manager of Lucasfilm, Lynwen Brennan. Academy Award-winning VFX pioneer Phil Tippett, VES, presented the VES Award for Creative Excellence to Academy Award-winning filmmaker Guillermo del Toro. Presenters also included Academy Award-nominated director Denis Villeneuve and actors Alfred Molina, Tawny Newsome, Mouzam Makkar, Emma Caulfield Ford and Deborah Cox. Eric Bourque, Autodesk's Senior Director of Engineering, Media & Entertainment, presented the Autodesk Student Award.

"As we celebrate the Society's 25th Anniversary and 20th Annual VES Awards, we're honored to keep shining a light on remarkable visual effects artistry and innovation," said VES Chair Lisa Cooke. "In all of our esteemed colleagues, we see best-in-class work that elevates the art of storytelling and exemplifies the spirit of adaptation and ingenuity – talents that have kept audiences engaged and uplifted, now, more than ever. The VES Awards is the only venue that showcases and honors these outstanding global artists across a wide range of disciplines, and we are extremely proud of all our winners and nominees!"




4. The VES Award for Outstanding Virtual Cinematography in a CG Project went to Encanto (We Don't Talk About Bruno) and the team of Nathan Detroit Warner, Dorian Bustamante, Tyler Kupferer and Michael Woodside. 5. The VES Award for Outstanding Compositing & Lighting in an Episode went to Loki (Lamentis, Shuroo City Destruction) and the team of Paul Chapman, Tom Truscott, Biagio Figliuzzi and Attila Szalma.


6. The VES Award for Outstanding Compositing & Lighting in a Commercial went to Verizon (The Reset) and the team of David Piombino, Rajesh Kaushik, Manideep Sansietty and Tim Crean. 7. The VES Award for Outstanding Compositing & Lighting in a Feature went to Dune (Attack on Arrakeen) and the team of Gregory Haas, Francesco Dell’Anna, Abhishek Chaturvedi and Cleve Zhu.


8. The VES Award for Outstanding Supporting Visual Effects in a Photoreal Episode went to See (Rock-A-Bye) and the team of Chris Wright, Parker Chehak, Javier Roca, Tristan Zerafa and Tony Kenny. 9. The VES Award for Outstanding Supporting Visual Effects in a Photoreal Feature went to Last Night in Soho and the team of Tom Proctor, Gavin Gregory, Julian Gnass and Fabricio Baessa.


10. The VES Award for Outstanding Visual Effects in a Student Project went to Green and the team of Camille Poiriez, Arielle Cohen, Eloïse Thibaut and Louis Florean.



11. The VES Award for Outstanding Created Environment in an Episode, Commercial or Real-Time Project went to Sheba (Hope Reef) and the team of Henrique Campanha, Baptiste Roy, Luca Veronese and Timothee Maron, who received the award via video. 12. The VES Award for Outstanding Created Environment in an Animated Feature went to Encanto (Antonio's Room) and the team of Camille Andre, Andrew Finley, Chris Patrick O'Connell and Amol Sathe.


13. The VES Award for Outstanding Created Environment in a Photoreal Feature went to Spider-Man: No Way Home (The Mirror Dimension) and the team of Eric Le Dieu de Ville, Thomas Dotheij, Ryan Olliffe and Claire Le Teuff. 14. The VES Award for Outstanding Visual Effects in a Real-Time Project went to Call of Duty (Vanguard) and the team of Yi-Chao Sandy Lin-Chiang, Joseph Knox, Gareth Richards and Shane Daley.


15. The VES Award for Outstanding Model in a Photoreal or Animated Project went to Dune (Royal Ornithopter) and the team of Marc Austin, Anna Yamazoe, Michael Chang and Rachel Dunk. 16. The VES Award for Outstanding Special (Practical) Effects in a Photoreal Project went to Jungle Cruise and the team of JD Schwalm, Nick Rand, Robert Spurlock and Nick Byrd.





17. The VES Award for Outstanding Effects Simulations in an Episode, Commercial or Real-Time Project went to Foundation (Collapse of the Galactic Empire) and the team of Giovanni Casadei, Mikel Zuloaga, Steven Moor and Louis Manjarres who received the award via video. 18. The VES Award for Outstanding Effects Simulations in an Animated Feature went to Raya and the Last Dragon and the team of Le Joyce Tong, Henrik Fält, Rattanin Sirinaruemarn and Jacob Rice. 19. The VES Award for Outstanding Effects Simulations in a Photoreal Feature went to Dune (Dunes of Arrakis) and the team of Gero Grimm, Ivan Larinin, Hideki Okano and Zuny Byeongjun An.

20. The VES Award for Outstanding Animated Character in an Episode or Real-Time Project went to The Witcher (Nivellen The Cursed Man) and the team of Marko Chulev, Rasely Ma, Mike Beaulieu and Robin Witzsche. 21. The VES Award for Outstanding Animated Character in a Commercial went to Smart Energy (Einstein Knows Best, Einstein) and the team of Alex Hammond, Harsh Borah, Clare Williams and Andreas Graichen.




22. The VES Award for Outstanding Animated Character in a Photoreal Feature went to Finch (Jeff) and the team of Harinarayan Rajeev, Matthias Schoenegger, Simon Allen and Paul Nelson. 23. The VES Award for Outstanding Animated Character in an Animated Feature went to Encanto (Mirabel Madrigal) and the team of Kelly McClanahan, Sergi Caballer, Mary Twohig and Jose Luis "Weecho" Velasquez.

24. The VES Award for Outstanding Visual Effects in a Commercial went to Sheba (Hope Reef) and the team of Grant Walker, Sophie Harrison, Hernan Llano and Michael Baker who received the award via video.


25. The VES Award for Outstanding Visual Effects in an Animated Feature went to Encanto and the team of Scott Kersavage, Bradford Simonsen, Thaddeus P. Miller and Ian Gooding.


26. The VES Award for Outstanding Visual Effects in a Photoreal Episode went to Foundation (The Emperor's Peace) and the team of Chris MacLean, Addie Manis, Mike Enriquez, Chris Keller and Paul Byrne. 27. The VES Award for Outstanding Visual Effects in a Photoreal Feature went to Dune and the team of Paul Lambert, Brice Parker, Tristan Myles, Brian Connor and Gerd Nefzer. Dune Director Denis Villeneuve, left, served as a presenter.




28. Jim Morris, VES, President of Pixar Animation Studios and founding VES Chair. 29. Academy Award-winning VFX pioneer Phil Tippett was on hand to present the VES Award for Creative Excellence to Guillermo del Toro. Accepting for del Toro was producer J. Miles Dale. 30. Lisa Cooke, VES Board Chair, who was a presenter, flanks VES Lifetime Achievement Award recipient Lynwen Brennan of Lucasfilm and VES Executive Director Eric Roth.


31. Jeff Barnes, former VES Chair, and Jeff Okun, VES, tag-teamed as presenters. 32. The VES Award for Outstanding Visual Effects in a Special Venue Project went to Jurassic World Adventure and the team of Eugenie von Tunzelmann, Maximilian McNair MacEwan, Stephen Goalby and Brad Silby.




33. Eric Bourque, Autodesk's Senior Director of Engineering, presented the VES Award for Outstanding Visual Effects in a Student Project. 34. Eric Roth, VES Executive Director, actor Alfred Molina and Jeff Okun, VES. 35. Actresses Tawny Newsome and Mouzam Makkar were both presenters. 36. Dune Director Denis Villeneuve and actress Deborah Cox were Awards presenters.


37. Emma Caulfield Ford was on hand as a presenter. 38. Jim Morris, VES, and Lynwen Brennan share a moment backstage. 39. The 2022 VES Awards returned as a live in-person event.





40. VES Awards Committee members who paused for a photograph included, from left to right: Diego Rojas, Martin Rushworth, Olun Riley, Chuck Finance, Bob Coleman, Dan Rosen, Reid Paul, Lopsie Schwartz, Scott Kilburn and David Johnson, VES. Not pictured: Den Serras, Brent Armstrong, Rob Blau, Katie Brillhart, Stephen Chiu, Emma Clifton, Dave Gouge, George Macri, Sarah McGee, Jeff Okun, VES, Lisa Sepp-Wilson, David Valentin. 41. Backstage, left to right: Jeff Okun, VES; Jim Morris, VES, Pixar President; Lynwen Brennan, Executive Vice President and General Manager of Lucasfilm; VES Executive Director Eric Roth; Lucasfilm President Kathleen Kennedy; VES Board Chair Lisa Cooke; and former VES Chair Jeff Barnes.


42. Actress/singer/presenter Deborah Cox gave the crowd a taste of her soaring vocal range. 43. The VES Awards Show was an historic event as it marked the 20th year of the Awards gala. 44. Alfred Molina with his wife Jennifer Lee, the VES Award-winning director of Frozen and Chief Creative Officer of Walt Disney Animation Studios.


45. David (DJ) Johnson of Undertone FX was on hand as a presenter.





20TH ANNUAL VES AWARDS WRAP-UP

Following are excerpts from the acceptance speeches delivered by this year's distinguished VES Honorees.

Guillermo del Toro, Recipient of the 2022 VES Award for Creative Excellence

What magic and beautiful collaboration it is when we jive together to create a moment – a sleight-of-hand trick in which we use every physical resource at our disposal. When we combine set design, physical effects and acting and make the audience buy into 'that which would not be.' The digital effects craft has inherited a multidisciplinary tradition: painting, sculpting, perspective, animation and optical effects. It is a beautiful, expressive and powerful art and one that more and more fuses with character animation and points to roads full of promise in our narrative future. We are heading towards a moment in which we will, indeed, one day tell stories that were impossible at any other moment in history. And it is very important that we do it together.

Yes, we are all in this storytelling business together. We always have felt, and feel today, a great and disciplined kinship with what the VFX world has to offer. I have partnered with many of you and have possibly bankrupted a couple of you. So it is with gratitude and enormous joy that I receive this honor. We must stand together in these times and reaffirm that what we have is an intimate partnership in telling a story and that we do it all together on a leveled ground, and that we honor each other when the time comes. This is that time and I salute you all. May we meet on the road many more times.

Lynwen Brennan, Executive Vice President/General Manager at Lucasfilm, Recipient of the 2022 VES Lifetime Achievement Award

I have always seen it as my job to provide a safe place to take risks, experiment and innovate and to encourage a culture of trust where we look out for each other as family. We have needed that in the past two years more than ever, and I have been in awe of how ILM and the whole visual effects industry has been able to pivot on a dime and completely overhaul our entire process to work remotely – and not only survive, but come out stronger. If these last two years have shown us anything, it has shown us how fast we can change.

Standing still has never been a trait of the visual effects industry – so why then is it taking us so long to make progress in increasing diversity in our ranks? We can do better. Yes, we need to improve the pipeline into the industry and that takes some time, but the drop-off up the levels both to VFX supervisor and VFX producer is severe and can be addressed more immediately if we are intentional about it. That doesn't mean giving opportunity if not earned – no one wants that. But we tend to be creatures of habit and it is an industry built on relationships. We tend to give opportunities to people we know and

TOP TO BOTTOM: Filmmaker Guillermo del Toro. (Image courtesy of Netflix) VES Lifetime Achievement Award recipient Lynwen Brennan with Lucasfilm President Kathleen Kennedy. Lynwen Brennan and her family.

who we have worked with before. We are in the midst of a true boom in the industry where there is simply not enough visual effects talent in the world to satisfy the demand. Let’s use this moment to give some people chances based on their abilities, not just their resume – like my colleagues took a chance on me so many times. It will take each of us to make that commitment – on the productions, the studios and the VFX studios.



VES AWARD WINNERS

DUNE

The VES Award for Outstanding Visual Effects in a Photoreal Feature went to Dune, which won four VES Awards including Outstanding Model in a Photoreal or Animated Project (Royal Ornithopter), Outstanding Effects Simulations in a Photoreal Feature (Dunes of Arrakis) and Outstanding Compositing & Lighting in a Feature (Attack on Arrakeen). (Photos courtesy of Warner Bros. Pictures)




ENCANTO



The VES Award for Outstanding Visual Effects in an Animated Feature went to Encanto, which won four VES Awards including Outstanding Animated Character in an Animated Feature (Mirabel Madrigal), Outstanding Created Environment in an Animated Feature (Antonio’s Room) and Outstanding Virtual Cinematography in a CG Project (“We Don’t Talk About Bruno”). (Photos courtesy of Walt Disney Studios Motion Pictures)




FOUNDATION



Foundation won the VES Awards for Outstanding Visual Effects in a Photoreal Episode (“The Emperor’s Peace”) and Outstanding Effects Simulations in an Episode, Commercial or Real-Time Project (“Collapse of the Galactic Empire”). (Photos courtesy of Apple TV+)



SHEILA WICKENS: COLLABORATING TO REALIZE A FILMMAKER’S VISION IS A VFX SUPERVISOR’S DREAM By TREVOR HOGG

TOP: Sheila Wickens, VFX Development Supervisor at MPC (Image courtesy of Sheila Wickens)

A career highlight for MPC VFX Development Supervisor Sheila Wickens was being invited to join the Academy of Motion Picture Arts and Sciences as a member of the Visual Effects Branch in 2018. The film industry honor is an indication of how far she had come from her early days as a Multimedia Program Developer at the Limerick Institute of Technology and a teaching assistant at Bournemouth University’s Computer Animation Department. The native of Monaghan, Ireland, is the daughter of a primary school teacher and a civil servant and has six elder brothers who helped spark her interest in playing football and computer games. “One of my brothers was studying electronic engineering when I was seven and he taught me some BASIC programming on our BBC home computer,” recalls Wickens. “I was always interested in math and science, and had thought I would like to be an engineer until I got more into art and wondered how these interests could be combined.”

The works of legendary puppeteer and filmmaker Jim Henson were inspirational. “I absolutely loved Henson’s Labyrinth and was also a huge fan of The Muppets and Fraggle Rock. I remember going to the touring exhibition of ‘Muppets, Monsters and Magic’ in Dublin as a teenager and being amazed to see behind the scenes. Another favorite film from that time was The Princess Bride, so I was excited when I had the opportunity to work on Ella Enchanted and then Stardust as they reminded me so much of that classic.”

Wickens did a year at an art foundation after school in an effort to improve her portfolio and creative skills with the goal of becoming a graphic artist or illustrator. “During that year I found out about the National Center for Computer Animation at Bournemouth University. Although a fan of traditional animation and stop-motion, I had never heard of 3D animation before and was so excited to discover this subject that combined my interests. I left Ireland to study there for three years and got a degree in computer visualization and animation.” Most of the graduates with the same degree went on to work in the video game and visual effects industries. “I took a job at Bournemouth University as a lecturer’s assistant which allowed me access to the kit there. During that time, they started to branch into compositing, and I had the opportunity to learn Softimage Eddie and Quantel. That led to me getting a job as a Hal operator in a TV station in London. I left that after a year and a half to work on a kids’ TV show as a compositor using Combustion, and then finally got my first job in film at Cinesite working as a rotoscoper on Lara Croft: Tomb Raider.”

Compositing was an area in which Wickens excelled. “When I discovered compositing, where you could take images that exist already and put them together rather than starting from scratch, I took to it much more than I had done with 3D. The software has changed over the years. I started out using Cineon, moved to Shake and to Nuke. Essentially, it’s the same principles, although the big shift was the move into Nuke because that was when 3D tools became available to compositors.” The next step was to become a visual effects supervisor. “There was an in-between bit of being a compositing supervisor at MPC and then moving to Lipsync. I became a supervisor in the absence of anyone else more senior



“I have always loved working directly with the filmmakers and content creators, and being there from the start to help them realize their vision as a visual effects supervisor. So this is a dream job!” —Sheila Wickens, VFX Development Supervisor, MPC than me in the visual effects department at Lipsync at the time. I learned on the job, initially doing work with DNEG then going on set for independent films. How to Lose Friends and Alienate People was the first shoot supervision I did on my own. It started out in 3 Mills Studios in London then went on location to New York City. That was one of many films I did with Stephen Woolley’s and Liz Karlsen’s company Number 9 Films.” Being associated with major movie productions provided Wickens with some family benefits. “I got to work on some exciting big films including Harry Potter and the Chamber of Secrets and Charlie and the Chocolate Factory, which impressed my nieces and nephews at the time and now my children too! Also, I got to meet my hero, Tim Burton, on Big Fish and then worked on Corpse Bride as well. I love the books of Dan Brown, so The Da Vinci Code was amazing. I got to work with some great supervisors whom I learned so much from, in particular Tom Wood, Chas Jarrett and Richard Stammers. That was a great time, but I love smaller independent films the best, so when the opportunity came up at Lipsync Post to help build their visual effects department I took it. Even though the work was much simpler, I loved that you were working directly with the filmmakers. To work on them was incredible because they were great stories with a lot of heart,” Wickens offers. Made in Dagenham (2010) recounts the push for equal pay for female Ford

TOP: Wickens plotting out a complicated army replication shot for BBC One World War I miniseries The Passing Bells. (Image courtesy of Sheila Wickens) BOTTOM: Wickens works with MPC Executive VFX Supervisor Pete Jopling. (Image courtesy of MPC)




TOP TO BOTTOM: Wickens hanging out in the BBC offices with an iconic Doctor Who character known as a Dalek. (Image courtesy of Sheila Wickens) Colette director Wash Westmoreland photographs Wickens excited about the giant greenscreen constructed outside the studio in Budapest (2018). (Image courtesy of Sheila Wickens) Wickens at the top of Mount Teide in Tenerife in the Canary Islands, where Gallifrey, the home of Doctor Who, is going to be created, with the TARDIS supervising the production. (Image courtesy of Sheila Wickens)

plant workers in the 1970s. “I remember going on a recce for Made in Dagenham out to East London to see around the sites. It was a period drama, so we did environment work to help recreate the look of that era.”

Kilo Two Bravo (aka Kajaki in the U.K.) takes place in 2006 and revolves around British soldiers trapped in a minefield in Afghanistan. A CG Chinook helicopter had to be created within a tight time frame without having the benefit of the visual effects team being present during the shoot. “It was an incredible film,” remarks Wickens. “The director told us that one of the military personnel at the premiere asked, ‘How were you able to get hold of a Chinook when they were all in service?’ I consider that to be a tremendous compliment. They shot with a regular helicopter so you can get the dust blowing up from the ground; we painted it out and put in the CG Chinook. It was an intense couple of months trying to get that done knowing that there was no flexibility, because the military charity premiere was already booked. Woman Walks Ahead was another incredible experience. I got to go on location in New Mexico. It was absolutely stunning. Definitely a career highlight from that point of view. I remember the first time I walked on set, and it felt like I was starring in a Western myself!”

Not all of the projects have been movies, as Wickens has also been responsible for television series such as Harlots, Big Little Lies, The Romanoffs and Doctor Who. “Being at a smaller company for so many years, we were used to having a dynamic workflow and figuring out what is the best that you can do with the time and resources available,” states Wickens. “It’s all about the artists, the team and getting the best out of them to produce the best work.” The experience translates well for Wickens’ newly established role at MPC. “I work with the new business and development team. As clients are coming in with the work, we’ll read and break down the scripts and put together bids. I work with the art department to show clients ideas and to visualize certain sections of the script. I’m also getting jobs set up until the producer and supervisor who are going to stay on the show are available. I am helping across the board in a more general role to make sure that clients have a contact point.”

A report has been released stating that women are significantly underrepresented in the visual effects industry [Editor’s note: the Visual Effects Society has an all-women Board Executive Committee at this time]. “I think growing up with so many older brothers might have prepared me in some way for working in an often male-dominated environment, but I haven’t had any gender-specific issues in my career that I’m aware of. However, I do fully support all initiatives to support and attract more women into the industry. I’ve had excellent mentors over the years and I’m very proud to have mentored other women, too. At MPC we recently became members of AWUK [Animated Women UK] and WFTV [Women in Film & Television UK], and I’m looking forward to attending their events throughout the year and seeing more women choose a career in VFX.”

As for whether there is a visual effects shot that haunts her, Wickens replies, “One that strikes me was in a film called A Bunch of Amateurs. The camera does a 360 around Burt Reynolds, and



“Getting asked to supervise Doctor Who was such an honor, but also completely terrifying because of its legacy and fan base.” —Sheila Wickens, VFX Development Supervisor, MPC

as it moves around him, he goes from a small theater to a bigger theater to a theater with an audience. This was done in Shake before 3D compositing tools were available.” There have been several memorable moments for Wickens. “I remember when Sara Bennett [Visual Effects Supervisor at Milk Visual Effects] asked if I would like to join the ‘Academy.’ If she had specified AMPAS, I would have been confused! I thought I’d better clarify what she meant before getting too excited – maybe it was a horseriding academy or some other club she had in mind! Getting asked to supervise Doctor Who was such an honor, but also completely terrifying because of its legacy and fan base. And of course, now I have this exciting new role as VFX Development Supervisor at MPC working with Executive Supervisor Pete Jopling, Art Director Gurel Mehmet and VFX Development Executive Rory Knight-Jones. I have always loved working directly with the filmmakers and content creators, and being there from the start to help them realize their vision as a visual effects supervisor. So, this is a dream job!”

TOP: Wickens went to New Mexico to shoot Woman Walks Ahead (2017). (Image courtesy of Lipsync Post) MIDDLE LEFT: An incredible experience for Wickens was working on the Western Woman Walks Ahead (2017), which stars Jessica Chastain. (Image courtesy of Lipsync Post) MIDDLE RIGHT: A major greenscreen setup was required to recreate the Moulin Rouge for Colette (2018). (Image courtesy of Lipsync Post) BOTTOM: Period street extensions were created for thriller horror film The Limehouse Golem (2016). (Image courtesy of Lipsync Post)



LONGTIME GAMING MAINSTAY HALO TRANSITIONS TO LONGFORM STREAMING By KEVIN H. MARTIN

Images courtesy of Paramount+ TOP, BOTTOM AND OPPOSITE TOP: During action scenes, live-action Spartans were often replaced by digital doubles, which required a deft hand with CG animation to faithfully reproduce each actor’s distinctive movements. OPPOSITE MIDDLE AND BOTTOM: The design and performance aesthetic for Master Chief and the other Spartans was assembled based on what had been established in the various Halo games, plus the physical limitations of shooting on set and location.

Science fiction has long been a key source for the gaming world, even prior to the advent of video games. A dice-based board game based on Robert A. Heinlein’s Starship Troopers novels arrived more than two decades ahead of Paul Verhoeven’s film adaptation, while at least a pair of computer games derived from Arthur C. Clarke’s Rendezvous with Rama. Troopers, along with James Cameron’s Aliens, served as inspiration to various other properties leveraging off the ‘Grunts in Space’ concept, including the Wing Commander franchise and the Battlestar Galactica reboot, as well as the immensely successful series of Halo video games. Set in the 26th century, Halo features a variety of alien and human factions in conflict, a premise that seemed ripe for cinematic adaptation. But an early feature attempt involving Peter Jackson, Neill Blomkamp and Alex Garland failed to pan out. Steven Spielberg later took an active interest in the property, and helped shepherd it along over a span of some years, even as the Haloverse grew to encompass novels, live-action shorts and animated features, all overseen by 343 Industries. The newest incarnation, debuting on Paramount+, is a series that retains the key heroes, antagonists and world-building of the games, but with a greater focus on characters, especially that of super soldier Master Chief (Pablo Schreiber). In addition to the many practical issues of production design, special effects and makeup prosthetics, a multitude of VFX challenges needed to be met, ultimately involving 14 vendors. VFX Producer Bill Halliday, while lacking credits in otherworldly science fiction, had overseen work on The Tudors, The Borgias, Vikings, Penny Dreadful and the futuristic Into the Badlands. Halliday boarded the project in June 2018, which, since the cameras didn’t roll until November of the following year, provided ample time for what he calls, “A very healthy prep, which was useful in helping us relate the series to the



“[Creature Designer Howard Swindell] worked in 3D and then did a kind of paint-over. With these sketches in hand, we went to 343 [Industries] to get their signoff for each look. While they were the final approval step, pilot director Otto Bathurst was also very involved. After that, we went into a testing phase on the creatures, matchmoving to a gray-suited stunt player, who also needed footwear to make them a lot taller than they were in life.” —Bill Halliday, VFX Producer games. Halo 3 is one of their benchmarks, so we aspired to a look in that vein.” U.K. Creature Designer Howard Swindell was brought on to render various aliens very early on. “He worked in 3D and then did a kind of paint-over. With these sketches in hand, we went to 343 to get their signoff for each look. While they were the final approval step, pilot director Otto Bathurst was also very involved. After that, we went into a testing phase on the creatures, matchmoving to a gray-suited stunt player, who also needed footwear to make them a lot taller than they were in life.” Production’s first Visual Effects Supervisor, Tom Turnbull, began evaluating various vendors for the creatures, some of which would be partially CG while others were fully digital. He was later joined by Dominic Remane, another Vikings alum who had shared an Emmy win plus nominations with Halliday. “Our




TOP TWO: Visual Effects Supervisors Tom Turnbull and Dominic Remane had to develop creature concepts that were practical and met with the approval of 343 Industries, the company responsible for all things Halo. BOTTOM TWO: Creatures created for Halo included the enemy alien Covenant’s ground forces, the Sangheili, aka The Elites.

initial focus on creatures had to widen to environments later, after those designs had been largely worked out by the art department,” Remane states. “Then it was ours to take and run with, though still reflecting the look of the game to 343’s satisfaction, especially with the planet Reach.” Another character that required a lot of iterations was Cortana, Master Chief’s AI, voiced by Jen Taylor, who also provided Cortana’s voice in the games. “We went through multiple design phases with her to bring that to a level everybody was happy with, from us to 343 to the network,” Remane explains. “We’d mocap Jen in a suit on set, tracking her behavior and performance as well as her facial features. She has played that character of Cortana for so many years that hers is an iconic performance, and we wanted our version to contain all those gestures and nuances that would ring true for fans. We wanted them to just accept, ‘oh, she’s Cortana.’” Reference was key with the design for the Covenant creatures. “Part of my job was working with the directors, 343 and the producers to get a performance containing the necessary qualities on the day,” notes Remane. “Then it was often a matter of translating what was needed to take it all the way through post with the executing of VFX. Fortunately, we’ve had really good vendors handling the creatures, and they made sure to use the artists who caught on quickly to the nuances and the exact kind of look we wanted.” Pre-planning the VFX utilized old-school storyboarding as well as previs. “Each director used that tool in a different way,” says Turnbull. “Some preferred storyboards, but one director actually did his own previs on the Unreal Engine. That was a very positive experience because he was quite good at it, and this pushed our typical process in a different way, one that I enjoyed quite a lot. I’m wondering if that might point the way toward something we’ll see more of in the future.” Another consideration tabled for future seasons was the use of LED walls. “When we started shooting, that was not yet a thing, outside of The Mandalorian,” observes Turnbull. “But now, two years later, there are dedicated stages, and in another two years we will have a really solid idea about what LED walls are good for versus where they don’t work so well. It’ll take a while for the process to become fully integrated with production.” With the shot load initially projected to share among seven to eight vendors, another phase of testing was done to ensure the pipeline between each provider was translatable. “We wanted to avoid cases where somebody had to redo a rig, rerender or change the shading, and weren’t going to fall into a situation where some vendor said, ‘It will take a month for us to ingest this into our pipeline,’” Turnbull relates, noting that ‘cornerstone’ software packages – Maya for animation, Houdini for particle work – were most often used. “Everybody adhered to ACES color space throughout,” Remane emphasizes, “with each vendor knowing how to look at the footage; in certain circumstances they would know to ignore the LUT and instead view it in the color space that was going to be in play when it went to the DI. Because we used the same renderer most of the time, lighting-wise things had a like quality. The one company



“Each director used that [previs] tool in a different way. Some preferred storyboards, but one director actually did his own previs on the Unreal Engine. That was a very positive experience because he was quite good at it, and this pushed our typical process in a different way, one that I enjoyed quite a lot. I’m wondering if that might point the way toward something we’ll see more of in the future.” —Tom Turnbull, Visual Effects Supervisor using Renderman [Mr. X] did an A-to-B version and built an Arnold version for everybody else that looked the same.” Given that the action spans various star systems, a variety of space-going vessels would also be required. “343 gave us a huge drop of ship assets during prep,” says Halliday. “These included models done by Blur that were used in game cinematics. Since those had been developed more in a visual effects way than most gaming designs, we tended to lean on those most. These represented much-loved designs, so our hard-surface modeling tried to only improve them a bit, taking them to finishes that weren’t readily achievable for the games.” Adhering to the hard science fiction template of the game, the look of space was less stylized than has been the case in many recent series. “There’s definitely no ‘lens flare fever’ on this show,” Remane declares. “We made sure the environments were all lit in ways that made sense, not overlit or with the light too wellbalanced. You make sure the sun source is correct, then let everything else fall off naturally into darkness.” Realizing the extraterrestrial Prophets posed a unique challenge in that it spanned many departments, “We decided very early on that the Prophets would be only partially CG,” says Turnbull. “This hybrid approach began with prosthetic suits for stunt players, handled by Filmefex in Budapest. Wardrobe provided full costumes, while their bodies and arm movements were puppeteered on wire rigs. Sound had to wire the suits so the actors inside could hear us, while vocal performance happened off to one side, along with the facial capture that served as a reference for the later animation. It was probably the most fun I had on the whole show, seeing all these technologies and techniques come together, and it really proves the [adage] ‘Rely on Analog/Rely on Digital.’” Further complicating scenes involving the Prophets, the creatures are seen in articulated gravity chairs. “Special Effects Supervisor Paul Stephenson’s team developed systems for moving these 400-pound loads,” Halliday reports. “SFX also built the rig to puppet these static heads, which, while only seen in over-theshoulder shots, were absolutely camera-ready work. MR. X cut those heads off and replaced them digitally. Dom and I had used them as our primary vendor for six seasons of Vikings, where they did 95% of the work, so we had them in mind for this work right from the start.”

TOP TWO: Spartans take the battle to The Elites. BOTTOM TWO: Leg extensions were used for the Spartans and various other aliens portrayed by stunt personnel.




TOP TWO: The leg extensions elevated the Spartans to the necessary stature and served as a reference for the eventual CG animation. BOTTOM TWO: The Covenant’s spiritual leaders, called the Prophets, drew on a variety of disciplines, including prosthetic suits created by Filmefex and on-set puppeteering, while CG by MR. X was used to replace and animate the face of each creature.

Other races ported over from the games include Jackals, the Unggoy (aka ‘Grunts’) and the slithering Collective known as the Lekgolo. “We spent a long while trying to get the CG performances just right on the Lekgolo character animation work,” acknowledges Remane. “It was kind of a balance we managed to strike between the undulating movements of a worm and that of a snake. The creatures can curl up on themselves, but then rapidly accelerate to strike a target. Finessing that so the results looked both correctly in character and credible to the eye took time, but then we found a National Geographic reference of several snakes trying to attack a small critter that really showed us a way in. The speed and ferocity of these snakes swarming and tearing through the sand to reach this target was unbelievable.” The practical limitations arising from the armor worn by Master Chief and his Spartans imposed restrictions on the live-action battle scenes. “The actors were able to do a lot of movements in those suits, which was a big plus,” Turnbull states, “but the real win for us was that they could act and have their performance come through even when fully suited with these huge helmets on their heads. Pablo has a number of scenes where, though he is fully masked, the audience will understand what he is feeling. They were only really limited when it came to hard action, and we always knew that was going to be more our end of things.” “With Master Chief, the design and performance aesthetic was assembled based on the game character’s actions, plus what was and wasn’t achievable on set,” Remane acknowledges. “The actor himself brought so much to it, but there were aspects that were never going to work practically, owing to the limitations of wire gags and how any actor or stunt personnel could move while inside that suit. Spartans often got replaced digitally when they were running and shooting, so we’d have to translate the actors’ movements into digital versions of those characters.” Most of the battle scenes feature practical explosions, though these were occasionally augmented with CG enhancement. “Paul Stephenson really almost overdelivered with the pyrotechnics,” Turnbull enthuses. “Things blew up really large when they needed to, and he was always in a good dialog with us, right from the start and carrying on through the end of shooting.” Except for one scene involving an actor playing two roles, no motion-control was used during filming. “Twinning is a kind of traditional use for mo-con,” says Turnbull, “so it made sense to use what worked. But the rest of the time we shot free cameras, with all the plate photography done wild, because we were all about letting each director make his movie his way.” The division of VFX labor had been well-planned, but when COVID hit and interrupted production, the resulting delay ultimately impacted the vendors in a significant way. “We had three different major battles throughout the season,” explains Halliday, “and so we knew that creatures were going to have to be shared among vendors to get that done in time. Dividing up the shows went pretty clean at first, as we knew the first, fifth and final episodes were going to be big and we could allocate resources accordingly. But after we came back from COVID, it was winter in Hungary, so we couldn’t go outside to shoot these big scenes. This



“We’d mocap Jen [Taylor] in a suit on set, tracking her behavior and performance as well as her facial features. She has played that character of Cortana for so many years that hers is an iconic performance, and we wanted our version to contain all those gestures and nuances that would ring true for fans. We wanted them to just accept, ‘oh, she’s Cortana.’” —Dominic Remane, Visual Effects Supervisor meant pushing the three biggest, most VFX-heavy sequences to the end of the schedule. Our careful plan for apportioning the work so nobody would get overwhelmed went right out the window, requiring us to bring in larger facilities with the capacity to take on 250-shot sequences. MPC’s episodic division is really helping us now, while ReDefine has, I believe, access to the entire DNEG network, so they can really throw bodies at a volume-heavy project to get it done quickly. It was kind of regrettable that the companies building many of these assets didn’t get to always execute those scenes, but we faced a real logjam.” Halliday was philosophical about how the best laid-plans don’t always come together, acknowledging that, “You always go in with expectations on how things will play out – but then you adjust. You learn more about each company’s strengths, and your relationship with them develops over time. For example, we knew it was going to be a very big creature job, but actually, while that proved to be true, it also was the smoothest part of the operation – partly because we already had a good idea of how things would look from the existing franchise. But there were certain high-concept bits – otherworldly things – that didn’t always have visual precedents. Each of these challenging VFX-driven moments required us to invent a new playbook, because it was new territory, so visualizing many of those required a lot more effort in the conceptualization process. “Rodeo FX has this enormous reputation for doing marvelous environment work,” he continues, “so initially we wanted to take advantage of that with them. But they soon became such great partners that the possibilities expanded, as they showed us how much wider a gamut of work they could do for us, and those wound up including a number of these far-out effects.” Other vendors utilized throughout the series include Pixomondo, MR. X, Rocket Science VFX, Rodeo FX, Cinesite (Montreal), Fillscrn, FuseFX, MPC (London), Goodbye Kansas Studios, The Frame Distillery, Mavericks VFX, Stereo D, RedefineFX and Rayon FX. Turnbull, who left Halo during post to take the VFX reins on Tim Burton’s The Addams Family-inspired Wednesday series, thinks this Paramount+ effort – already renewed for a second season before its debut – strikes a balance between meeting expectations of die-hard fans and offering up greater depth to that universe. “We had a charter to go bold,” he states, “so the objects, environments, creatures and vessels that were new to the Halo universe were very fun. But even with that mandate, these additions are respectful of the legacy, with roots based in hard science fiction, which I happen to enjoy. So for me, a big part of the job was maintaining that focus to deliver a style that is more realistic, rather than some fantastic or magical reinterpretation of spaceships and planets. That was my MO from day one.”

TOP TO BOTTOM: The Prophets are chairbound, so the special effects department devised articulated rigs powerful enough to propel each 400-pound load of suited, seated performer. The series VFX supervisors and VFX Producer Bill Halliday felt mixing digital character animation and set extensions with practical techniques and technologies proved the adage, ‘Rely on Analog/Rely on Digital.’



VR/AR: THE QUICKENING PACE OF THE IMMERSIVE NOW By CHRIS McGOWAN

TOP: Stockholm-based Resolution Games developed the colorful dueling shooter VR game Blaston. (Image courtesy of Resolution Games)

Virtual reality and augmented reality are more tangible now. VR in particular has enjoyed a steady ascent recently, with help from the Oculus Quest headsets and the strong earnings of many VR titles, especially in the gaming realm. This year, VR looks to get a boost from the release of the Sony PlayStation VR2 headset as well as expected introductions of new VR and AR gear from Apple and other tech firms. In addition to games, education, travel, health, enterprise and social interaction apps should all drive VR and AR in the near future. Plus, the pervasive talk of a “metaverse” is attracting interest and investment, and VR and AR look to be the chief modes of access. Fortune Business Insights projects that the global virtual reality market will grow from around $6.4 billion in 2021 to over $80 billion by 2028. “VR and AR are rapidly gaining traction,” says Christine Cattano, Framestore Head of Immersive, U.S. “I can’t think of an industry that isn’t starting to explore how VR/ AR hardware or software offers transformative new workflows and/or ways to connect with their audiences/customers. At Framestore, our work in this space is largely driven by technology and entertainment clients, but also brands looking to explore these platforms as well.” “Each generation of headsets has enabled a huge leap forward for our industry,” says Larry Cutler, co-founder and CTO of Baobab Studios, which has released VR titles like Baba Yaga and Namoo. “For example, the release in 2019 of standalone VR headsets such as the Oculus Quest empowered users to be completely mobile and untethered to a high-end gaming PC and sensors. Our 2019 VR



“Creating rich worlds filled with compelling characters that can potentially accommodate more modes of participation will almost certainly grow the audience and, by extension, the market.” —Vicki Dobbs Beck, Vice President of Immersive Content Innovation, ILMxLAB

narrative Bonfire was a launch partner for the Quest, and we saw first-hand how this headset expanded the VR audience.”

In January at CES, Sony revealed details about its next virtual reality headset. The PlayStation VR2 will have high-fidelity visuals with 4K HDR, a 110-degree FOV, a display resolution of 2000 x 2040 per eye and frame rates of 90/120Hz. It will have inside-out tracking with integrated cameras in the VR headset and eye tracking, and be haptic-equipped for sensory feedback. It will compete with the HP Reverb G2, Valve Index VR, HTC Vive Pro 2 and HTC Vive Cosmos Elite, among other headsets, and the HTC Vive Flow glasses.

Ian Hambleton, CEO of London-based immersive studio Maze Theory, comments, “We definitely think the VR2 will be an unexpected hit. It solves many of the issues of the first headset but has loads of great new features, and for PS5 owners will be so easy to add. We think it will do amazingly well.” Maze Theory developed the VR experience Doctor Who: The Edge of Time and will release Peaky Blinders: The King’s Ransom this year. Hambleton adds, “The Quest 2 has been a huge shot in the arm for the VR market and the ecosystem is growing fast. Attachment rates and software sales are very positive.”

Joanna Peace, Meta Manager of Technology Communications for VR Product, says that Meta’s next headset, a next-generation all-in-one VR hardware, is code-named Project Cambria and will launch in 2022. “This isn’t a Quest 2 replacement or a Quest 3. Project Cambria will be a high-end device at a higher price point, because it’s going to be packed with all the latest advanced technologies, including improved social presence, color Passthrough, pancake optics, and a lot more. We’re designing Project Cambria for people who want to start testing out a new kind of computing on the cutting edge of what’s possible today.”

At CES in January, Microsoft announced a partnership with Qualcomm to develop custom augmented-reality chips that can be used in future lightweight AR glasses. Canon revealed that it is developing a social VR platform for 2022 called “Kokomo” that will mix immersive experiences with video calling. Panasonic subsidiary Shiftall announced lightweight VR glasses known as MeganeX, and numerous other companies announced new VR and AR gear.

Looking at VR’s many possible applications, Tuong H. Nguyen, Senior Principal Analyst at Gartner, Inc., comments, “The volume is mainly consumer entertainment. But I feel the bigger opportunity is in training and simulation. We’re seeing more enterprises looking to use VR for training, simulation and collaboration. The reason I think the enterprise-use cases show promise relative to

TOP TO BOTTOM: Los Angeles-based Stress Level Zero’s platinum title Boneworks is a first-person shooter that is powered by the Unity engine and supports PC-compatible VR headsets. (Image courtesy of Stress Level Zero) Blade and Sorcery features VR fantasy sword fighting and utilizes the Unity game engine. It was developed and published by WarpFrog, located in Saint-Herblain, France. (Image courtesy of WarpFrog) The Elder Scrolls V: Skyrim VR is a platinum seller, published by Bethesda Softworks in Rockville, Maryland. It uses the Creation Engine, created by Bethesda Game Studios and based on the Gamebryo engine. (Image courtesy of Bethesda Softworks)




TOP TO BOTTOM: Capetown-based Free Lives’ platinum Gorn is a violent gladiatorial simulator. It uses a “physics-based combat engine.” (Image courtesy of Free Lives) Seattle-based Polyarc’s Moss is a single-player action-adventure puzzle game for all ages, made with Unreal Engine. (Image courtesy of Polyarc) Infiltrate the 1920s Birmingham criminal underworld in Peaky Blinders: The King’s Ransom, developed by London-based Maze Theory. (Image courtesy of BBC/Maze Theory) Resident Evil 4 VR, developed by Austin-based Armature Studio along with Capcom and Oculus Studios, utilizing Unreal Engine, was the fastest-selling app in the history of Quest. (Image courtesy of Capcom/Oculus/Armature)

entertainment is because of the purpose-built nature of these solutions. As in, to get significant growth in entertainment, you need to address a broad market with both breadth and depth of content. Whereas on the enterprise side, it’s more narrowly scoped as, for example, ‘here’s a tool for you to do the following at work.’ In that sense, the adoption trajectory looks better.” One of VR’s uses has been for location-based entertainment, which, says Cattano, was “really blossoming pre-2020 [but] saw a bit of a slowdown due to COVID, as clients looked for alternative technology to reach their audiences in a safe and accessible way. But we are seeing this start to bounce back in 2022.” Video games have been a leading driver of VR sales to date. The rhythm game Beat Saber has sold over 4 million units, and estimates place its total revenue (with DLC included) at $180 million, with $100 million of that coming from the Quest store alone. It is followed by other platinum titles (one million units sold) like Seattle-based Polyarc’s Moss and Capetown-based Free Lives’ Gorn. Skydance Interactive’s The Walking Dead: Saints and Sinners has grossed more than $60 million across all platforms, according to Santa Monica, California-based Skydance. Meanwhile, the Oculus Blog last November revealed that Resident Evil 4 (an exclusive Quest 2 release) had become the fastest-selling app in the history of Quest. Two other upcoming VR game titles should help boost sales. Sony Interactive Entertainment will exclusively publish a VR2 version of Amsterdam-based Guerrilla Games’ popular Horizon Call of the Mountain (being developed by Guerrilla and Liverpool-based Firesprite). And Meta has an exclusive deal to create an Oculus 2 version of Rockstar Games’ Grand Theft Auto: San Andreas. More than $1 billion has been spent on Quest store content since its launch in 2019, according to Meta Platforms CEO Mark Zuckerberg in a company earnings call. Last February, Chris Pruett, Director of Content Ecosystem at Meta, noted on Twitter that eight titles have each made over $20 million in gross revenue, another 14 titles more than $10 million and an additional 17 titles over $5 million in the Quest store. More than 120 total VR games available for Oculus Quest and Quest 2 have grossed over $1 million apiece in revenue, according to Pruett. Looking at the entire VR market, some other platinum VR games include Rust’s Hot Dogs, Horseshoes & Hand Grenades; Superhot’s Superhot VR; Vankrupt Games’ Pavlov; Stress Level Zero’s Boneworks; Illusion’s VR Kanojo; Bethesda Software’s The Elder Scrolls V: Skyrim VR; and, WarpFrog’s Blade and Sorcery. Cutler comments, “We are witnessing a big push into multiplayer VR. Some notable examples include Blaston and Demeo [Resolution Games, located in Stockholm], and the battle-royale shooter Population: ONE [BigBox VR, which was purchased by Facebook]. One really ambitious and artistically interesting multiplayer game is Space Pirate Arena [from I-Illusions, located in Geraardsbergen, Belgium], which combines large-scale multiplayer combat in a shared space.



Of course, multiplayer leads us to the VR metaverse, which opens a ton of creative possibilities but is still in its infancy.” Hambleton notes that VR has an advantage with video games: “You’re fully immersed in a way no other device can match.” “VR is perhaps the most powerful means of transporting people to another time and/or place including to a ‘galaxy far, far away,’” says Vicki Dobbs Beck, ILMxLAB Vice President of Immersive Content Innovation, which has released the much-lauded VR title Star Wars: Tales from the Galaxy’s Edge. “VR’s appeal will continue to grow as the breadth and nature of experiences speak to all kinds of engagement. Some people want to leap into high-stakes adventures while others wish to explore and discover at their own pace. Creating rich worlds filled with compelling characters that can potentially accommodate more modes of participation will almost certainly grow the audience and, by extension, the market.” While gaming has so far had the most success in VR, “other popular content includes fitness, productivity and social experiences,” says Peace. “Developer success in VR is broad. You’re seeing developers and studios pursue making VR content in those categories. And the strong foundation in today’s popular VR categories enables people to imagine and build engaging apps for more use cases. There’s also a burgeoning group of creators, aka builders, who are actively building worlds and experiences, whether it’s in Horizon Worlds, Rec Room or other [Social VR] apps. Those apps are great examples of VR inspiring creative pursuits.” Venice, California-based Wevr has worked on the TheBlu (Sundance), the immersive Harry Potter experiences with Warner Bros. and the Gnomes & Goblins VR game with Jon Favreau. Wevr’s next project strives “to empower interactive creators and developers around the world to more easily collaborate and share their creations. We are developing a cloud software platform to do just that – we call it Wevr Virtual Studio (WVS).” Peace says, “On the AR side, we’re seeing wide adoption of free tooling via our Spark AR platform. Hundreds of thousands of creators are using AR to build interactive experiences that do everything from transform dancers on Instagram Reels and let people play multiplayer games on Messenger, to helping others try on – and buy – makeup from the comfort of their home. With AR embedded at the center of Meta apps and experiences, creators and studios have amazing opportunities to reimagine the way people connect, share and explore.” Nguyen adds, “Generally speaking, AR is much more broadly applicable because it’s rooted in the physical world. Since we live in the physical world, there are more scenarios where this applies.” Cattano comments, “AR applications are extremely popular. There’s no doubt that audiences see the allure of interacting with virtual content in their physical environment, provided it’s contextualized properly and is user-friendly.” Framestore developed the His Dark Materials: My Daemon AR app for HBO. With the help of a growing number of high-quality VR titles, the popularity of the Oculus Quest 2 headset, and the looming arrival of PlayStation VR2 and other new big-tech immersive gear, the mainstream light can be seen at the end of the VR/AR tunnel.

TOP TO BOTTOM: Population: ONE is a multiplayer, battle-royale-style VR game developed by BigBox VR that includes a “vertical combat system” and is powered by the Unity game engine. BigBox VR is located in Seattle and was acquired by Facebook in 2021. (Image courtesy of Meta/BigBox VR) The hall-scale VR game Space Pirate Arena seeks to get players off their couches and into a multiplayer game of hide and seek. It was developed by I-Illusions, located in Geraardsbergen, Belgium. (Image courtesy of I-Illusions) Skydance Interactive’s first-person VR game The Walking Dead: Saints and Sinners, in which the player battles with zombies, has grossed over $60 million in revenue across all platforms. (Image courtesy of Skydance Interactive)



VFX TRENDS

VIRTUAL PRODUCTION: MAKING A REAL IMPACT IN COMMERCIALS By IAN FAILES

We tend to hear a lot about the growing use of LED walls, game engines and real-time tools in the making of film and television shows. It turns out, of course, that these virtual production technologies are also being used in commercials at an equally astonishing rate. Indeed, virtual production can be well-suited to the fast-paced production of commercials, and offers impressive results when actors, characters, vehicles and creatures need to be placed in diverse settings that might not otherwise always be possible to film in. Here’s a breakdown of some recent spots where virtual production came into play in a significant way.

AN LED WALL SOLUTION WHEN YOU CAN’T SHOOT ON LOCATION

TOP: This Genesis Motors Canada spot was captured on Pixomondo’s LED stage in Toronto. (Image courtesy of Pixomondo)

When car brand Genesis Motors Canada wanted to showcase its electrified GV60 models, agency SJC Content looked to reference nature and electrical elements in the campaign. It was wintertime in Canada when shooting was originally planned to occur, and that ultimately rendered physical filming a difficult prospect. The creative team turned to Pixomondo to shoot the entire spot on its LED stage in Toronto. “Our LED volume allowed us to shoot the complete commercial, including the car beauty shots in one location, in a controlled and heated environment,” remarks Pixomondo Virtual Production



Supervisor Phil Jones. “At this Pixomondo LED volume, the ceiling can split into four separate sections, each hung from four remotely operated chain motors. This allows us to raise and lower each ceiling section from approximately 30 feet in the air all the way down to the stage floor. Throughout this travel, we can tilt and angle each section so that the director of photography can get precisely the lighting and reflections he desired on the car.”

The ‘A New World’ commercial required a location at the edge of a forest just outside a city after a recent thunderstorm. “Since it was winter during the shoot, this location would have been challenging to find, and very expensive to travel an entire crew to a more temperate climate,” says Jones. “Instead of working overnight at a practical location to ensure the lighting, shooting in the Pixomondo volume allowed the production to shoot during normal working hours with consistent and controllable lighting at all times. It gave the DP the opportunity to tune the lighting within the volume and in the virtual environment without worrying about when the sun was going to rise and stop the shoot.”

Pixomondo’s LED stage, which uses Unreal Engine at its core, was rigged to allow for a cueable lightning system that combined practical elements with both dynamic lights and color-correct regions within the environment. “This system allowed us to quickly adjust the position and effect of the lightning required in the client brief on a shot-by-shot basis throughout the shoot,” states Jones. “We were able to select and cue specific lightning shapes and rapidly re-position the areas affected by the lightning, which allowed for the progression of the storm throughout the commercial. Combining this with our positionable ceiling sections, we were also able to choreograph the timing and position of the lightning in the reflections on the practical car.”

“To keep the forest environment alive,” continues Jones, “we added a controllable procedural wind to all of the trees, shrubs and small grasses in the scene. This worked well on a powerful artist workstation, but when brought into the volume, we were under the required 24fps. Instead of simply turning off the wind throughout the scene, we kept it in the most foreground elements, then tested the camera angles and moves from the storyboards to determine how deep in the frame we needed to keep full 3D elements. After we determined a distance where the parallax was negligible, we baked the elements beyond this onto cards to reduce the workload on the GPUs. Instead of leaving them completely static, we incorporated a procedural displacement into the shader, so there wasn’t a distinct line where the forest ‘died.’”
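Pixomondo’s distance test lends itself to a quick back-of-envelope check. The sketch below estimates the on-screen error, in pixels, that flattening a band of forest onto a card would introduce for a planned camera move, then walks outward to the distance where that error becomes negligible. The lens, sensor, depth-spread and one-pixel threshold values are illustrative assumptions, not figures or tools from the production.

```python
# Back-of-envelope version of the parallax test described above. All values
# (lens, sensor, camera move, card depth, 1 px threshold) are assumptions.

def card_error_px(card_dist_m, depth_spread_m, camera_travel_m,
                  focal_mm=35.0, sensor_width_mm=36.0, image_width_px=3840):
    """Approximate on-screen error, in pixels, from flattening geometry spread
    over `depth_spread_m` of depth onto a single card at `card_dist_m`, for a
    lateral camera move of `camera_travel_m` (pinhole-camera approximation)."""
    err_mm = (focal_mm * (camera_travel_m * 1e3) * (depth_spread_m * 1e3)
              / (card_dist_m * 1e3) ** 2)
    return err_mm / sensor_width_mm * image_width_px

def bake_distance(camera_travel_m, depth_spread_m=10.0, threshold_px=1.0):
    """Walk outward until the card error drops below `threshold_px`; elements
    beyond this distance are candidates for baking onto cards."""
    d = depth_spread_m  # start just past the card's own thickness
    while card_error_px(d, depth_spread_m, camera_travel_m) > threshold_px:
        d += 1.0
    return d

if __name__ == "__main__":
    dolly_move_m = 2.0  # largest lateral camera move planned for the shot
    print(f"Bake to cards beyond roughly {bake_distance(dolly_move_m):.0f} m")
```

WHAT HAPPENS IN VEGAS HAPPENS VIRTUALLY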

A Resorts World Las Vegas campaign called ‘Stay Fabulous’ included an immersive short film with celebrities Celine Dion, Carrie Underwood and Katy Perry, among others. Hooray Agency and Psyop crafted the piece, with creative solutions studio Extended Reality Group (ERG) handling the virtual production side. Shooting took place on an LED volume. “ERG was responsible for creative asset optimization from models provided by the client’s graphics team,” outlines ERG Senior Unreal Artist Patrick Beery. “After completing the initial

TOP TO BOTTOM: The Resorts World Las Vegas campaign “Stay Fabulous” made use of LED wall panels and real-time rendered imagery. (Image courtesy of Extended Reality Group) This Istanbul Financial Center commercial saw several Formula 1 drivers racing through various cities. (Image courtesy of disguise/MGX Studios) In the Google Play commercial called “Battle at Home Screen,” key hero and villain characters from Blizzard’s Diablo Immortal game emerge from inside a Google Pixel phone. (Image courtesy of Impossible Objects)




assets’ ingestion into Unreal Engine, ERG completed all scene building, lighting and final touches for programming and real-time run optimization. As the project progressed, when several new scenes were added to the scope, the client turned to ERG to design and build all of the creative assets directly within Unreal Engine.”

ERG also provided on-site environment work through a multiuser Unreal workflow. This enabled the director and director of photography to make variable changes to visual effects within the scene and lighting in real time while still displayed within the LED volume. The company used Houdini to create moving animations through textures so they would be more performance friendly. A live DMX plugin was used to control virtual and physical lighting in real time, allowing for a consistent color balance from LED to final camera.

“During the design phase and pre-development, techvis played an integral role in determining camera moves, placement and lens selection,” says Beery. “This planning helps to mitigate the risk of physically implausible virtual choices, as well as establish clever ways to obscure the virtual seams. For example, to create a seamless waterline for the boat, an LED band-aid was designed to wrap the practical boat and generate assets that could be adjusted in real time, enabling the entire shot to be captured in camera and successfully create the parallax effect.”
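As a rough illustration of the live DMX control described above, the stand-alone sketch below pushes channel levels to a fixture over Art-Net, a common DMX-over-Ethernet protocol. The node address, universe and fixture patch are hypothetical, and ERG’s actual plugin and its integration with Unreal Engine are not public; this simply shows the kind of message such a system sends.

```python
# Minimal Art-Net (DMX-over-UDP) sender, illustrative only.
import socket
import struct

def artdmx_packet(universe: int, channels: bytes, sequence: int = 0) -> bytes:
    """Build an ArtDmx packet carrying up to 512 DMX channel values."""
    data = channels[:512]
    if len(data) % 2:                        # DMX payload length must be even
        data += b"\x00"
    pkt = b"Art-Net\x00"                     # packet ID
    pkt += struct.pack("<H", 0x5000)         # OpCode: OpDmx (low byte first)
    pkt += struct.pack(">H", 14)             # protocol version 14
    pkt += struct.pack("BB", sequence, 0)    # sequence, physical port
    pkt += struct.pack("<H", universe)       # 15-bit port address
    pkt += struct.pack(">H", len(data))      # payload length (big-endian)
    return pkt + data

def send_rgb(sock, node_ip, universe, rgb, start_channel=1):
    """Set one RGB fixture starting at `start_channel` (1-based) and send the frame."""
    frame = bytearray(512)
    frame[start_channel - 1:start_channel + 2] = bytes(rgb)
    sock.sendto(artdmx_packet(universe, bytes(frame)), (node_ip, 6454))

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # e.g. push a practical fixture toward the warm key light in the virtual scene
    send_rgb(sock, "192.168.1.50", universe=0, rgb=(255, 180, 120))
```

FAST CARS AND LED WALLS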

TOP TO BOTTOM: The LED wall panels reflected nature imagery onto the vehicle. (Image courtesy of Pixomondo) A final render from the “Stay Fabulous” campaign. (Image courtesy of Extended Reality Group) The drivers were captured in their car cockpits on an LED volume. (Image courtesy of disguise/MGX Studios)

A commercial for the Istanbul Financial Center (IFC) features several Formula 1 drivers – specifically, their helmets – racing through various cities. Reflected in their helmets, visors and their car cockpits are recognizable city icons and colorful lighting setups. To make that possible, production company Autonomy brought together a virtual production effort from MGX Studios and disguise to capture the drivers and cars on a stage with LED side panels, a back wall and ceiling that was projecting the imagery, and thus enabling in-camera reflections. “In recent years, disguise has developed its extended reality [xR] workflow,” notes disguise Vice President of Virtual Production Addy Ghani. “xR represents the next generation of virtual production technology that is set to replace the standard practice of filming against blue or greenscreens for film, television and broadcast. With the disguise xR workflow, talent is filmed against large LED screens that feature real-time-generated photorealistic virtual scenes.” MGX Studios utilized three disguise content playback and compositing servers together with disguise’s RenderStream, a protocol that allows the user to control a third-party render engine, in this case Unreal Engine. “Unilumin Upad III LED walls and three Brompton Tessera SX40 LED processors allowed for high-quality graphics content to be displayed on the LED walls and in camera,” adds MGX Studios Virtual Production Operations Coordinator Mete Mümtaz. “A Mo-Sys camera tracking system, together with disguise RenderStream, ensured that graphics from Unreal Engine were updated according to the camera’s movements, making the scene three-dimensional and immersive.”
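The behavior Mümtaz describes, where the wall imagery updates with the tracked camera so the scene reads as three-dimensional, rests on rendering the virtual scene through an asymmetric frustum computed from the camera’s position relative to the screen. The sketch below shows that core calculation using the standard generalized perspective projection; it is a conceptual illustration rather than disguise’s RenderStream or Mo-Sys code, and the wall dimensions and camera pose are invented. In a working volume this is recomputed every frame from the incoming tracking data before the result is handed to the render engine feeding the LED processors.

```python
# Off-axis (asymmetric) frustum for a planar screen seen from a tracked camera.
# Conceptual sketch of the standard "generalized perspective projection".
import numpy as np

def norm(v):
    return v / np.linalg.norm(v)

def offaxis_projection(eye, lower_left, lower_right, upper_left, near, far):
    """Return a 4x4 OpenGL-style projection matrix for a planar screen viewed
    from `eye`; pair it with a view matrix built from the screen basis and -eye."""
    vr = norm(lower_right - lower_left)      # screen right axis
    vu = norm(upper_left - lower_left)       # screen up axis
    vn = norm(np.cross(vr, vu))              # screen normal, toward the camera
    va, vb, vc = lower_left - eye, lower_right - eye, upper_left - eye
    d = -np.dot(va, vn)                      # eye-to-screen distance
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d
    return np.array([
        [2 * near / (r - l), 0.0, (r + l) / (r - l), 0.0],
        [0.0, 2 * near / (t - b), (t + b) / (t - b), 0.0],
        [0.0, 0.0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0.0, 0.0, -1.0, 0.0],
    ])

# Hypothetical 8m x 3m wall in the z=0 plane, camera tracked 5m back from it.
wall_ll = np.array([-4.0, 0.0, 0.0])
wall_lr = np.array([4.0, 0.0, 0.0])
wall_ul = np.array([-4.0, 3.0, 0.0])
camera_pos = np.array([0.5, 1.7, 5.0])
P = offaxis_projection(camera_pos, wall_ll, wall_lr, wall_ul, near=0.1, far=100.0)
```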



During the shoot, MGX’s team were able to control the content, the lighting and reflections on demand. Says Mümtaz, “We were also able to control the pre-made animations and mapping of the videos on the LED screen. Controlling the content in real time, ensuring that all departments, including the director and DP, can work in an integrated manner with the project, is of great importance for both speed and budget preference.”

CREATURE FEATURE IN YOUR PHONE

In a Google Play commercial called ‘Battle at Home Screen,’ a raft of key hero and villain characters from Blizzard’s Diablo Immortal game emerge from inside a Google Pixel phone to engage in battle. This all happens on the surface of a coffee table, and includes the perspective of a live-action character who enters the room. It was brought to life by virtual production studio Impossible Objects, working with agency Omelet. “From creative direction to producing virtual environments, assets, real-time previsualization, a hybrid live-action shoot and intricate animation sequences,” notes Impossible Objects founder Joe Sill, “we took the film from end to end in a more efficient and more creatively satisfying experience than we could have imagined.”

Impossible Objects relied on its virtual art department (VAD) and previs teams to plan the action first in Unreal Engine, finding that the game engine approach allowed them to make fast iterations. “We did make use of a simple VR scout so that a sense of real-world depth was understood, but we found that just using a live camera capture setup to explore the set was quicker and more intuitive to aiding previs,” explains Impossible Objects Head of Technology Luc Delamare. “This process was repeated for both the true-scale view of the set for the live-action integration, as well as the miniature portion where the phone screen is the size of a football field.”

“Running in parallel was our character team, prepping the Blizzard/Diablo characters for Unreal Engine as well as animation in Maya,” says Delamare. “The animation pipeline was unique in that the animation team’s work was also immediately ingested and pushed through Unreal Engine so that the feedback loop included the real-time visualization of the characters in UE and not only Maya playblasts.”

Meanwhile, on the hybrid virtual production side, Impossible Objects took the same virtual interior used in the fully in-engine portion and brought that to the live-action stage via a combination of real-time rendering, live camera tracking and a live composite. “This high-fidelity and real-time comp of our scene allows the entire team to work in a semi-tangible environment despite filming on chroma green,” mentions Delamare.

“We needed to have epic sequences that felt cinematic, high stakes and realistic,” comments Sill. “The action choreography had to be precise, intentional, and through a highly iterative and non-linear process we were all able to align on creative choices in a more efficient manner than before.”
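For readers unfamiliar with the live-composite step Delamare mentions, the toy sketch below keys a greenscreen frame and lays it over a rendered background, the same key-and-over idea in miniature. Production systems use far more sophisticated keyers running in real time on the GPU; the key color, tolerance and image sizes here are arbitrary illustrative values.

```python
# Toy greenscreen key and "over" composite; images are float32 arrays in [0, 1].
import numpy as np

def green_screen_matte(fg, key=(0.0, 1.0, 0.0), tolerance=0.45, softness=0.15):
    """Return an alpha matte: ~1 on the foreground subject, ~0 on the green screen."""
    dist = np.linalg.norm(fg - np.asarray(key, dtype=np.float32), axis=-1)
    # Ramp from transparent (close to the key color) to opaque (far from it).
    alpha = np.clip((dist - tolerance) / softness, 0.0, 1.0)
    return alpha[..., None]

def composite(fg, bg):
    """Lay the keyed foreground over a rendered background frame."""
    alpha = green_screen_matte(fg)
    return fg * alpha + bg * (1.0 - alpha)

if __name__ == "__main__":
    h, w = 540, 960
    fg = np.zeros((h, w, 3), np.float32)
    fg[:] = (0.0, 1.0, 0.0)                    # green stage
    fg[200:400, 400:600] = (0.8, 0.6, 0.5)     # stand-in "actor"
    bg = np.full((h, w, 3), 0.12, np.float32)  # flat CG background plate
    out = composite(fg, bg)
```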

TOP TO BOTTOM: Impossible Objects relied on game engine tools to visualize the spot and aid in realizing the shoot. (Image courtesy Impossible Objects) Unreal Engine user interface view in relation to Pixomondo’s LED wall stage. (Image courtesy of Pixomondo) Lighting control and real-time game engine tools combined for the Resorts World Las Vegas production. (Image courtesy of Extended Reality Group) Cities and colorful light displays transported the drivers to different locations, with production orchestrated by MGX Studios using disguise tools. (Image courtesy of disguise/MGX Studios)



TECH & TOOLS

TAKING PREVIS TOOLS TO THE NEXT LEVEL By IAN FAILES

It wasn’t too long ago that many previsualization studios had only just begun adopting game engines and real-time tools to help deliver their ‘viz.’ Now, a number of outfits have extended their efforts in real-time with bespoke tools, like virtual cameras, scouting apps or specific software solutions to help creators visualize scenes. Here’s a look at the state of play in previs and virtual production.

PACKAGING UP A VIRTUAL CAMERA SOLUTION

TOP: Techvis frames from NVIZ crafted using ARENA for a sequence involving a mountain climb and goats in The King’s Man. (Image courtesy of NVIZ) BOTTOM: The ARENA system from NVIZ enables virtual camera shooting to help visualize scenes, in this case from The Midnight Sky. (Image courtesy of NVIZ)

As part of its previsualization services, NVIZ has built a virtual production system it calls ARENA, essentially software that drives a virtual camera within an Unreal Engine scene. “One operator uses the machine running the UE4 scene and can lens up, lens down, pull focus, narrow the depth of field and cue the animation within the scene,” details Janek Lender, Head of Visualization at NVIZ. “ARENA can also take shots, or takes, and run the animation within the shot at different speeds for slow-motion capture. The iPad operator simply uses the iPad as a viewfinder to frame up shots to capture. Once a series of shots or takes have been created, we can look through to make selects and cut these together straight away to create a previs cut.” “ARENA is the result of all our experience in the field of visualizing camera work,” states NVIZ Real-time Supervisor Eolan Power. “It’s the result of years of experimenting, iterating and finding the most useful approach to explore filmmaking. It’s a really useful and unobtrusive tool for concepting shots, especially those that involve VFX and is made possible by our ongoing work in the Unreal Engine.” ARENA came into play for previs on such projects as The Midnight Sky and The King’s Man. “I think most sequences in films that are in their blocking stages could benefit from exploring with a virtual camera and figuring it all out before they commit

to anything,” says Power, on why he suggests something like ARENA can come in handy for creators. “It usually gets deployed on VFX-heavy films, but we see more and more it’s becoming a thinking tool for the creative leads on all sorts of projects.” One aim of ARENA, apart from being just one aspect of the viz effort on any project, is fast deployment and a small footprint on set, in the studio or on location, with Lender noting that it requires just two people and can be used pretty much anywhere. “On The Midnight Sky, for example, we were deploying it in mountaineer huts next to a glacier in Iceland, so it has definitely proven itself both quick and robust.”

TAKING ON-SET EXPERTISE, APPLYING IT TO AN ACCESSIBLE APP

In addition to the previsualization and virtual production services it offers to clients, game engine maker Unity Technologies now also makes a virtual camera solution that anyone can use. This consists of two elements: a Unity package called Live Capture and an app on the iOS App Store called Unity Virtual Camera. “Together, these let you use an iPhone or iPad as a virtual camera,” relates Habib Zargarpour, Virtual Production Supervisor at Unity Technologies, who notes that there are facial capture capabilities and real-world camera rigs as part of the Live Capture package. “The images of the real-time rendering out of Unity also get sent back to the mobile device so that you can freely compose and shoot shots without having to look at the computer monitor or a TV.” Zargarpour advises that the Unity Virtual Camera was recently used on an LED wall shoot for the Shani ft. Andy “Breathe Free” music video. “This involved using the tracking of the iPhone 12 Pro with LiDAR mounted onto a SONY VENICE camera to give us the parallax of the camera onto the LED backdrop. When the camera would elevate or lower itself or move laterally, you could see the shift in perspective. In addition, the director of the project used the Virtual Camera to create all CG shots using the same assets she had on the LED screen. This gave a perfect match for continuity to the piece.” One of the toughest aspects of making the Unity Virtual Camera as accessible as possible, says Zargarpour, was ensuring the streaming of the real-time renders back to the device occurred smoothly with minimal lag, making it feel as natural as moving a physical camera. “The streaming was a very important feature that we needed to have so that filmmakers could compose easily and use touch to tap focus on an element or zoom in to a subject. Our goal is to make powerful tools for filmmakers that are also intuitive for them to operate. The real-time tools provide a powerful means of communication for creators to visualize their projects by collaborating with production designers, DPs and VFX.”

WHEN YOU NEED TO ART DIRECT THE SKY

The Virtual Art Department or VAD is now a department often tied in closely with visualization and virtual production studios.

TOP TO BOTTOM: A previs frame by NVIZ for The Midnight Sky. (Image courtesy of NVIZ) On the LED wall set shoot for Shani ft. Andy’s “Breathe Free” music video, on which Unity’s Virtual Camera app was used. (Image courtesy of Unity Technologies) Closeup on the Unity Virtual Camera app running on a tablet. (Image courtesy of Unity Technologies) A virtual production shoot with a virtual camera being operated by Unity’s Habib Zargarpour. (Image courtesy of Unity Technologies)




One aspect of this department’s work has been to give creators control over virtual environments, especially now in real-time. Halon Entertainment, now an NEP Virtual Studios company, has built a VAD tool in Unreal Engine that lets creatives art-direct skies, that is, hand-place clouds in the environment. “The tool allows us to dress the skies from the ground or the air, and maintains the physics behind how clouds form and look depending on altitude,” outlines Halon Virtual Art Department Supervisor Jess Marley. “This ranges from the stratus, cumulus, to the cirrus, as well as giving the option to load in an HDRI if the client has a specific direction or reference we must adhere to. We are also introducing modular set builders/configurators and tools like Houdini in Unreal Engine to be quick and non-destructive.” “Being able to visualize and compose the skies is extremely important, especially when you are viewing them at 20,000 feet,” adds Marley. “You get a better sense of speed, depth, and can better tell a story simply using only vapor, which is pretty cool.” Making their cloud tool in a time when viz can be seen both on a computer screen during design and also on the often more frantic location of an LED stage, for example, meant that Halon had to pay close attention to the real-time rendering challenges that clouds bring. “With this tool,” says Marley, “the ability to have realistic/interactive clouds without killing frame rate was the hardest challenge. So, using some in-engine techniques, only a few textures, and introducing noise that looks believable and does not look repetitive or CG, was key.” “It is a constant balance of fidelity and functionality. We are quickly bridging that gap with Unreal, and it’s only getting better with the release of UE5.”

PREVIS’ING REALITY WITH AUGMENTED REALITY

TOP TO BOTTOM: Halon Entertainment’s tool built inside Unreal Engine to art direct real-time skies. (Image courtesy of Halon Entertainment) Clouds and terrain can be visualized with Halon’s tool. (Image courtesy of Halon Entertainment) An Unreal Engine render of a sky and cloud landscape from Halon’s real-time toolset. (Image courtesy of Halon Entertainment)

The latest tool coming out of The Third Floor lets creators view virtual set builds or models in a highly collaborative setting using Microsoft HoloLens AR glasses. Called Chimera, the tool allows users to see the virtual set model while still being able to see and collaborate naturally with each other, as The Third Floor Co-founder/VFX Supervisor Eric Carney explains. “The virtual set or model appears in a view shared by each user so they interact with the asset in a similar manner to discussing or planning a real set or model. Additionally, there is an operator in the session who can make changes to the virtual set or model that are seen by the group on the fly. The operator can move, add or remove aspects of the model, as well as cycle through different design ideas.” To make Chimera, the studio relied on Unreal Engine for the real-time graphic capabilities. “We wanted to make sure both novice and experienced users would be comfortable using the tool,” notes Carney. “We worked with leading production designers and art directors to design Chimera to work for their real-world needs. Additionally, we needed to develop the multiuser aspects of the system including the desktop operator. In addition, the system allows users to join a review session via iPad or VR along with the AR users.”



A Previs Journey into Real-time

Visualization studio Day for Nite is one of several outfits that has leaped into the world of game engine pipelines for previs, specifically Unreal Engine. Senior Asset Supervisor Gustav Lonergan examines why the studio made the switch and the changes it’s made for the studio’s previs pipeline. Lonergan says Day for Nite has, since adopting Unreal Engine in 2019, been developing its own tools to integrate the studio’s existing pipeline to “work well with exports, streamline renders, and breach into a different level of visualization where we can help improve speed and costs in other areas of production. Since we already had a well-established pipeline for both assets and animation, it was not without investment to adapt to the new pipeline, but we definitely noticed the benefits on our very first job.” Work is continuing at Day for Nite to integrate Maya (a staple of previs work) and Unreal Engine. Lonergan adds that “other tools that became indispensable for development were for sure some that we didn’t expect to reach pre-production so soon, like Substance Painter, ZBrush and Marvelous Designer.” In terms of recent Day for Nite projects where the game engine has been used for previs, Lonergan pinpoints one where “we definitely had so many things that instantly became available to us in a scale that we never had before – crowds, dynamic effects, ways to visualize entire sequences, and cut the sequence before even rendering to check on continuity, lighting, and other things that sometimes would have taken the artist extra time to go through the regular pipeline. And we got to work directly with photography to visualize realistic lighting and camera setups that were not possible in the regular previs and postvis work.” “Other than those technical aspects of the work,” adds Lonergan, “we benefit in getting something that looks a bit better, and working directly with the art department gives clients not only a better idea of action and cut, but also an overall better chance of seeing their lookdev in action and make changes in parallel with our pipeline.”

One use case The Third Floor sees for Chimera, in particular, is reviewing virtual ‘White-Card’ models of sets. The virtual White-Card model can be reviewed on a table or scaled up to life size so that users can walk around in the virtual set. “Chimera can even be used on a stage or location before construction has started,” says Carney. “Chimera can also be used on a stage in a typical ‘set tape out’ process where the art department will mark out the footprint of a set prior to beginning construction. With Chimera, users can now walk the entire extent of the set, seeing the planned build, to better understand the scope and scale of the set and make last-minute changes before construction starts.”

TOP TO BOTTOM: The Third Floor’s Chimera tool uses Microsoft HoloLens AR glasses to help immerse creators inside virtual areas. (Image courtesy of The Third Floor) The Third Floor Chimera setup can be used to review scale table-sized models. (Image courtesy of The Third Floor) Chimera users can review scenes together by both wearing the AR glasses. (Image courtesy of The Third Floor) A still from Day for Nite’s short film FAE created in Unreal Engine. (Image courtesy of Day for Nite)



TECH & TOOLS

UNCOMPLICATING HOW TO BUILD COMPLEX VFX ASSETS

When visual effects artists build computer-generated creatures, characters or environments, they certainly don’t do things by halves. And that means the resulting CG assets tend to be, well, complicated. Here, a range of modelers, texture painters, riggers, lighters, FX and lookdev artists as well as crowd, animation and CG supervisors break down how they made some of their most complex assets in recent times.

By IAN FAILES

A HUMAN-LIKE CREATURE MADE UP OF… HUMAN-LIKE CREATURES

TOP: Scanline VFX’s model of ‘The Long Boy’ in Lisey’s Story. (Image courtesy of Apple TV+) OPPOSITE TOP TO BOTTOM: Crowd simulation and animation pass. (Image courtesy of Apple TV+) The different elements are composited together. (Image courtesy of Apple TV+) Final shot by Scanline VFX. (Image courtesy of 2021 Apple TV+)

The miniseries Lisey’s Story sees the emergence of a gigantic human-like creature realized by Scanline VFX called ‘The Long Boy,’ made up of ‘countless suffering souls.’ Artists began a sculpt, as modeler Kurtis Dawe states, “by splitting it into parts and tackling the individual corpses first. We made five different types of corpses with varying body shapes that were distributed randomly to make up The Long Boy’s body. A big challenge was also finding a clever way to use these individual corpses to construct his different body parts in a way that could function.” The animation department painstakingly posed the low-resolution rigs of individual corpses and elements to match the sculpt. “The Long Boy rig, with low-resolution static corpses, was used by animation in each shot,” says Crowd Department Supervisor Sallu Kazi. “For the final round of animation, the rig was pushed through a Golaem plugin in Maya, which would read the data stored on the rig, recreate the exact poses and layer the motion from the animation library, including facial performances, on top of each individual corpse. The final rig had 590 individual corpses, 486 heads, 480 arms and three legs.”



Performance capture, in addition to keyframe animation, also came into play, according to Animation Supervisor Mattias Brunosson. “It was a way to quickly collect a wide variety of motions for the corpses to be able to match what The Long Boy was emoting in the scene, from frantic, painful convulsions to more subdued motions. We then created a library of cycles that both the animators and the crowd department used to give life to all the corpses making up The Long Boy. Facial capture was used for the corpses as well. The performances were adjusted and edited to fit with the cycles we created for the bodies.” Reference that showed malnourishment and decay on human bodies was used to guide texturing. “The blood on the corpses had a deep red color tone and a matte quality, as plenty of time has passed and the blood on the corpses would have dried up,” explains CG Supervisor Pablovsky Ramos-Nieves. “We referenced images of burned corpses and added that look below the layer of blood. The skin had to look and feel like flesh, so those parts had more of a glossy and reflective sheen that allowed for the creature to receive nice highlights from the moonlight. We also dressed up some of the corpses with damaged and weathered outfits. Finally, we added an extra layer of wet and dried mud to add more color variation as it dragged its body over the ground.” Interaction with plant life required FX simulations from Scanline. “We wanted to achieve precise interaction with the moving bodies, each leaf and blade of grass,” notes FX Supervisor Marcel Kern. “In shots where the creature is moving quickly through the forest, we used a custom FX rig to push down all the grass and break every tree in his path. We also created skin deformation simulations to have a transition from The Long Boy’s body to all the corpses attached to him.”

CREATING CARNAGE

DNEG’s Carnage asset for Venom: Let There Be Carnage needed to facilitate high-octane fight scenes, shape-shifting and extensive tentacle movement. “Like any other complex asset, a simple start with a very strong foundation is key,” notes Lead Modeler/Sculptor Lucas Cuenca. “Carnage was built like an action figure where you are able to interchange arms, legs and tentacles.” To sculpt Carnage, Cuenca says he started with human anatomy but made it very distorted. “The muscles didn’t need to be attached to the bones; instead, they were attached to other muscles, sometimes shaped like bones. The sculpture had to sell the idea that Carnage was able to shift shapes, add weapons to the arms and extrude tentacles out of any part of the body, but at the same time he had to feel strong and have a structure to him.” The Carnage animation rig was built in a modular way to aid in the morphing nature of the character, as Animation Supervisor Ricardo Silva details. “The base is a standard human rig, but then he also has all these extra parts that can be loaded and attached when needed. The rig had a lot of flexibility with unconventional deformations like stretching and scale for bone joints. This was mostly used for the transformation shots where Carnage’s shape and size had to conform to Cletus for the morphing effect.” “Our rigging/creature FX team also created a Ziva simulation
rig,” adds CG Supervisor Eve Levasseur-Marineau. “This took into account his unconventional exposed musculature and allowed us to both introduce a layer of dynamics but also relax any issue in the topology introduced by the extreme poses. Our tentacle rig was completely overhauled for the show to make it much more versatile and faster for the animation team to work with.” DNEG integrated Houdini Engine into its Carnage Maya rig approach, enabling the animators to load a tentacle rig and create the performance needed by visualizing a simple cylinder shape. “We would just turn on the OTL node and have a very accurate representation of how things look like coming out of Houdini,” explains Silva. “We had a list of attributes that the animators can adjust in the OTL and see the changes in real-time. Once the shot was approved, it was baked down and sent to Houdini where the FX artist picked up the same attributes and ran a more refined simulation with extra detail.” “As animation cached the branching tentacles,” continues Levasseur-Marineau, “the relevant attributes would get included and ingested by the FX team, who would use it as a base to add further noise and deformation to create the organic look of the deformation, add a layer of veins flowing across the tentacles, a layer of translucent membrane goo and a connective goo between the different layers, emphasizing the symbiotic feel of the tentacles.”




DROID DESIGNS

Solo: A Star Wars Story featured L3-37, a character brought to life on set by actor Phoebe Waller-Bridge wearing partial practical pieces of the droid. Industrial Light & Magic (ILM) also crafted L3 entirely in CG, sharing final shots with Hybride. “We had a high-resolution LiDAR scan of L3 and also a detailed set of photos taken at 360 degrees, plus different angles for maximum coverage for both the costume and also the life-size prop that production had constructed,” explains ILM Senior Modeler Gerald Blaise. “First, I had to model the life-size prop of L3, matching every element of it with exacting precision, prioritizing the parts from the costume and then the mechanical parts. Once that was done, I imported the scan of Phoebe Waller-Bridge to compare and evaluate how it sat on her body when she was wearing it and how it interacted with her when standing in various poses.” From there, Blaise reverse-engineered all the possible mechanical movements that would be needed for L3 to perform and also for VFX artists to ultimately replace Waller-Bridge’s visible body parts with the droid parts. “We went back and forth with our rigging department to test the movements, make adjustments where necessary, and to check the range of motion to ensure we were staying true to the range of movement that Phoebe was actually able to achieve on set in the physical costume,” says Blaise. L3 is constructed from various astromech parts, so Blaise studied different droids from across the Star Wars saga for reference in the build. Maya was used for modeling and KeyShot to add some base shading materials before the model was texture-painted. “This would give me a rough first representation of it to compare to the real costume. From there, I could make sure all of the edges and thickness of each part felt right in an environment.” After the droid was modeled, rigged and textured, it was shared with Hybride for shot production. Sharing models has indeed become so much more common within the VFX industry. “Outside of the technicality of texture color pipeline and shading to match the look we created for her,” outlines Blaise, “Hybride also had to ingest our specialized rig and learn the way it was used in order to ‘drive’ L3. Both the ILM animators and the team at Hybride did a terrific job with L3, and it was a real pleasure seeing her come to life on screen in totality, after many dailies sessions watching our pieces replace the green spandex elements of Phoebe’s costume and match seamlessly to the hard surface part of the L3 costume.”

RENDERING THE RED ROOM

TOP TO BOTTOM: The rendered character and composited shot by DNEG. (Image courtesy of Sony Pictures) DNEG concept art showcasing the heavily tentacled body of Carnage in Venom: Let There Be Carnage. (Image courtesy of Sony Pictures) DNEG integrated Houdini Engine into its Carnage Maya rig for the character. (Image courtesy of Sony Pictures)

The airborne Red Room seen in the finale of Black Widow was a Digital Domain creation, with the studio crafting a structure made of 355 exterior CG assets, and then another 700 CG assets, including debris, for its crash to earth. The hallmarks of the Red Room design are several arms connected to a massive central tower. “The arms house airstrips, fuel modules, solar panels and cargo,” outlines CG Supervisor Ryan Duhaime. “Details like ladders, doorways and railings were designed and added to maintain a sense of scale. We also established two hero arms that required high-res

geometry to integrate seamlessly with the live-action footage by matching LiDAR scans for the physical set piece runways, hallways and containment cells. We then instanced the individual assets, like beams, scaffolding and flooring by combining them into layouts.” “We had the arms and the props on them divided into multiple sections, which allowed for greater control in what we would see that was left behind as the destruction took place,” continues Lead Layout Artist Natalie Delfs. “There is also a ton of movement throughout the ship during the sequence, so we knew we would need to build a wide variety of layouts for different areas inside and out of the ship as the location of the action changed. This included building damaged and undamaged layouts for multiple areas.” Texturing of the Red Room was established in Substance Painter starting with five main materials. “The number of shaders increased as we went, based on the level of detail and complexity,” says texture artist Bo Kwon. “We also created separate grungy shaders to be added on top of our base layers. Using these smart materials gave us a fast turnaround of assets to build the foundation, and to decide the look of the Red Room. For the closeup shots and hero assets, we then imported those maps into Mari to add more details.” Lookdev occurred concurrently to texturing, as Lookdev Lead James Stuart describes. “We looked into the Redshift material and the components which we could base all of our lookdevs on. In the end, we decided to utilize a metalness workflow from Substance. The Red Room is mostly made up of metallic objects, so we created base materials for bare metal, painted metal, rust and scratches. Once we found the look we wanted with these materials, we were able to provide baseline values for all the components we needed from textures, including diffuse, specular roughness, metalness and bump.” Lighting was deceptively simple, suggests Lighting Lead Tim Crowson, “but ultimately required a lot of fine-tuning at the shot level. Our DFX Supervisor, Hanzhi Tang, captured a very high-dynamic-range image of a sunset, which we used as the basis for our exterior lighting. We split this into a couple of lights to maintain artistic control over fill and key, and established this as our starting point for shot lighting. The other vital component to selling this kind of high-altitude, high-speed traversal was the use of dynamic shadows from surrounding objects, sometimes cast by clouds, sometimes by falling debris. It was imperative to include shadowing from these surrounding objects to sell the sense of speed and integrate our CG elements fully.” Overall, Digital Domain used a wealth of tools for the Red Room, as Duhaime observes. “We utilized Foundry’s Nuke for our compositing needs; Houdini’s Mantra for clouds, dust, fire, smoke and initial FX destruction renders; Houdini for our FX dev, CFX groom and cloth simulations; V-Ray for our hero character digital doubles; Redshift for hard-surface objects like the Red Room, vehicles and the destruction renders; Substance Painter and Mari for our texture painting needs; and Maya for our general modeling, animation, lookdev, lighting and layout setups.”

TOP TO BOTTOM: The droid L3-37 as seen in Solo: A Star Wars Story. (Image courtesy of Lucasfilm) Actor Phoebe Waller-Bridge performed L3 in partial costume and practical droid pieces. (Image courtesy of Lucasfilm) The final droid on screen was a mix of live-action and CG droid, accomplished in the film by ILM and Hybride. (Image courtesy of Lucasfilm) Digital Domain’s Red Room asset for Black Widow was made up of more than 17,000 instances and 1.7 billion triangles. (Image courtesy of Marvel Studios)



[ VES SECTION SPOTLIGHT: BAY AREA ]

Bay Area Revisited: Lifting Up Diverse Voices in VFX By NAOMI GOLDMAN

TOP: VES Bay Area “Virtual FAANG Panel” held during the pandemic, featuring Polly Ing, Adobe; Leslie Valentino, Facebook Reality Labs; Maggie Oh, Google; and Alan Boucek, Tippett Studio, moderated by Camille Eden, Nickelodeon. MIDDLE: VES Bay Area hosted a virtual workshop on “Shifting Gears: Next Level Storytelling” featuring Diana Williams, Kinetic Energy Entertainment; Colum Slevin, Facebook Reality Labs; Miles Perkins, Epic; and Alonso Martinez, Google AI, moderated by Corey Rosen, Tippett Studio, and Maggie Oh, Google. BOTTOM, OPPOSITE TOP AND MIDDLE: VES Bay Area members and guests enjoy the Summer 2021 picnic with special friends from Star Wars joining the festivities. (Photos: Anet Hershey Photography)

The VES’ international presence grows stronger every year, and so much of that is because of our regional VFX communities and their work to advance the Society and bring people together. VES Bay Area is the Society’s first and oldest Section, and continues to thrive with 500+ members. Home to Lucasfilm, ILM, Pixar and Tippett Studio, as well as Dolby HQ, dozens of smaller VFX studios focused on commercial production and games and the entirety of Silicon Valley, the Section is proud to carry the distinction of being one of the only places that lives at the center of these various endeavors. Like the rest of the industry, VES Bay Area revised its programming during the pandemic to a largely virtual platform to address the needs of its members and serve as a point of comfort, connection and continuity during a time of uncertainty. The Section invested in producing events focused on Virtual Production, Working Remotely, Creative Storytelling and leveraging VFX skills in other industries to provide robust education and forums for networking, career development and social interaction. “We forged ahead with a series of educational programs billed as FAANG events (Facebook, Amazon, Apple, Netflix, Google), given the booming growth of tech companies in the region and need for people with visual effects skills, especially as these companies move from being service and tech entities to media production companies creating art,” said Corey Rosen, VES Bay Area Treasurer. “These forums have been great opportunities for people to share their experiences on pivoting from working at traditional VFX companies to career opportunities in the tech field. It was easy to keep these programs going during the pandemic and invite diverse voices to be represented on these panels.” The Section’s efforts to address issues around diversity, equity and inclusion (DEI) were borne out of the FAANG panels and deepened. “From the experience with the FAANG events, we wanted to further the topic of DEI and anti-racism, and at the end of 2021, we started an anti-racism book club,” continued Rosen. “Our first book was How to Be an Antiracist by Ibram X. Kendi, and our members have shared that these moderated conversations have been among the most meaningful events they have been a part of during the pandemic. It is very rewarding to know that we are delivering something with that kind of value and impact to our community.” Continuing on the theme of diversity, the Section is focused on expanding the segments of the industry represented in its membership and finding ways to highlight the talent on the Board and the entire member community. “It’s very exciting to be working with a Board that is intergenerational and boasts leaders who bring diversity in thought and life experience,” said Alan Boucek, VES Bay Area Chair. “We have a great amalgam of voices across our leadership, who are propelling us to program on ‘the new and the now’ – not just on retrospectives that harken to our history.” As the pandemic ebbed and flowed, VES Bay Area held a number of in-person events as allowed by health protocols. In 2021, the

Section held screenings of The Matrix Resurrections two weeks before it opened, Shang-Chi and the Legend of the Ten Rings and Eternals. The Section was overjoyed to be able to host its two cornerstone events in person – the Annual Summer BBQ (featuring beloved Star Wars characters as special guests) and the Holiday Party in December. Boucek continued, “Building on all that we accomplished over the last two years, I’m interested in developing live and virtual programming that resonates with many different sub-communities in the industry. That ranges from convening a forum of freelancers to share their experiences navigating challenges and opportunities, to further engaging people from the games industry, to gathering alumni of the Unreal fellowship program to share lessons learned, to using our resources to build and support the pipeline of younger VFX professionals to help them get the requisite experience needed to qualify for VES membership. There is so much we are doing and can be doing to nurture aspiring professionals and deepen our connections across the industry.” Marking the Society’s milestone 25th anniversary, the Section leaders shared their insights. “As a community, we are all interconnected, and have genuine pride and a sense of support for each other,” said Boucek. “The VES brings people together to share the passion and challenges of what it means to be a working VFX professional. The essence of the VES is being surrounded by people who see you and know you and understand you. We are the whole sum experience of our members, and we provide an opportunity for shared connection that is truly unique in all the world.” “In the early part of my career at ILM – before I was a VES member – I recall the industry as being contentious and tribal, with people reticent and fearful to share knowledge and information,” said Rosen. “I truly feel the VES opened up that dialogue and community, which was transformational in my life and career. In 25 years, the VES has proven that we can break down barriers and build meaningful connections and we have so much to be proud of.”

BOTTOM: VES Bay Area members and guests celebrate the season at the 2021 Holiday Party. (Photo: Anet Hershey Photography)



[ THE VES HANDBOOK ]

How to Expose a Green Screen Shot, and Why
By BILL TAYLOR, ASC, VES
Edited for this publication by Jeffrey A. Okun, VES
Abstracted from The VES Handbook of Visual Effects – 3rd Edition, edited by Jeffrey A. Okun, VES and Susan Zwerman, VES

Figure 3.8: Schematic H&D curve. This graph shows how the color negative responds to increasing exposure. In film, each color record has a linear section, where density increases in direct proportion to exposure, and a “toe” and a “shoulder” where shadows and highlights respectively can still be distinguished but are compressed. The “white point” is shown for all three records: the density of a fully exposed white shirt that still has detail. Figure 3.9: Normal smoke plume with ND filter strip, underexposed plume and strip with contrast boosted. Images courtesy of Bill Taylor, ASC, VES.

Balancing Screen Brightness to the Shooting Stop

Let us assume that the camera choices are optimal, screen materials and lighting are ideal, and the foreground lighting matches the background lighting perfectly. A common misconception is that backing brightness should be adjusted to match the level of foreground illumination, for example “one stop below key.” In fact, the optimum backing brightness depends only on the f-stop at which the scene is shot. Thus, normally-lit day scenes and low-key night scenes require the same backing brightness if the appropriate f-stop is the same for both scenes. The goal is to achieve the same green density on the negative, or at the sensor, in the backing area for every shot at any f-stop. The ideal green density is toward the upper end of the straight-line portion of the H and D curve, but not on the shoulder of this curve, where the values are compressed. Figure 3.8 shows an idealized “H&D curve.” Eight stops of exposure range can comfortably fit on the H&D curve, a range recently exceeded by digital cameras. Imagine a plume of black smoke shot against a white background. It is a perfect white: The measured brightness is the same in red, green and blue records.

For clarity, the range of transparencies in the smoke has been duplicated with a stepped series of neutral density filters. The densities in Figure 3.9 range from dead black to just a whisper. What exposure of that white backing will capture the full range of transparencies of that smoke plume? Obviously, it is the best compromise exposure that lands the white backing at the white point toward the top of the straight-line portion of the H&D curve in film (a white-shirt white) or a level of 90% in video, and brings most of the dark values in the smoke up off the toe. If the backing was overexposed, the thin wisps would be pushed onto the shoulder, compressed (or clipped in video), and pinched out by lens flare. If the backing was underexposed (reproduced as a shade of gray), detail in the darkest areas would fall on the toe, to be compressed or lost entirely. You could compensate for underexposure by boosting the image contrast or making a levels adjustment (remapping the brightness range to a wider gamut). As Figure 3.9 shows, boosting contrast makes the backing clear again and the blacks opaque, but tonal range is lost: the dark tones block up, the edges of the smoke become harder and noise is exaggerated.

The effect of a levels adjustment is subtler (and preferable), but extreme levels adjustments betray the missing tonal values with banding or blocking. Now imagine that instead of a white screen, we are shooting the smoke plume and the neutral density steps against a greenscreen and that the measured green brightness is the same as before. What is the best exposure for the green screen? Obviously, it is the same as before. The only difference is that the red- and blue-sensitive layers are not exposed. Just like the smoke plume, greenscreen foregrounds potentially contain a full range of transparencies. Transparent subject matter can include motion blur, smoke, glassware, reflections in glass windows, wispy hair, gauzy cloth and shadows. To reproduce the full range of transparency, the greenscreen should be fully – but not over – exposed. In other words, its brightness should match the green component of a well-exposed white object like a white shirt, roughly defined as the whitest white in the foreground that still has detail. (We do not want to expose that white shirt as top white, because we want to leave some headroom for specular reflections, on the shoulder in film, 100% and over in digital recording.)
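Both points lend themselves to a quick back-of-the-envelope check. The following Python sketch is illustrative only (it is not from the Handbook, and its reference values, such as a 100-unit backing brightness at T2.8 and an 8-bit quantization stand-in for the camera's tonal precision, are made up): the backing luminance that lands on the white point scales with the square of the shooting stop, and an underexposed backing "rescued" with a levels adjustment still carries far fewer distinct tonal steps through the smoke than a correctly exposed one.

import numpy as np

def backing_luminance(ref_luminance, ref_stop, shooting_stop):
    # Exposure at the sensor is proportional to luminance / N^2 (shutter and
    # ISO held constant), so the backing must brighten as the lens stops down
    # to land the green record on the same density every time.
    return ref_luminance * (shooting_stop / ref_stop) ** 2

print(backing_luminance(100.0, 2.8, 5.6))   # 400.0: two stops down needs 4x the light

transparency = np.linspace(0.0, 1.0, 1001)   # smoke plume / ND strip, opaque to clear
WHITE_POINT = 0.90                           # "white-shirt white" target level

def quantize8(signal):
    # Round a 0-1 signal to 8-bit code values, a crude stand-in for the
    # camera's limited tonal precision.
    return np.round(np.clip(signal, 0.0, 1.0) * 255) / 255

well_exposed = quantize8(transparency * WHITE_POINT)        # backing at the white point
underexposed = quantize8(transparency * WHITE_POINT / 4.0)  # backing two stops down

rescued = underexposed / underexposed.max()  # levels adjustment: backing reads full again

print(len(np.unique(well_exposed)))   # ~230 distinct steps through the plume
print(len(np.unique(underexposed)))   # ~58
print(len(np.unique(rescued)))        # still ~58: banding and blocked-up darks

However the reference numbers are chosen, the takeaway matches Figure 3.9: code values that were never recorded cannot be conjured back by grading, which is why the backing should be exposed to the white point in the first place.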





[ FINAL FRAME ]

Hi-Yo Episodic!

There is some debate about what was the very first episodic or serialized TV show now that streaming and broadcast television have been refining both of those concepts spectacularly and innovatively in more recent years. On the VFX side of things, streamers and broadcast entities have brought both episodic and serialized shows to never-before-seen levels akin to cinematic movie-making. Examples are Game of Thrones, The Mandalorian, Halo, Star Trek: Picard, Stranger Things and many others. Viewers can’t get enough. Episodic suggests self-contained stories about the same characters, while serialized suggests a story that takes multiple installments to tell. In some instances, today, episodic and serialized have merged into hybrid extravaganzas. Some TV historians point to The Lone Ranger as the first true episodic. It aired from 1949 to 1957, featuring Clayton Moore as the popular Texas Ranger and Jay Silverheels as his trusty companion Tonto. It spanned nine years and featured 221 full episodes. The show was considered episodic in that the stories were also self-contained. TV historians also point to the first true serialized nighttime show as Peyton Place, which came along in the 1960s, with the primetime genre fully blossoming in the 1970s with Mary Hartman, Mary Hartman. Daytime honors go to Guiding Light, a TV soap opera which started airing in 1952. It’s the longest running soap opera (1952-2009). Featuring any number of actors over the years, the series had multiple storylines around family, work ethic and climbing the social ladder – all with a spiritual flavor.

(Photo from The Lone Ranger courtesy of Universal Pictures Home Entertainment. Image from The Mandalorian courtesy of Disney+)






