KINGDOM OF THE PLANET OF THE APES
Welcome to the Summer 2024 issue of VFX Voice!
Thank you for being a part of the global VFX Voice community. We’re proud to be the ultimate source for everything VFX.
In this issue, our cover story ventures into Kingdom of the Planet of the Apes, the fourth installment in the acclaimed Planet of the Apes reboot franchise. We go behind the scenes of the dystopian film Civil War and the animated coming-of-age sequel Inside Out 2 and delve into Iwájú, a new streaming series set in a futuristic Lagos, Nigeria. We explore the art and craft of creature designers, glimpse into the XR future of video games and herald ‘The Age of the Storyteller,’ with original content calling for high-quality VFX. VFX Voice explores the debate about invisible CGI, looks at the rise of LED volumes in our virtual production feature and shines a light on VFX in Europe and the VES France Section.
We highlight the contenders for the 76th Emmy Awards and share personal profiles of VFX Supervisor Pablo Helman and this year’s VES Awards special honorees – sci-fi icon and VES Award for Creative Excellence recipient William Shatner and acclaimed Producer/VFX Producer Joyce Cox, VES, recipient of the VES Lifetime Achievement Award. And more!
Dive in and meet the innovators and risk-takers who push the boundaries of what’s possible to advance the field of visual effects.
Cheers!
Kim Davidson, Chair, VES Board of Directors
Nancy Ward, VES Executive Director
P.S. You can continue to catch exclusive stories between issues only available at VFXVoice.com. You can also get VFX Voice and VES updates by following us on X (Twitter) at @VFXSociety.
FEATURES
8 THE VFX EMMY: THE 76TH EMMY AWARDS
Up close with candidates in the running for this year’s prize.
24 COVER: KINGDOM OF THE PLANET OF THE APES
More nuanced, human-like digital cast in an upside-down world.
30 PROFILE: JOYCE COX, VES
VFX Producer walks the walk to keep the show on track.
34 PROFILE: WILLIAM SHATNER
Saluting a legend: the face of modern TV/film sci-fi VFX.
36 FILM: CIVIL WAR
VFX spikes the drama in Alex Garland’s American war zone.
40 PROFILE: PABLO HELMAN
VFX Supervisor went to ‘film school’ with Spielberg & Scorsese.
46 VFX TRENDS: THE CGI DEBATE
“No CGI was used in the making of this film” – or was it?
50 VFX TRENDS: CREATURE DESIGNERS
How digital artists bring characters and their worlds to life.
56 SPECIAL FOCUS: VFX IN EUROPE
Despite setbacks, VFX companies look for year-end bounce.
62 ANIMATION: INSIDE OUT 2
Expanding the cast of Emotions and vocabulary of shapes.
68 VIDEO GAMES: XR FUTURE
Industry in slow-growth mode as platforms invest in the future.
74 TV/STREAMING: THE AGE OF THE STORYTELLER
Original content driving the demand for high-quality VFX.
80 VIRTUAL PRODUCTION: LED VOLUMES
Virtual production gains spur broadening adoption of volumes.
86 TV/STREAMING: IWÁJÚ
Disney/Kugali partnership gives voice to African storytellers.
DEPARTMENTS
2 EXECUTIVE NOTE
90 THE VES HANDBOOK
92 VES SECTION SPOTLIGHT – FRANCE
94 VES NEWS
96 FINAL FRAME – JOHN CHAMBERS
ON THE COVER: An intimate moment between Soona (Lydia Peckham) and Noa (Owen Teague) in Kingdom of the Planet of the Apes (Image courtesy of Walt Disney Studios Motion Pictures)
VFXVOICE
Visit us online at vfxvoice.com
PUBLISHER
Jim McCullaugh publisher@vfxvoice.com
EDITOR
Ed Ochs editor@vfxvoice.com
CREATIVE
Alpanian Design Group alan@alpanian.com
ADVERTISING
Arlene Hansen Arlene-VFX@outlook.com
SUPERVISOR
Nancy Ward
CONTRIBUTING WRITERS
Naomi Goldman
Trevor Hogg
Chris McGowan
Barbara Robertson
Oliver Webb
ADVISORY COMMITTEE
David Bloom
Andrew Bly
Rob Bredow
Mike Chambers, VES
Lisa Cooke
Neil Corbould, VES
Irena Cronin
Kim Davidson
Paul Debevec, VES
Debbie Denise
Karen Dufilho
Paul Franklin
David Johnson, VES
Jim Morris, VES
Dennis Muren, ASC, VES
Sam Nicholson, ASC
Lori H. Schwartz
Eric Roth
Tom Atkin, Founder
Allen Battino, VES Logo Design
VISUAL EFFECTS SOCIETY
Nancy Ward, Executive Director
VES BOARD OF DIRECTORS
OFFICERS
Kim Davidson, Chair
Susan O’Neal, 1st Vice Chair
Janet Muswell Hamilton, VES, 2nd Vice Chair
Rita Cahill, Secretary
Jeffrey A. Okun, VES, Treasurer
DIRECTORS
Neishaw Ali, Alan Boucek, Kathryn Brillhart
Colin Campbell, Nicolas Casanova
Mike Chambers, VES, Emma Clifton Perry
Rose Duignan, Gavin Graham
Dennis Hoffman, Brooke Lyndon-Stanford
Quentin Martin, Maggie Oh, Robin Prybil
Jim Rygiel, Suhit Saha, Lopsie Schwartz
Lisa Sepp-Wilson, David Tanaka, VES
David Valentin, Bill Villarreal
Rebecca West, Sam Winkler
Philipp Wolf, Susan Zwerman, VES
ALTERNATES
Andrew Bly, Fred Chapman
John Decker, Tim McLaughlin
Ariele Podreider Lenzi, Daniel Rosen
Visual Effects Society
5805 Sepulveda Blvd., Suite 620
Sherman Oaks, CA 91411
Phone: (818) 981-7861 vesglobal.org
VES STAFF
Jim Sullivan, Director of Operations
Ben Schneider, Director of Membership Services
Charles Mesa, Media & Content Manager
Ross Auerbach, Program Manager
Colleen Kelly, Office Manager
Shannon Cassidy, Global Manager
P.J. Schumacher, Controller
Naomi Goldman, Public Relations
EMMY VFX HOPEFULS RISE TO THE CHALLENGE TO SERVE THE STORYTELLERS
By CHRIS McGOWAN
Big-screen VFX continues to stretch the small screen, and the effects keep getting more essential and cinematic, as evidenced by the shows eligible in the Outstanding Special Visual Effects categories (in a Season or a Movie or in a Single Episode) of the 76th Primetime Emmy Awards on September 15. The contenders represent a rich segment of the VFX work that has become the backbone of high-end episodic television, including world-building, digi-doubles, face replacements, de-aging, simulations, environmental and invisible VFX. Following is a look at some impressive VFX-infused TV/streaming shows poised for an Emmy nomination, with VFX supervisors revisiting their VFX highlights.
When it comes to world-building, no canvas is broader or more complex than sci-fi. “The largest VFX challenge on Season 2 of Foundation was executing all of the different types of visual effects required on the show while maintaining the quality we had established on Season 1,” states Visual Effects Supervisor Chris MacLean. “The variety of visual effects was daunting given we had to complete complex CG environments, CG creatures, giant CG mechas, CG destruction, water simulation, CG vehicles, and holograms, just to name a few. If we are talking about a challenging sequence, I would have to say that the escape from Synnax in Episode 202 was one of the most difficult. There were a lot of practical sets and stunt work that had to seamlessly integrate with CG water simulation. Beki, our domesticated Bishop’s Claw, was a huge win for us. Knowing how difficult it is to make a ridable CG animal feel grounded in reality, the team planned and executed this flawlessly.”
Jason Zimmerman, VFX Supervisor on Star Trek: Strange New Worlds for Paramount+, embraces the opportunity to add to the Star Trek legacy. “A big challenge with any Star Trek show is always working with canon on some of the fans’ most beloved characters, ships and effects,” he says. “In the case of Star Trek: Strange New Worlds Season 2, we saw more of the Gorn than had been seen in many years. In addition to the stellar work from Legacy Effects in creating the practical Gorn, we worked to augment it with additional facial animation, drool, breath, etc. We also did entirely CG shots of the Gorn or CG Gorn with practical actor interaction to help tell the story in the final episode of the season. It was crucial to our showrunners that we seamlessly integrate the CG moments with the practical to aid in the storytelling.”
Zimmerman points to the Gorn fight and the conclusion of the last episode of the season as peak experiences. “In addition to the full-CG Gorn, facial performance enhancements and fight sequence, we had the scene take place on a backdrop of a partially destroyed ship hull destined to crash. Combining the practical and CG fight action and set extensions inside, with the exterior full CG beats as the ship begins to enter the nearby planet’s atmosphere, was both challenging and fun to play with as a team and with our vendors. The full CG exterior shots and destroyed ship assets were massive, requiring quite a bit of simulation in the debris field, re-entry fire and smoke, etc. Cut together with the interior fight scene with CG Gorn, along with the eventual escape of our heroes to the exterior in what became a full-CG shot with digi-doubles, was quite challenging but ended up as one of our favorite shots during our tenure on the show.”
TOP AND MIDDLE: Complex CG environments, CG creatures, giant CG mechas, CG destruction, water simulation, CG vehicles and holograms were just a few of the many VFX tasks required on Season 2 of Foundation. (Images courtesy of AppleTV+)
BOTTOM: Beki, the domesticated Bishop’s Claw, was a big success for VFX on Season 2 of Foundation, knowing how difficult it was to make a ridable CG animal grounded in reality. (Image courtesy of AppleTV+)
Orchestrating the wide variety and different types of VFX work necessary to bring the story to life was a daunting task for Jay Worth, Visual Effects Supervisor on Fallout for Amazon Prime Video. “I have worked on shows in the past where they were primarily a set extension show or genre or world-building,” Worth says. “However, in this one we had everything. We needed to help create the overall look and feel of the world we were inhabiting. We needed to develop multiple real-time environments for use in Unreal for shooting on a volume. We had multiple creatures with various skins and textures. We had human characters that needed photorealistic replacements to portions of their face. We had characters we were de-aging using new and cutting-edge methods. And unique hard-surface vehicles that needed to match 1:1 to practical production vehicles – as well as a lead character that needed to have a CG nose replacement throughout the entire series.”
Worth and his team “fell in love” with the Cyclops. “I remember [Executive Producer] Graham [Wagner] calling me to say they wanted to do a cyclops,” Worth recalls. “When we started to talk about it, I realized how crucial this character was and how nuanced his performance would be. So, we tested a few methodologies – a pure compositing approach and an AI-generated approach. However, both of those had limitations in terms of the variety of environments we were shooting in along with performance flexibility we knew we would need. Chris Parnell’s performance was the primary thing we knew we needed to nail if we were going to pull this off, so we partnered with our long-time collaborators at Important Looking Pirates in Sweden and were ecstatic with the results. We were able to capture Chris’s performance, the humor, heart and nuance, while creating a full CG effect. I feel like we were
able to push past the uncanny valley.”
Visual Effects Supervisor Douglas Purver adhered to an extraordinary level of detail for Season 2 of HBO’s The Gilded Age. “With a show this opulent in its set design, costumes and more, there’s been a fine line to walk with the effects work. We don’t ever want to call attention to ourselves while maintaining the level of detail that seamlessly blends in with what’s captured on camera. Most of the time we can use elements from what’s practically there. I’m constantly taking stills of textures and architectural details, or we bring in a team to get high-resolution scans. But often we are creating things from scratch and finding a real-world reference is challenging, involving a deep dive into historical texts and postcards or a significant collaboration with our production designer and locations department to find elements that can fit into our world.”
The season’s climax found Purver and his team at the opening of the Metropolitan Opera House, “which was filmed in three different locations, weeks apart from one another, at a stage in Albany, New York, the main Opera House in Philadelphia and a set of five opera boxes built on our film stages,” Purver details. “Getting them all to sit together, especially when the camera wraps around Bertha as she enters her box for the first time, was extremely satisfying. Building the CG crowd to blend with our tiled plates filled the entire venue and gave it the grand opening it deserved. Being able to collaborate with the production designer on how much to build and where, with the cinematographer on light placements – how and when to move the camera – the director on which story pieces to shoot where, along with the amazing VFX team who contributed countless hours to allow the viewer to stay in the moment and marvel at this climactic, cinematic moment in our story – was just a fantastic experience.”
Tim Crosbie, Visual Effects Supervisor on Season 3 of The Witcher for Netflix, called out Episode 6 where “almost every shot required some form of VFX, from the spells cast during the battle to the many set extensions throughout all of the exterior fights, then the destruction of Tor Lara and Aretuza towards the end. Our on-set teams had their work cut out. We knew we needed to provide very accurate lighting and LiDAR data to ensure that post-production ran as smoothly as possible because the schedule was going to be tight to get all the shots through. All of our vendors came to the party and produced really beautiful work to help tell the story. This show was one of the more collaborative ones I’ve worked on, with everyone pulling in the same direction. I think the most satisfying accomplishment for us in VFX was how much value we were able to bring to the story.”
Charlie Lehmer, Visual Effects Supervisor on All the Light We Cannot See for Netflix, cited the “rampart run sequence” in Episode 4 as the most challenging. “Early on in pre-production, director Shawn Levy emphasized the need for its immense scale, leading us to explore numerous shooting solutions. However, as ambition grew, on-location filming became unfeasible. Constructing a fully-CG, period-accurate St. Malo [an historic port city in France] environment demanded rigorous research and planning. The core difficulty lay in marrying historical fidelity with our artistic narrative vision,” Lehmer says. “We dedicated a week to thoroughly scanning and photographing St. Malo. Archival footage, both pre- and post-bombing, was further analyzed to ensure as much accuracy as possible. ILM did an amazing job digitally
transforming the modern town into its 1944 counterpart. The result was a full CG city of St. Malo, procedurally built to allow for extensive bombing and collapse of various buildings.”
Crafting the detailed destruction of St. Malo was a great source of pride for the VFX team. “Much of our ground-based filming was set in the charming Villefranche-de-Rouergue,” Lehmer reveals. “Digitally transforming it into a ravaged postwar St. Malo presented a considerable yet rewarding challenge. We aimed to surpass conventional depictions of bombed-out cities which focus solely on brick and mortar, instead prioritizing granular detail for profound visual impact. Our rubble wasn’t mere debris; it was imbued with poignant elements: teddy bears, pianos and intricate paintings lending an unsettlingly personal dimension.”
John Haley, Visual Effects Supervisor on Marvel Studios’ Echo for Disney+, was heavily focused on the main action sequences in Episode 2. “Bushto and the train heist presented challenges due to their scope and complexity. Recreating the Choctaw Bushto environment for the stickball sequence, which takes place in the year 1200 AD in what is now Alabama, required careful research. The production and VFX teams worked with historians and cultural consultants to ensure that the sensitive historical details were correct. All the shots in the sequence were augmented with visual effects to make the game and scene as realistic as possible. The team at ILM thoughtfully created the environment and CG background characters to portray life in 1200 AD before the arrival of Europeans.”
TOP AND BOTTOM: An extraordinary level of detail was required for Season 2 of The Gilded Age. Working in close collaboration with the production designer, the VFX team referenced historical texts and postcards to produce effects that seamlessly blended in with what was captured on camera. (Images courtesy of HBO)
Continues Haley, “[For the train heist], combining live-action acting, stunt performances, digital doubles, face replacements, practical train cars, CG train cars and effects in photographed and digital environments seamlessly into the action-packed ‘Train Heist’ sequence was no easy feat. When Maya Lopez plunges off the highway overpass onto the speeding train below, the VFX team used all of those resources to achieve the shot – transitioning from the plate photography of Alaqua Cox to stunt photography on bluescreen to a full Maya digital double, back to Alaqua on a bluescreen, all in a photorealistic all-CG environment. Whew! We wanted the train heist sequence to feel grounded and gritty, choosing camera positions as if we were shooting the scene on a fast-moving train or from a pursuit vehicle. Day-for-night train array plates were shot and color-graded, then used as a basis for the environment. Then, the nighttime environments were modeled and designed to give a sense of speed, danger and depth. Each shot was balanced and composited so it appeared as though it was photographed using only available moonlight and artificial practical light sources.”
Haley adds, “Orchestrating the collaboration between ILM and Digital Domain to create and bring the photoreal Biskinik bird to life [was also an accomplishment]. We were very pleased with the look, animation and attention to detail of the final shots. With the Biskinik bird being such a big part of Choctaw tradition, and Maya’s story, it was important that the bird be completely believable.”
The audience’s emotional reaction, particularly from fans of the book, to a couple of moments in Episode 8, the season finale, stood out to Andy Scrase, Visual Effects Supervisor on Season 2 of The Wheel of Time for Amazon Prime Video. “One was the death of Hopper. The performance of our Czech Wolfdog, Ka Lupinka, was fantastic! We supplemented that performance with an animated color flash to the eyes and added a pool of blood from Hopper’s fatal neck wound forming around the head. I think that little addition emotionally pushed everyone over the edge! It was straightforward VFX work, but it gave such a payoff because it complemented the performance and initiated sadness and horror among those watching.”
Not long after Hopper’s death in the episode is Mat Cauthon blowing the Horn of Valere. “Again,” Scrase notes, “this seemed to get a big reaction with the book fans, but at the other end [of] the emotional scale. Seeing the ‘Heroes of the Horn’ form and emerge from a localized mystical fog brought a certain degree of euphoria. The fog moment features in the second book [The Great Hunt], and so it felt important to keep that component. We then used influences from the Hindu festival of Holi, fireworks exploding in thick smoke, and some beautiful photography I found of dancers holding poses in clouds of powdered paint to inspire the heroes appearing in our CG fog. The low of Hopper’s death almost immediately followed by the excitement of Mat blowing the Horn heightened the emotional reaction from those in the audience. For me, it showed how our work in the industry is not just about flashy
effects or seamless additions; it can emotionally contribute to a scene and an audience’s reaction.”
Christopher Townsend, VFX Supervisor on Season 2 of Marvel Studios’ Loki for Disney+, had many loose “threads and strands” to tie up for the finale of the limited series. “Creating some of the CG environments so they still fit in with the lo-fi, analog visual style of the whole show was challenging, particularly when outside the TVA, with swirling prismatic flares, a disintegrating spaceman-like suit and a massive floating loom weaving threads of time. The unique and original spaghettification and time-slipping effects were designed to fit within the visual motif of time represented as lines, threads and strands. The final tree-like Yggdrasil galactic image, showing the transformed timelines with Loki at its heart, felt like a beautiful and epic moment to end the show.”
TOP AND BOTTOM: Constructing a fully-CG, period-accurate city of St. Malo in France for All the Light We Cannot See, ILM digitally transformed the modern town into its 1944 counterpart. The fully CG St. Malo was procedurally built to allow for extensive bombing and collapse of buildings. (Images courtesy of Netflix)
TOP AND BOTTOM: The production and VFX teams on Echo worked with historians and cultural consultants to ensure the accuracy of sensitive historical details. The Creation Pools sequence was rooted in Choctaw lore. Digi-doubles were made for the main Choctaw characters. VFX enhanced the realism. (Images courtesy of Marvel Studios)
For Jay Redd, VFX Supervisor on Season 4 of For All Mankind for Apple TV+, the biggest challenge was the sheer variety of VFX work for Season 4 – and for every season of the show, “and keeping our feet firmly planted in real physics and science while bending the rules here and there to serve the storytelling,” he says. “While we are an alternate timeline show, our approach is hard science. We put a major effort into making sure things feel real in space, on Mars and on Earth. A lot of this work comes early in the previs stages, me working with The Third Floor in designing shots and sequences, working on physics, pacing, and scale. We work hand-in-hand with our astronaut and technical consultants to keep things as realistic and scientifically accurate as we can, knowing there are times when drama and story call for changing the pace and timings of certain events, like ships docking, landings, etc.”
Redd continues, “This year, we had two big challenges: the huge expansion of the Happy Valley Mars Base and the Asteroid Captures. Happy Valley was a 50-fold expansion from Season 3, so there was a massive amount of work in designing and creating the dozens of new modules, landing pads, roads, terra-forming, vehicles, and connecting infrastructure to show the huge growth over roughly a decade. We worked very closely with production design to make sure we integrated our look from Season 3 while also showing the epic scale of growth in Season 4. The DNEG Montreal team, led by VFX Supervisor Mo Sobhy, did an amazing job in
hitting a massive amount of detail across the base and multiple landscapes under varying lighting and atmospheric conditions.”
The asteroid captures at the beginning and the end of the season posed major challenges for Redd and his team. “Once again, we needed to make things as plausible and scientifically accurate as we could while serving a dramatic and emotional story,” he states. “Working with very limited set pieces on small stages, we had dozens of shots that are fully CG, partial live-action and hybrid/mid-shot blends – utilizing extensions, digi-doubles, face replacements and big simulations for asteroid pebbles, rock and dust. The designs of the asteroids are based on real existing asteroids, and capture ships and mechanisms come from real-world examples and future-looking potential endeavors. We had conceptual challenges in showing ships firing engines but appearing to be moving backwards, and slowing asteroids to enter Mars orbit. The Ghost VFX team in Copenhagen, Denmark, led by VFX Supervisor Martin Gårdeler, did an incredible job in working with me on a ton of scope and detail in the models and simulations, and very specific lighting cues to show the scale and reality of these scenes.”
Daniel Rauchwerger, Visual Effects Supervisor on Silo for Apple TV+, found that his biggest VFX challenge was the open-space, curved mega-structure of the silo, “where in every shot we see, continuous to the plate, a crowd that behaves naturally and actively reacts to the actions of our characters and tensions in the silo,” he says. “We had to create the natural feel of a living, breathing underground city where 10,000 people live, and make sure that we
get the organic texture of movement and life combined with the mechanics and inner workings of the silo seamlessly. We are very proud that we managed to bring the character of the silo to life in an invisible way and become something the audience does not think about – and instead accepts the silo and its residents as real, hopefully not thinking about VFX.”
For Ben Turner, Visual Effects Supervisor for Season 6 of The Crown for Netflix, fidelity to character and story was paramount – and going unnoticed was an achievement. “The story of Princess Diana’s death brought with it perhaps the most expectations, and the greatest burden of responsibility, of any subject we tackled in the preceding 52 episodes. It was clear from the beginning that the subject would have to be handled sensitively, and our VFX team was at the heart of achieving this.”
Explains Turner, “One of our biggest VFX challenges of the final series came in Episode 3 [‘Dis-Moi Oui’]. A central location to the scenes in this episode was the famous Ritz Hotel, located in Place Vendome in Paris. The art department built a partial set [for the doorway of The Ritz] on the backlot at Elstree Studios in London. Our team created the rest of the enormous square in 3D, using extensive LiDAR scanning and photography of the real location in Paris. We then tweaked the CG to better match the art department build in order to create a seamless environment. The scenes required a building sense of frantic claustrophobia; we helped to heighten this by adding crowds and additional photographers to the square surrounding the characters and their cars.”
A VFX highlight for Turner occurred in the same episode. “It sees a teenage Prince William shoot his first stag in the Highlands of Scotland. We were tasked with creating the animal fully in CG, together with tweaks to the environment, for a scene in which our work was literally in the crosshairs.”
“The low of Hopper’s death almost immediately followed by the excitement of Mat blowing the Horn [of Valere] just heightened the emotional reaction from those in the audience. For me, it showed how our work in the industry is not just about flashy effects or seamless additions, but it can emotionally contribute to a scene and an audience’s reaction.”
—Andy Scrase, Visual Effects Supervisor, The Wheel of Time
TOP: The VFX for Season 2 of The Wheel of Time demonstrated that the work isn't about flashy effects and seamless additions, but contributing to the story to evoke an emotional response from the audience. (Image courtesy of Amazon Prime Video)
BOTTOM: The VFX for Episode 8, the season finale of The Wheel of Time, was designed to complement performance. (Images courtesy of Amazon Prime Video)
“We also had to make the CG creature match a real stag used on location for close-up shots of the animal,” Turner adds. “This required sculpting and grooming the model to have an exact match for the antlers and fur coloring. It was a short sequence but very satisfying, as I don’t think people will question it for a moment. These invisible effects sequences typify the VFX work on The Crown. We help bring the writer’s and director’s visions to life but aim to maintain a level of quality that means the viewer would have no idea of the enormous amount of work that’s gone into our shots.”
Working on a high-flying, high-profile project like Masters of the Air for Apple TV+ was technically and creatively challenging for DNEG VFX Supervisor Xavier Bernasconi. “There were months spent on virtual production, featuring air battles with hundreds of planes in a war theatre on a scale never done before. DNEG’s VFX work covered thousands of shots taking place over thousands of kilometers, including accurate 1940s 3D landscapes and cloudscapes from Greenland and Algeria to Norway and the South of France, all with hundreds of plane models, liveries and damaged variations performing in extremely complex choreography while being truthful to every historical detail,” he explains.
Masters of the Air was the biggest launch ever for Apple TV+. Viewership climbed after the premiere. Bernasconi notes, “This meant that with DNEG’s work we were able to engage the viewers and tell a believable and compelling story, while wrangling thousands of people across the globe to deliver incredibly complex work. Historians, air pilots and veterans alike have praised the attention to historical details in the VFX work.”
One of the show’s biggest achievements was keeping that laser focus on historical detail, which carries over to the depth of the effects. “The show has so many incredibly stunning shots,” Bernasconi says. “Every one was crafted with the highest level of detail. If I had to pick [one outstanding shot] I’d say the wide shots with hundreds of planes raging in battle – each crewed with digi-doubles, fighters zooming past at 600 mph breaking the stillness of contrails, and with realistic choreography of the events – are a visual testament to the incredible work that our DNEG team produced.”
Netflix’s 3 Body Problem, One Piece and Avatar: The Last Airbender, Amazon Prime’s Gen V, FX’s Shōgun, AMC’s The Walking Dead: The Ones Who Live and Disney’s Percy Jackson and the Olympians are among the other series eligible to be nominated.
IMAGINATION REIGNS SUPREME FOR KINGDOM OF THE PLANET OF THE APES
By TREVOR HOGG
Where the original Planet of the Apes pushed the boundaries of prosthetic makeup and the prequel trilogy introduced photorealistic CG apes, Kingdom of the Planet of the Apes provided an opportunity to expand upon the digital cast members and their ability to speak without relying heavily on sign language.
The story takes place approximately 300 years after War for the Planet of the Apes as Proximus Caesar attempts to harness long-lost human technology to create his own primate kingdom. “This is about apes all the way through. The world is upside down and the humans are now these feral little creatures running around in the background,” states director Wes Ball, who was responsible for The Maze Runner franchise and is laying the groundwork for another Planet of the Apes trilogy. “We’re going to have more talking, and the apes are going to be acting more human-like because this is marching towards the 1968 version where they are full-on walking on two legs.”
Continues Ball, “In terms of the visual effects of it all, you’ve got all of these amazing new developments that Wētā FX has done from Avatar: The Way of Water.” Rise of the Planet of the Apes came on the heels of the performance-capture leap forward. “[We looked at] all the tech on Avatar Wētā FX had provided, and then we took it outside,” Visual Effects Supervisor Erik Winquist recalls. “From a hardware and technology standpoint, one of the improvements is now we’re using a stacked pair of stereo facial cameras instead of single cameras, which allows us to reconstruct an actual 3D depth mesh of the actor’s face. It allows us to get a much better sense of the nuance of what their face was doing.”
Avatar: The Way of Water used old Machine Vision cameras that straddled the matte box on the main hero camera. “We did
the same thing here in every instance, and it has allowed us to get a wider field of view and also a stereo mesh of whatever was standing in front of the camera,” Winquist explains. “If we need to harness that to help reconstruct the body performance of what the actors are doing, we can use that as an aid in terms of reconstructing what their limbs were doing that we couldn’t see off-screen from the main camera. Unlike the previous three films, this was shot with Panavision anamorphic lenses, so we no longer had that extra real estate above and below the frame lines like we did when we were shooting spherical, so that came in handy there. The other obvious thing that we took from Avatar: The Way of Water was the water itself. There were literally two shots in War for the Planet of the Apes where Caesar goes over the waterfall and winds up in the river down below. Those shots were definitely a struggle back in 2017 when that was done. Since then, with all of the additional tech that had to be done for Avatar: The Way of Water to deal with the interaction of hair and fluids, we could leverage that in this movie to great effect.”
Clean plates had to be shot without the motion capture performers, which meant that camera operator Ryan Weisen and actress Freya Allan, who plays the human character Mae, had to recreate the camera movement and performance from memory. “Ryan has gotten really good at repeating the moves,” states Cinematographer Gyula Pados. “In the last couple of weeks, Erik came up with the Simulcam system where they can live playback what we shot overlaid on the camera so you could see the actors as simple 3D apes and play it back.” It was equally difficult for the cast. “Having to act against air is not an ideal situation,” Freya Allan admits. “That was probably the hardest part of it, of not being able to stare, like have a proper conversation with somebody when you’re looking at them at least. I also had to do some bizarre things, like I had to hug the air. The suits and cameras didn’t bother me too much. They embody the apes so well that I was more focused on that than what they were wearing or the camera on their head. Though sometimes they had to take the camera off because if they were too close to me, it would start bashing me in the face. I spent more time making fun of them, especially when they had to wear blue suits to interact with me.”
Kevin Durand had a great time portraying Proximus Caesar.
The Eagle Clan was modeled on Mongolian cultures that are deeply tied to eagles.
“From a hardware and technology standpoint, one of the improvements is now we’re using a stacked pair of stereo facial cameras instead of single cameras, which allows us to reconstruct an actual 3D depth mesh of the actor’s face. It allows us to get a much better sense of the nuance of what their face was doing.”
—Erik Winquist, Visual Effects Supervisor
BOTTOM: The amount of dialogue has been increased, with one of the more talkative being the orangutan named Raka, portrayed
OPPOSITE TOP SET: Kingdom of the Planet of the Apes benefited from the stereo facial cameras that allowed for 3D mesh to be constructed of what the actor’s face was doing.
OPPOSITE BOTTOM SET: The actor provides the soul of the character, but it is the animator who needs to figure out what that means in the context of an ape’s face.
Different cultures were represented by various ape clans. “Originally, we talked about them having their own coins, but that never became necessary in our narrative,” Production Designer Daniel T. Dorrance explains. “The Eagle Clan is primitive and lives off of the land. Nothing from the human environment. Everything is organic, made from the earth. They never went beyond the Forbidden Zone because they knew once you’re in the human world, there’s danger. When we’re traveling, for the most part, we did all of these different things along the way. Noa meets Raka, and we’re starting to see human elements creep in a little bit. Raka is a picker and has little stashes of things around his place. As we get to the end of the movie with Proximus Caesar, we see that they’re living off the human environment. Everything is made of metal that they’ve taken from the ship, and they have turned it into
things that help them to survive.” Village scenes were not straightforward. “You can only capture five people at a time,” Dorrance reveals. “Normally, in Maze Runner we have a street full of people, and they’re crossing the street doing the things that extras usually do. None of that happened. You’re sitting in front of a whole village set with everything that we dressed in that would normally be people chopping wood or whatever it might be. Those things were there, but no one was doing them on the day. All of that was done in post.”
Outside of a last-minute production change that saw principal photography take place in Australia rather than New Zealand, the trickiest part of shooting outdoors was the amount of greens required. “Part of the fun of this movie is [observing up close how] so much time has passed that our world is slowly erasing,” Ball states. “There is this great story about these guys when they found all of these ruins in South America that at first looked like a mountain. They didn’t realize that it was a giant pyramid until they cleared away thousands of years of overgrowth and trees. I loved that concept for our world, and that’s how we get to the 1968 version where there are Forbidden Zones and whole areas of worlds that have been lost to time. That’s what we’re building in this world. This sense of the Lost World living underneath Noa’s nose, and one that he has to uncover and learn about and ultimately be affected and changed by it.”
Decommissioned coal factories and power plants were photographed and painted over digitally to create ruined buildings overtaken by centuries of vegetation. “One of the things that I was looking at early on was the book The World Without Us, which hypothesizes what would happen in the weeks, years, decades and centuries after mankind stopped maintaining our infrastructure,” Winquist explains. “You start pulling from your imagination what that might look like, and we had concept art to fall back on. We started from the bones of some of the skyscrapers that Wētā FX did for Wes’ The Maze Runner films, stripping away all of the glass, turning all of the girders into rust and then going crazy with our plant dressing toolset to essentially cover it up. The great thing is we still had that live-action basis that we could always refer back to. What was the wind doing? How much flutter in the leaves? You have a solid baseline for moving forward.”
A daunting task was getting the look of the protagonist right. “When we first saw some of the concept art for what Wes had in mind for Noa, I was like, ‘He looks a lot like Caesar in terms of the skin pigmentation and the specific way the groom sat,’” Winquist acknowledges. “Some of that is deliberate, but Noa is his own ape in every way. We learned back on Gollum to incorporate the features of the actor into the character. Everybody has some amount of asymmetry to their face, but Owen Teague has this distinct slight difference in where his eyes sit in his face. What we ended up doing was mimicking a lot of those asymmetries. Often, when Owen would play frustrated or apprehensive, he does something distinct with his lips. There were some key expressions that we wanted to make sure that we nailed. When it’s working, it’s beautiful because you suddenly see the actor’s face coming through the character.”
Fortunately, Editor Dan Zimmerman had an established shorthand with Ball after cutting The Maze Runner trilogy, and considering the mammoth task ahead of him, he recruited his former first assistant editor Dirk Westervelt as a co-editor on the project. “First of all, it was daunting because I have never done any version of this movie or production before in my life,” Zimmerman reveals. “I was like, ‘They shoot a scene. I get the scene. I cut the scene.’ The cores of what you have are what they are. But sometimes you truly have limitless options. You can do what you want – and not only with shot selection and performance. You can choose a word from one performance and put it into a different performance because someone didn’t say that word right or flubbed it, or you can stitch performances together to create a performance that then goes into a shot. I had to wrap my mind around that whole aspect of cutting. I would turn the monitors off because what I would try to do is listen to the takes and try to figure out, if I were to watch this scene, what are the best performances of the scene that I want to make work, and the flow of it. I would basically do an audio assembly of all of the different performances and go, ‘That looks good.’ And then turn the picture back on and ask, ‘What mess am I into now?’ And figure out from there how to manipulate it and then after that choose the shot that those performances go into. It was a whole process for me. There was definitely a learning curve.”
Scenes and environments were mapped out in Unreal Engine. “In terms of set work and set extension work, we used a lot of Unreal Engine in this movie,” Dorrance explains. “Every set that we designed and drew, and location, we would actually plug it into Unreal Engine and have it in real-time lighting. Wes likes to work in Unreal Engine so he can play with his camera moves. In doing that I have to extend it anyway in that environment, otherwise I’m only dealing with the foreground. We continued to design the world beyond for every set possible.” Cinematographer Pados also took advantage of Unreal Engine. “There’s a big action sequence, which I thought maybe we can do in one shot, and I could build it and show it to everyone. It was like, ‘This is what I thought. What do you think?’ That part is useful for me because before that you would start to talk and people were nodding, but you see that they can’t see it. Sometimes I feel that using Unreal Engine has changed my life over the last couple of years. I can show them scenes, and it’s much easier for me to translate,” Pados says. Improvements were made by Wētā FX in profiling cameras. “For the first time, we have actually built a device to measure the light transmission through the lenses in terms of what the lenses are doing spectrally to work into our whole spectral rendering pipeline, Manuka,” Winquist remarks. “That has been one of those elusive pieces of information that we have never had before. It’s not a huge thing visually, but it has been an interesting addition to our spectral rendering pipeline.”
All of the media in the cutting room was made available online for Wētā FX. “We were able to quickly hand over bins of cuts that would then relink to our media, which was the same on the Wētā FX side in New Zealand,” Zimmerman states. “We call it ‘Wētātorial,’ and James Meikle [Senior Visual Effects Editor, Wētā FX] was amazing over there. Basically, he and my visual effects editors, Logan Breit and Danny Walker, would communicate and say, ‘He changed this. We’re going to send you a bin, but then we’re going to send the paperwork with it.’ James could then open up that bin, and we could tag it in a way that he could see what the change was, or if it was a performance swap or something like that. James could then easily relay to Animation Supervisor Paul Story what the change was and when to expect the change and all of the data to make the change happen.”
The hardest part for Ball has been the sheer process of making Kingdom of the Planet of the Apes. “To shoot something that isn’t really the image, from the clean plates all the way to the end of making choices about shots, and looking at storyboards and not seeing that for six months until the last two weeks when you can’t change it [is frustrating and difficult],” Ball notes. “And it all has to come back together. I talk about this idea of the cliché movie scene of the waiter with a whole bunch of stuff on a tray. He falls and it all goes up in the air and somehow it all comes back down and lands. That’s what we’re doing. How do you make something that feels organic, real, spontaneous and alive, but it’s so slowly pieced together by choices made over years? That has been a hell of a learning experience for me and a fun one. I enjoy a good challenge.”
TOP TO BOTTOM: Freya Allan plays one of two human characters, with the rest of the principal cast being CG primates.
Nature reclaiming areas once inhabited by ancient civilizations was a major visual motif.
Proximus Caesar believes that ancient human technology is the key in being able to establish a kingdom.
Some shots had to be done entirely in CG.
JOYCE COX, VES: CELEBRATING A PRODUCTION PUZZLE MASTER
By NAOMI GOLDMAN
Acclaimed Producer and VFX Producer Joyce Cox, VES describes her job as that of a ‘wire walker’ between creative goals and financial restrictions, one who helps realize a filmmaker’s vision to create the best project possible – on time and on budget. A self-described lover of puzzles, Cox built a brilliant career out of her innate talent for organizing people and moving parts, a skillset that has branded her a luminary in the world of visual effects producing and one of the most respected producers in our industry.
With credits that include Titanic, The Dark Knight, The Great Gatsby, Men in Black 3, Avatar and The Jungle Book, Cox has been instrumental in shaping popular culture for decades, and her work has put VFX squarely at the center of big box office filmed entertainment. She has produced 13,000 visual effects shots with budgets totaling in excess of $750 million and won three VES Awards for her work on Avatar and The Dark Knight. In 2018, Cox was honored with the title of VES Fellow.
In recognition of her exceptional career as an educator, changemaker and contributor to the visual effects craft and global industry, the Society honored Cox with the VES Lifetime Achievement Award at the 22nd Annual VES Awards.
After receiving a standing ovation from her peers, Cox shared her appreciation: “I’m truly honored to be given this prestigious award from the VES celebrating my career and the opportunities to facilitate the work of the thousands of artists, technicians and visionaries it took to create these movies. It’s been a privilege to work with and learn from so many brilliant, dedicated people who gave life to words on a page, transforming pixels and dreams into worlds that captivate and inspire, and that is nothing short of magic. This award celebrates not just my achievements, but the collective triumphs of a creative community, and shines a light on the value of Visual Effects Producers.”
Cox continued, “Because being a VFX Producer is still a fairly new position in the film industry, we tend to disappear, with most of the emphasis on how VFX is made falling to the VFX Supervisor. But to produce and succeed in this job, you have to understand every department’s role and absorb their demands and restrictions and precisely how VFX can support and achieve the end goal of producing the best movie. So having this role recognized by the VES, and me as a woman in this role, means so much.”
OPPOSITE TOP TO BOTTOM: VES honoree Joyce Cox, VES backstage with VES Chair Kim Davidson, VFX Producer Richard Hollander and VES Executive Director Nancy Ward.
VES honoree Joyce Cox, VES shows off her VES Lifetime Achievement Award.
VES honoree Joyce Cox, VES hits the VES Awards red carpet with friends and family.
Harkening back to her early life, Cox recounted, “Unlike many in this industry who set their sights early in life for a career in film, I arrived along a circuitous path of happy accidents. I grew up in a small Kansas community in the ’50s and ’60s, a time and place where most girls, including me, were not mentored toward careers. Certainly not a career in film.” Cox highlighted her parents as her first role models. “My mother had a brilliant math mind, and my friends referred to my dad as a metaphysical cowboy… a poet trapped in a laborer’s body. They married young and neither had a high school degree. Looking at my mom’s trajectory, she riveted nose cones on fighters, taught herself how to do the books in the aircraft industry and went on to become one of the first women executives at Boeing Military. That focus and drive to grow and achieve was a great source of inspiration.”
“I get to explore the diversity of highly creative and exceptionally smart people and be a part of how those minds take words off the page and realize them through an intense process into a beautiful film experience.”
—Joyce Cox, Producer and VFX Producer
Cox pursued her education in Kansas, taking classes at Wichita Business School and Kansas City Community College, and got an early exposure to business working in a series of office positions in everything from manufacturing aircraft parts to real estate. Then she was enticed to start her creative career. “My brother worked in advertising as an art director in Chicago at hot boutique agencies and his life was really appealing, so I moved to Chicago and started representing artists. It was the mid-’70s when I started my first company, Joyce Cox Has Talent, which was really the window into the creative process and the gateway to my future. I was smitten with the way concepts were realized into images and stories for magazines and commercials. I loved being exposed to art and creative thinking every day. This was before digital, which meant carrying large black portfolios, flapping in Chicago winter winds, from agency to agency. My career path changed when I married Jim Weisiger, a commercial director/cameraman. To this moment, the funniest person I have ever met. In addition to the value of humor, Jim taught me the value of the film professionals and what it takes to execute a production.”
“VFX producing is difficult on both the vendor and client side. It is just amazing how Joyce was able to carve out the ‘Joyce side’ by asking both the client and the vendor equally hard questions, sometimes in front of each other. I called it the Joyce quasi-state, a place between the two sides. She was able to walk that thin razor’s edge revealing, with her characteristic humor and wit, the underlying issues and keep the production on track.”
—Richard Hollander, VES
Cox moved to Los Angeles in 1980 and over the next 15 years produced hundreds of commercials, eventually taking a position as the executive producer for Bruce Dorn Films, where she had her first opportunity to work with digital visual effects. In the mid-’90s, a time when digital technology was rapidly evolving into its present role as a creative and technical cornerstone for filmmaking, Cox transitioned from the role of commercial producer to producing visual effects for feature films.
“Several years producing commercials, many involving visual effects, from storyboard concepts to final delivery, proved to be the perfect primer for producing visual effects for movies. One day, a dear friend, Lee Berger, asked me to fill in for him on a project at VIFX, a digital facility that had recently been purchased by 20th Century Fox. Always looking for a new challenge, I said ‘sure.’ That was the beginning of the career VES honored with this award. The timing was perfect. I stepped into this world at the beginning of its rapid growth into the massive industry we have today.”
For the next five years, Cox worked as a facility VFX producer on numerous film projects, including Titanic, Pushing Tin, Fantasia 2000, Harry Potter and the Sorcerer’s Stone and How the Grinch Stole Christmas.
“One of the first projects I worked on at VIFX was Out to Sea with Jack Lemmon and Walter Matthau, then I moved on to Titanic. I was there to help organize the facility to be more effective at a time when that was really needed. Richard Hollander, VES was the President and Senior Visual Effects Supervisor for VIFX under Fox’s ownership and my collaborator on Titanic. I just started asking Richard questions about digital art and how to organize a production workflow. He was a huge influence in providing knowledge and mentorship.”
Cox continued about her work on Titanic. “It was like jumping into the pit of fire and learning under pressure, all at once. Jim [Cameron] was popular, but not like now, and we were looking at a runaway budget while he had the power to hold onto the reins. Plans constantly evolve. Movies are all theory until you shoot and cut and try to actually make them. This experience was intense and challenging, and coincided with my husband’s cancer diagnosis, which actually helped me keep perspective on what matters most in life as I went about my job.
“Jim is one of those uncompromising directors who wants to push things to the edge with the use of technology. The drowning scenes were shot in a tank in Mexico, and since it was very hot, you could not get any visible cold breath coming from the actors. At the time, the capacity to render cloud particles to that degree was unreliable, so we built a black cold room and my husband shot it. We had an actor in black read the lines. We captured his breath and had compositors working on Flame roto-ing hands and placing breath. It was one of many shots that called for our best problem-solving to bring the director’s vision to life. And it looked cool.”
In presenting the VES Lifetime Achievement Award to Cox, Richard Hollander, VES extolled her keen abilities. “I began working directly with Joyce as my VFX Producer on several projects including Titanic and Harry Potter and the Sorcerer’s Stone. I was able to experience her skillsets firsthand. She was able to glide through discussions with our clients, portray the situation and tell them the truth, which was not something they always wanted to hear. Even with this frankness, our clients trusted her. There it was. A natural in our VFX workplace. I knew then that her career was only beginning.”
Hollander continued, “VFX producing is difficult on both the vendor and client side. It is just amazing how Joyce was able to carve out the ‘Joyce side’ by asking both the client and the vendor equally hard questions, sometimes in front of each other. I called it the Joyce quasi-state, a place between the two sides. She was able to walk that thin razor’s edge revealing, with her characteristic humor and wit, the underlying issues and keep the production on track.”
In 2000, Cox moved to the production side. Over the next 20+ years, she worked with some of the world’s most talented directors and crews, creating beautiful, powerful and groundbreaking films, including: Superman Returns and X2: X-Men United with Bryan Singer; Avatar with James Cameron; The Dark Knight with Chris Nolan; The Great Gatsby with Baz Luhrmann; Men in Black 3 with Barry Sonnenfeld; and The Jungle Book with Jon Favreau.
“My time in digital facilities was instrumental because I now had the ability to understand and be compassionate and demanding of facilities. On the production side, I liked being one of the first hired and one of the last out, so I could participate and observe the entire creative process.”
Looking back at her decades in the film industry, Cox points to her takeaways and what she considers markers of success. “I have learned something on every single movie I’ve ever done because the technology is moving so fast and is antiquated by the time I’ve jumped to the next project. I get to explore the diversity of highly creative and exceptionally smart people and be a part of how those minds take words off the page and realize them through an intense process into a beautiful film experience.
“During the making of the films, I see all the pieces thousands of times, but when all is done and we’re in the theater and the audience knows none of the pain it took to birth this project – it feels good. It means we’re giving people something that inspires or enriches their lives.
“My job is not necessarily the most fun as the one with fiduciary responsibility, but it has also been my love of challenges, of puzzles that has made this such a rewarding career. Motivating people to the common goal of making the best movie on time and on budget is where I have had the opportunity to excel. When asked how I do it? I maintain altitude. I get my ego out of the way to help the team achieve. And together, we navigate the often-rocky journey and create something that is greater than what we could have achieved without this harmonic convergence.”
The Twilight Zone episode “Nick of Time.” (Image courtesy of CBS Photo Archive)
The Twilight Zone episode “Nightmare at 20,000 Feet.” (Image courtesy of CBS Photo Archive)
With James Spader in Boston Legal.
(Photo: Carin Baer. Courtesy of ABC)
WILLIAM SHATNER: HONORING AN ICON
By NAOMI GOLDMAN
William Shatner has boldly taken audiences to the final frontier throughout his remarkable seven-decade career. As an Emmy- and Golden Globe-winning actor, director, producer, writer and recording artist, Shatner remains one of Hollywood’s most recognizable figures. With his portrayal of Captain James T. Kirk in the legendary science fiction television series Star Trek: The Original Series and in seven Star Trek movies, Shatner is the originator of one of the most iconic science fiction characters in history.
“William Shatner has been at the center point of compelling stories that use visual effects to enhance unforgettable storytelling for decades, and his work continues to leave an indelible mark on the cultural landscape,” said Nancy Ward, VES Executive Director.
For his exceptional work in the epic Star Trek franchise and in recognition of his cinematic legacy that continues to touch new generations of filmmakers, creatives and audiences, the Society recently honored Shatner with the VES Award for Creative Excellence at the 22nd Annual VES Awards.
Seth MacFarlane, award-winning actor, creator of Family Guy, The Orville and Ted, gave an epic tribute in presenting the Creative Excellence Award to his longtime friend: “Star Trek laid out a vision as relevant then as it is today, of a future where we all aspire to be nobly forward-looking and to improve the human condition. It continues to live large in our collective consciousness and remains relevant for generations of viewers. But Star Trek’s center of gravity has always been William Shatner. Bill has done something we all can only hope to do. He has made a permanent mark on this industry that is all his own. His work will endure for as long as there is an entertainment industry. He is a colossal talent, a great performer who has never lost his sense of curiosity or adventure.”
In accepting his award, Shatner remarked, “This industry is filled with the most creative people; every one of them is an artist who is always thinking ahead. The work is progressing at such a great speed, and it’s amazing what visual effects can bring to life. In the beginning [of Star Trek], it was just a flashlight and a cardboard Enterprise! Visual effects have become a truly organic and immersive visual experience, and I accept this award for those great artists – men and women – who work beyond the imagination. Thank you to the Visual Effects Society for bestowing this honor on me.”
T.J. Hooker. (Image courtesy of Sony Pictures Television)
Shatner became the oldest person to fly into space at age 90 after completing his mission on Blue Origin NS-18 on October 13, 2021.
(Image courtesy of Blue Origin/Reuters)
With friend Seth MacFarlane backstage at the 22nd Annual VES Awards after receiving the Society’s Creative Excellence Award.
(Image courtesy of the VES)
AN URGENT DISPATCH FROM THE FUTURE FRONT LINES IN CIVIL WAR
By CHRIS McGOWAN
In the near future, a fractured America is at war with itself. The President has called citizens to arms against the breakaway Florida Alliance and the Western Forces of California and Texas, which are attempting secession from the U.S.A. Alex Garland, who wrote 28 Days Later and wrote and directed Ex Machina, Annihilation, Men and the FX series Devs, scripted and directed the A24 film. Kirsten Dunst stars as a photojournalist and war-zone veteran who is traveling across a dystopian, perilous landscape as the Second American Civil War escalates. The cast includes Wagner Moura (Narcos), Jesse Plemons (Killers of the Flower Moon), Stephen McKinley Henderson (Dune), Cailee Spaeny (Priscilla), Sonoya Mizuno (Devs) and Nick Offerman (The Last of Us). The film is “set at an indeterminate point in the future” and “serves as a sci-fi allegory for our currently polarized predicament,” Garland said in an interview with The Daily Telegraph.
Framestore was the main VFX creative studio, with a small in-house comp team provided by TPO, according to VFX Supervisor David Simpson.
Images courtesy of A24.
TOP: Kirsten Dunst is a war-zone photojournalist trying to reach D.C. in Civil War. (Photo: Murray Close)
OPPOSITE TOP TO BOTTOM: Framestore was the main VFX creative studio, with a small in-house compositing team provided by TPO. Approximately 1,000 VFX shots were split between Framestore and TPO to bring the shocking warfare to life.
Seamlessly blending the real and invisible VFX sometimes created moments where it was hard to tell whether the shot was real or not, adding drama to the storytelling.
Photojournalist Kirsten Dunst risks her life navigating a dangerous dystopian landscape as the Second American Civil War erupts. (Photo: Murray Close)
Some 1,000 VFX shots, split between Framestore and TPO, bring the shocking urban warfare to life, including the obliteration of the Lincoln Memorial. “Our VFX team was pretty small, but I love working that way,” Simpson comments. “With a small team, the collaboration is the best part. Everyone’s in the loop, sharing ideas, helping to make the film the best it can be.”
Civil War is a road movie, which required a certain approach. Simpson says, “When it came to world-building it needed to feel grounded – almost like a documentary – so real locations played a huge part in the film. Plus, the world needed to feel populated. It was important that the audience believed people still lived here, which meant a lot of cast members and supporting artists. Almost every scene takes place in a new location with new people, which meant we were constantly on the move.”
The number one VFX challenge was creating a replica of Washington D.C. Simpson explains, “Plan A was to find locations in Atlanta that could play for Washington D.C., but it quickly became apparent that wasn’t going to work – aesthetically or logistically. So, the production pivoted to building a number of sets and extending them in VFX. We built three sets in a parking lot in Atlanta. We called them Pennsylvania Avenue, the Lincoln Memorial and 17th Street. On-Set Supervisor Chris Zeh and I marked out the roads with tape to 80% scale, as the space wasn’t large enough to fit 1:1. Each set was a single story tall and maybe one or two buildings deep. Then, our VFX Producer Michelle Rose and Lead Wrangler Corey Burkes flew to D.C. to capture the real locations for Framestore to recreate them digitally.”
Simpson adds, “As we saw the city come together, it looked so good we decided to use it for a handful of big establishing aerial shots. Whenever we’re in a helicopter over Washington, D.C. – that’s always full CG. The detail is wild. The environments team under the supervision of Freddy Salazar, our CG Supervisor, didn’t just build a city, they built a full war zone. There are individual skirmishes happening with gunfire and explosions. We have tanks, Humvees and police cars driving around.
“Every CG building has a CG interior – offices with individual desks, chairs, whiteboards, potted plants. There’s even CG clutter on CG desks,” Simpson says. That CG clutter sometimes contained an inside joke. “On some desks, you might even spot little rubber ducks. This is a nod to the Framestore team because many years ago, on a previous project, the team received rubber ducks as a wrap gift. They’ve peppered our offices ever since! It’s a very specific tip of the hat to Framestore’s inner working life.”
Down below in the simulated city, “There are shops with displays inside, some streets are barricaded off, and even the traffic lights work! You could put the camera anywhere and it looked fantastic,” Simpson remarks. Chris Zeh adds, “Building the entirety of Washington for a flyover and individual streets as higher-resolution backdrops, hero explosions and digital people, shelled office blocks, tanks and planes and helicopters and a lot of shots fired – all of these had their sets of challenges.”
Zeh continues, “Specifically for comp, a lot of our set extensions needed integrating in footage with trees and people, flaring lenses at night and often quite dynamic camera moves. Most of the plates were shot on location with all the natural variability of the outdoors. All of this needed recreating and bedding in, and sometimes even seemingly similar shots needed quite different approaches. It is worth mentioning the paintwork, too. Some shots needed removal of buildings or pedestrians or traffic to arrive at the eerie emptiness you would find in places in the midst of a civil war.” Zeh notes, “From a personal perspective, I might also add that we went through a lot of reference material to get the look right. And due to the nature of the images, I would say that counted as challenging.”
No LED stages were used. “Everything was shot in the real world with occasional set extensions,” Simpson says. There were also lots of digital crowds, such as in the scene in Brooklyn at the beginning. He adds, “We had a lot of performers on set, but filled gaps with a combination of digital and greenscreen crowds to make it look even more crowded. The helicopter shot in that sequence was actually filmed about nine months later from a drone, so that crowd’s 100% digital.”
“For the Washington D.C. sequence – that’s maybe 30% digital soldiers,” Simpson explains. “Everyone in the foreground is a stunt performer, then the further away we get the more we introduce digis. Again, this was to make the world feel populated. As an audience, we were up close with the journalists and a small group of soldiers, but beyond was an entire war with countless other people and stories. Our animation team, supervised by Max Solomon, used a combination of mocap and hand animation to try and create that sense of wider activity.”
Regarding blowing up the Lincoln Memorial, Simpson jokes, “Look, we tried to do as much for real as possible, but even that has its limits! Shockingly, it’s not the real Lincoln Memorial.” Instead, he notes, “We went to Washington D.C. and scouted all the locations to get a feel for how it was lit, the scale, etc., but the scene itself was shot in a parking lot in Atlanta. The foreground performers, the vehicles and the trees behind them are all real, but they’re firing at a bluescreen with a giant hole in the middle. Inside the giant hole was an enormous glowing rectangle that represented the Lincoln Memorial. This gave us realistic lighting with the added benefit of helping the soldiers know where to shoot.
“The explosion itself,” Simpson continues, “is pure FX. Authenticity and realism were really important on this project. We wanted to avoid anything that felt ‘Hollywood,’ so we spent weeks ‘casting’ our explosions. We put together a library of news footage, ammunition tests, anything real we could find. Then Alex chose a hero clip for each scene and that served as a reference to keep things honest. Our FX team, led by Effects Supervisor Ed Ferrysienanda, diligently recreated the nuances and details seen in real footage. Every explosion is based on something real.”
Perhaps the most unusual challenge for Simpson was working on the VFX for Civil War while still delivering the visual effects for Alex Garland’s previous movie, Men. “Based in Atlanta, we’d spend the day prepping for Civil War, then in the evening we’d meet up to review Men, where the team was based in London. We graded the last shot on Men about 24 hours before shooting started on Civil War.”
Zeh felt a sense of accomplishment when the real and unreal blended together seamlessly in Civil War. “While working on a project, I find the increments are often quite small. Although you see the shots take shape and improve steadily, there is rarely a moment of ‘Eureka!’ while we’re still working on them. But there are moments where I needed to remind myself whether a certain part of a shot was real or not. I really enjoy the type of work that mostly falls into the category of ‘invisible VFX,’ and those moments are fantastic. Then seeing the trailer, which features a lot of our work, and how it plays in context with other shots was great. Can’t wait to see it on the big screen. That’s usually the moment for me.”
Says Simpson, “On this show, I’m proud of the work as a whole. It feels like one complete body of work. I haven’t seen it in IMAX yet. There’s a lot of scale to the movie and some absolutely epic shots. I can’t wait to see them on a massive screen.”
BOTTOM TWO: Every explosion was based on something real, such as news footage and ammunition tests. The FX team recreated the nuances and details seen in the real footage.
PABLO HELMAN BRINGS TO LIFE THE RHYTHM OF THE RIGHT STORY BEAT
By TREVOR HOGG
Despite the infamous Dirty War (1976 to 1983), when the military dictatorship in Argentina caused 10,000 to 30,000 citizens to “disappear” for being suspected left-wing political opponents, the artistic community in Buenos Aires was still able to give birth to global talent, even if gatherings had been driven underground. Rising from this crucible was a composer who would find himself collaborating with the likes of Steven Spielberg, Martin Scorsese, George Lucas and Clint Eastwood, but in an entirely different capacity.
“You couldn’t wear your hair long,” recalls Pablo Helman, Visual Effects Supervisor at ILM. “It was difficult to walk on the streets and not be stopped and asked for your documents. You never knew if your friend was involved in terrorism or not. I went to a party when I was 16 and didn’t know that the owner of the house was somebody involved with terrorism, so at 2 a.m. police came and took 50 of us kids in. They let me go at 3 a.m. or 4 a.m. I was living with my mother at that time, and she screamed at me, ‘What the hell are you doing? I don’t know where you are.’ I didn’t tell her. Four years later when I left Argentina, I told her and she started crying. It also pushed me out of Argentina in 1980, which was right in the middle of a civil war. I had friends who disappeared. Also, when I was 15, I went to a concert and was chased by a policeman on a horse in the middle of the street. That was not fun. It does change the way you see things.”
Buenos Aires was established in 1536 and is a cross between Paris, New York and Rome. “The buildings are old, from the 1700s and 1800s,” Helman remarks. “It’s a beautiful city that never stops. You can go out and have something to eat at 3 a.m., like a piece of pizza. Everything is open. At 16, I was a drummer for a band that got signed by RCA Records. While finishing high school, I was recording and touring, so I never had to get a job per se. I was touring five days a week, and three months of the year I was recording.”
TOP: Pablo Helman had ambitions of becoming a composer and instead changed his tune to become an Oscar-nominated visual effects supervisor for The Irishman, War of the Worlds and Star Wars: Episode II - Attack of the Clones.
OPPOSITE TOP TO BOTTOM: As a 16-year-old, Pablo Helman signed a recording contract and was a drummer for a band that toured South America, including Chile.
A personal thrill for Helman was getting a tour of Abbey Road Studios where his musical idols The Beatles and Pink Floyd recorded.
Helman, third from left, and The Irishman visual effects team attend the 92nd Academy Awards.
Compromises had to be made because of the repressive environment. “You need to make decisions to do [things] whether your safety is compromised or not, or just shut up or leave the country like me.” The love for playing music remains for Helman, with the main instruments being the guitar, drums and bass. “It’s funny. One of my first 45s was ‘Eight Days a Week,’ and because I’m working on the musical Wicked and [the Beatles] were recording at Abbey Road, the head of the studio gave me an incredible tour. Studio One was where Dark Side of the Moon was recorded by Pink Floyd. There were all kinds of pictures there. Then we went to Studio Two where the Beatles recorded the majority of their stuff. We were going down the stairs and the studio head goes, ‘This is where Paul recorded “Blackbird,” and the Beatles had their first tryout in 1962.’ Then we went to Studio Three where ‘All You Need Is Love’ was filmed and all of the original music for Indiana Jones and Star Wars was recorded. It was an incredible experience going there and coming full circle living with those ghosts.”
Movies also played an important role growing up in Buenos Aires, courtesy of a famous street called Corrientes that is lined with different theaters showcasing cinema from around the world. “I would go and watch Amarcord or all of the Fellini works,” Helman remarks. “About 15 years ago, I was in Frankfurt doing a talk of some kind and had lunch with Cinematographer Giuseppe Rotunno, who filmed Amarcord, and we talked about that movie. It was incredible how I went from music to having lunch with Giuseppe Rotunno or working with Spielberg or Scorsese.”
“A lot of it has to do with being able to open up yourself to learn, especially at ILM. Dennis Muren has always been an incredible mentor for me, and he was the one who introduced me to Steven Spielberg on The Lost World: Jurassic Park. Working with Spielberg for five years straight and Marty for eight years was like going to film school every day.”
—Pablo Helman, Visual Effects Supervisor, ILM
Music and film share numerous rules and fundamentals, Helman says. “It’s the same thing. In music you have sequencing, rhythm; you’re telling a story and have a beginning and an end. I would say that probably anybody who does music could do film and the other way around. They’re both difficult.” The timing of the music was the reason why the opening shot of The Irishman, where the camera tracks through a nursing home and settles on an elderly Robert De Niro who starts talking, had to be extended digitally for 13 seconds. “It’s also the same thing with cinema. Working with Marty, you could go to his trailer or office, and he always has a TV set in the back with a movie that is playing with no sound. He loves taking a look at stuff and learning from the framing and storytelling.”
After earning a degree in music composition at UCLA and obtaining his teaching certification, Helman was composing music for television. “I connected with a PBS station and started writing music for them,” Helman recalls. “They said, ‘We don’t have a job for a musician, but we do have one for an editor. We can teach you how to edit, and you can still write the music for the promos.’ For about seven years I was an editor and directed live television there while I was getting my Masters in Education. Because I was in charge of buying equipment for this station, I bought a Quantel box called HAL or Henry, and at that time Quantel was a huge company and was putting out a computer called Domino, a hybrid solution for editing and compositing that had the same program as Henry or HAL. They were putting the Domino in a place called Digital Magic, which was the first completely digital facility. This was 1985, and at that time in Los Angeles everything was going from optical to digital. Because I was one of the only people who could work the Domino, I started working for Digital Magic doing a lot of Star Trek: The Next Generation. From that, I went to Digital Domain to work on Apollo 13, and that’s where I learned Flame. Then I became a compositing supervisor at Pacific Ocean Post and worked on Independence Day. From there, I went to ILM, which was 28 years ago. ILM had bought Flame systems but connected them in a way where everyone could see their own discs. I was hired to supervise the department. Because I had done a lot of supervision on set, I subbed for other supervisors who couldn’t be there for a few days, and in about three years I became a visual effects supervisor.”
VFX is an area of filmmaking where the technical and artistic come together. “A lot of it has to do with being able to open up yourself to learn, especially at ILM,” Helman notes. “Dennis Muren has always been an incredible mentor for me, and he was the one who introduced me to Steven Spielberg on The Lost World: Jurassic Park. Working with Spielberg for five years straight and Marty for eight years was like going to film school every day. Every time you work with a different director it’s a completely different process. As a visual effects supervisor you have to be inside their head, learn pre-emptively how they think and offer things that fit into their vision. You need to know what the shot is about, and why you’re doing what you’re doing. It’s about communication and how you say things. You’re working with people who are very busy, so you have to hone your communication. You have only five seconds to tell the story.”
Helman has been responsible for projects that have spectacular and invisible digital augmentation. “It’s about storytelling, listening, reading the script and understanding what the filmmaker wants. Then, also being completely part of the cinematography, production design, special effects and wardrobe. Everything that has to do with the picture, and you have to do your job without being known. That’s actually the kind of visual effects I like to do. In a sense, it depends on how you define visual effects. For me, visual effects are always important. I appreciate the fact that the VES Awards separated into ‘this is just visual effects and this is supporting visual effects.’ I don’t think that there is a difference there because every visual effects shot should be supporting the story or else it shouldn’t exist. The Irishman had 1,750 shots and without visual effects, you could not tell that story. War of the Worlds had 247 shots that were supporting the story. But in a typical Steven thing, he is very economical when shooting. Steven shoots in different ways so that he can always do it without visual effects but tell the same story. Steven is smart and knows visual effects well.”
Star Wars: Episode II – Attack of the Clones pioneered digital effects and cinematography. “Attack of the Clones was the first movie shot on digital,” Helman notes. “It was a Sony HDW-F900 camera. A lot of the infrastructure that came out of working in digital came from that movie at ILM. We used to have dailies on ¾-inch machines and video. There was somebody in the booth who would play the dailies. That was the first movie where they had a server with everything digitally stored and we could click on something. It’s an incredible thing to see technology going by you quickly. When I was doing The Irishman, we were right there at the beginning of deriving geometry from lighting. When those ideas came together, they came together in a specific way and creatively.
TOP TO BOTTOM: By utilizing the Sony HDW-F900 camera, Star Wars: Episode II – Attack of the Clones became the first movie shot digitally. (Image courtesy of ILM, Lucasfilm Ltd. and Twentieth Century Fox)
Helman and his wife Donna walk the red carpet at the BAFTA Awards.
Helman takes a look through the camera viewfinder while making Indiana Jones and the Kingdom of the Crystal Skull.
One of the most efficient filmmakers that Helman has worked with is Steven Spielberg.
That’s why I love working at ILM. We are encouraged to sit at the table and talk about, ‘What would happen if we had to build this and rebuild? Don’t worry about what we’ve done before. We have the resources and time to do it in a specific way.’ You sit down with a bunch of people who are a lot more creative than I am and put it out there. This is what we need to do, and we do it. But technology changes quickly, with all of this AI stuff that is happening.”
AI and digital doubles were contentious issues that contributed to the SAG-AFTRA strike. “The main problem there has to do with a misinterpretation of ‘we don’t want anybody to be using this likeness on other projects,’” Helman states. “It’s okay for one project but not for another. We never reuse assets because technology moves so fast. When we have a scan of an actor, you can’t use it for the next movie, so I’m not worried about that. The only reason we do digital doubles sometimes is because whatever is in the script cannot be accomplished any other way. I’d rather use the actor or stunt performer, for that matter. I can understand why everybody is scared of that.”
Yoda went from being a puppet in The Empire Strikes Back and Return of the Jedi to entirely CG in Attack of the Clones. (Image courtesy of ILM and Lucasfilm Ltd.)
Helman got an opportunity to collaborate with director David Fincher on the Netflix production of Mank. (Image courtesy of Netflix)
A lot of the visual effects in Killers of the Flower Moon were about getting the desired scope for the environments. (Image courtesy of Paramount Pictures)
The “No CGI” storyline is not realistic. “People who say, ‘There are no visual effects here.’ They don’t say, ‘There’s no production design here.’ Because you can’t tell me that production design does not make the movie for you! You’re changing a bunch of stuff and you’re making choices on the production design because of storytelling, and production design is being used as a tool. The same thing with lighting and performance. You can’t tell me that the performances are real. They’re performances, and you’re telling a story with them. If you are smart, you’re going to be able to use every tool that you have available to you to tell that story. Be smart and use visual effects the right way.”
Being incredibly focused is a central character trait for Helman. “I am creative in the sense that I’m always thinking about music and stories. I write and draw. I do all of those kinds of things, and I have been so lucky in my life that the majority of my jobs have been creative.”
WHAT DO YOU MEAN, NO CGI?
By TREVOR HOGG
As long as there has been CGI in films and television, debates have raged about its artificial nature; however, what is different now is that the photorealism of the technology has evolved so much that it is no longer distinguishable from reality.
Central to the public awareness of a movie is that the cast and the studios do not want to risk dimming the star wattage – especially when hundreds of millions of dollars are at stake. “If you pay a lot of money for the talent that is also supposed to market your movie, you don’t want to cannibalize the actors’ marketing power by suggesting their performance wasn’t entirely real,” remarks Florian Gellinger, Owner and Executive VFX Producer at RISE. “That aspect is being pushed to the extreme when an actor actually does something crazy in the making of a movie – like jumping with a motorcycle off a cliff, and everybody is afraid that doubt might start to materialize if there is visual effects-related behind-the-scenes material available. All that, plus visual effects being a black box that is hard to understand for most audiences, make the studios’ choice to market films this way abundantly clear.”
“It’s become a status thing to make movies with minimal or no CGI,” notes Peter Howell, Movie Critic at The Toronto Star who agrees with the idea that media coverage favors a negative point of view towards CGI. “Yes, because I think critics want to be seen as champions of old-school cinema: big screens, practical effects, celluloid film. Just as rock critics are champions of live shows, genuine musicianship and vinyl LPs.” Audiences are not as biased. Howell adds, “Moviegoers admire CGI if it’s done well and hate it if it’s done poorly. There’s no in-between.” A particular cinematic universe is not helping matters. “CGI is overused and is increasingly messy and boring. Most ‘multiverse’ movies – I’m looking at you, Marvel – look like a cake with too much icing.”
What is the definition for successful CGI? “It depends on the movie,” Gellinger notes. “CGI can be heavily stylized and artificial if that’s the concept of a film. But having something artificial-looking in a naturalistic picture would need to be justified by the story. Successful CGI has to have a reason why it looks a certain way, and that reason can be many things. In the end, when something doesn’t look right, it’s everyone’s fault to a certain extent – at least most of the time.”
“We’re in a weird time right now where CG is getting a lot of bashing in the press, and it’s not a fair criticism of what is being pulled apart because the reality is if you don’t like the CG in the shot, what you’re really saying is you don’t like the production design, set design and framing,” remarks Jay Cooper, VFX Supervisor at ILM. “There are a million different places where the CG is one component of what is being created, and what we’re seeing now is a knee-jerk reaction to artifice, and the thing that is the easiest to hang that criticism on is CG when it’s really a number of things. I debate whether those criticisms are appropriate. Sometimes it is. Sometimes it isn’t.”
Studios and directors need to be held accountable when CGI does not work out. “First and foremost, the studios and the directors have to make the right projects with the right teams,” states Allen Maris, Head of Visual Effects at Regency Productions. “The story needs to be there. Adding more visual effects will definitely not fix a third-act problem. Having more finals and less temps will not double your audience scores. Cutting the visual effects budget in half will also not help. Lack of planning will also hurt the process, as will not having enough time after turnover. The most problematic shots I’ve been involved with are ones that changed at the last minute, the production didn’t follow advice or the vendor wasn’t given enough time to work through the shot properly because of late turnovers.”
Christopher Nolan often finds himself at the center of the ‘No CGI’ controversy, something he acknowledged back in 2011 when receiving the inaugural VES Visionary Award by stating, “It’s a great honor to be getting an award from the VES Society. I feel a little guilty receiving it from you guys as somebody who often appears in the press talking about my use of CG like an actress talking about her use of Botox. And I’m as dependent on visual effects, probably more so, than any other filmmaker out there.” In truth, his Oscar Best Picture-winning Oppenheimer represents a gray area of visual effects work. “Chris wants to have shots that he can cut into the film,” notes Andrew Jackson, VFX Supervisor for Oppenheimer. “I always keep that in mind when I’m shooting stuff, to frame it in a way that it can work without any [VFX] work at all.
TOP TO BOTTOM: Ender’s Game was an example of visual effects company Digital Domain taking on the role of a production company. (Image courtesy of Lionsgate)
Director James Mangold is a strong believer in capturing as much as possible in-camera and utilizing visual effects to expand the cinematic scope, as he did with Ford v Ferrari. (Image courtesy of Twentieth Century Fox/Disney)
James Cameron views visual effects as part of the fabric that makes him a filmmaker. (Image courtesy of Walt Disney Studios)
It’s great when that happens. A whole lot of shots got cut straight into film. Chris came out early on and said there is ‘No CG.’ To clarify, that means there were no computer-generated elements going into the compositing work. There were visual effects, in that a lot of those shots were a complex layering of multiple elements, but all of the input was photographic.”
Combating all of the misinformation and confusion in an educative and sharp-witted manner is the four-part video series “No CGI Is Really Just Invisible CGI,” which can be found on the YouTube channel The Movie Rabbit Hole. “I’ve dedicated this video series to tell how much of this is actually CG, but do audience members even need to know? That’s a tricky question because they don’t [need to know],” remarks Jonas Ussing, VFX Supervisor at Space Office VFX. At the 2024 FMX Conference, Ussing co-hosted a VES panel discussion named after his series with VES Executive Director Nancy Ward and VES Board Chair Kim Davidson, where he debuted the fourth and final video.
Continues Ussing, “I didn’t work on Top Gun: Maverick, but it’s my understanding that the people who made those CG jets would be perfectly fine if the audience assumed that it was all practical. The same way a stuntman’s finest achievement is that the audience does not even notice or think about it when the director cuts between an actor and him. It’s just James Bond. The problem comes when the studio shoves one kind of film artist under the bus [in response to harsh criticism of the VFX in some films]. And what do the studios gain from it? There is an enormous publicity value in saying how practical the films are. Just read any Reddit, Twitter and YouTube comment section. People go crazy when they realize that this is real filmmaking and [believe] no CGI was used on Barbie [which Ussing points out in his series actually had 1,300 visual effects shots, 20 of them fully CG].”
“Some filmmakers are coveting what they see as a ‘badge of honor’ in downplaying or disregarding the vital role of visual effects in bringing their stories to life, and the VES is steadfast in proclaiming that VFX must be brought into the light,’’ says VES Board Chair Davidson. “VFX is an instrumental part of the creative process that works in service to story, and VFX bring stories to life that were once impossible. VFX artists and innovators deserve to be respected and recognized as agents of cinematic storytelling, in the same breath as other creative collaborators, and not cast aside as if they are detractors dispelling an illusion of ‘pure’ filmmaking. Speaking in one voice for our more than 5,000 members in 45+ countries worldwide, visual effects artists are proud partners in the creative process, and they need to be uplifted and given proper credit for their enormous contributions.”
For better or worse, the computer has become the central tool in creating visual effects. “It’s a double-edged sword,” admits John Dykstra, a visual effects pioneer who had to come up with optical rather than digital solutions because the latter did not exist. “The really good aspect of it is you can build an image a pixel at a time and include enough accuracy in the construction to make it indistinguishable from the real one. The negative part of that is you also have to be selective about what you create. Just because you can do it doesn’t mean that you should. The idea that everything is done on one tool to a certain extent has also taken some of the fun out of it. We used to put together some crazy rigs to mount cameras on airplanes, hot air balloons, motorcycles and cars, and a lot of that invention has been replaced because you can create anything within a box.”
“I’m consistently amazed that the response to telling someone I work in visual effects is, ‘Oh! You work on computers!’ – as if the editors are still hand-splicing film or the art department avoids using Photoshop because it’s impure,” notes Jake Morrison, VFX Supervisor at Marvel Studios. “As a vinyl lover, I appreciate the analog process, but I won’t bash an album that was mastered using Pro Tools if the music is good.”
Not only does the quality of CGI need to be taken into consideration, but also how the scenes are photographed. “There is a look and feel to modern visual effects films that audiences have gotten quite used to,” director James Mangold observes. “Some of it is the way the effects are rendered and some is the way that they’re shot. There is a certain style of shooting that erupts from shooting on stages and in large greenscreen areas, which is often swinging a [crane] arm around the lot; there are fewer cuts and more ridiculous oners that are only possible because there are so many elements that you can bring all of these pieces together in one shot or one take because it’s a cheat. We tried to avoid that on [Ford v Ferrari]. We tried to shoot the movie even when visual effects were involved so that the film felt physically like we were shooting real cars. On top of that, our goal was always to shoot real cars whenever possible.”
Holding back information on how a movie or television show is actually made hinders aspiring filmmakers who, in turn, become the leaders in their professions. “I was a big fan of the movies, especially when you were looking at something that they couldn’t have gone out and shot somewhere, like a spaceship flying or an alien planet,” recalls John Knoll, CCO at ILM. “This was all being crafted by artisans. I was fascinated by how that was done. This was before the Internet, so there weren’t loads of sources of information about this stuff. For me, one of them was American Cinematographer. There were some behind-the-scenes articles that covered visual effects, and at the University of Michigan where my dad taught, the Art and Architecture library had a subscription. I had access to the library. I would go there and read some of the old back issues and look things up. Learning how the stuff was done and starting to experiment with trying to do it myself was one of those things that I played with as a kid.”
Visual effects are woven into the fabric of what makes James Cameron a filmmaker. “Avatar: The Way of Water is three hours long,” Cameron remarks. “There is not one second of that three hours that is not a visual effect. Not one second. It more plays by the rules of an animated film, like Pixar, except the end result doesn’t look the same. It looks like photography and has its own unique process. We used to call it special effects because they were special. When they’re not special anymore, what do you call them? To me, they’re not visual effects anymore, but the imagemaking process.”
“Some filmmakers are coveting what they see as a ‘badge of honor’ in downplaying or disregarding the vital role of visual effects in bringing their stories to life, and the VES is steadfast in proclaiming that VFX must be brought into the light. ... VFX artists and innovators deserve to be respected and recognized as agents of cinematic storytelling, in the same breath as other creative collaborators, and not cast aside as if they are detractors dispelling an illusion of ‘pure’ filmmaking.”
—Kim Davidson, VES Board Chair
Warner Bros. attracted attention by removing bluescreens from behind-the-scenes imagery of groundbreaking Barbie. (Image courtesy of Warner Bros.)
Comments by Ridley Scott were taken out of context in the media, which had him declaring that there were no visual effects in Napoleon when there were 1,046 shots that required digital augmentation for soldiers, architecture and natural elements. (Image courtesy of Columbia Pictures/Sony)
UNSUNG HEROES: VFX DESIGNERS POPULATE FILMS WITH AN INVENTIVE CAST
By BARBARA ROBERTSON
They start with an idea, sometimes just the germ of an idea. Then, once they put pencil to paper or cursor to screen, they become the first people to show the world a character or creature that had previously existed only in someone’s mind’s eye. How do they do this? The short answer is iteration. The long answer involves years of study and skill perfecting, tons of research, remarkable imagination, communication skills and, some might say, an innate talent.
Character designers in a visual effects studio might work within an art department alongside modelers, texture artists, riggers, groomers and a facial performance team or on visdev teams. Their title might be concept artist, art director, visdev artist, lookdev artist and sometimes that all-embracing title, digital artist. Whether the client wants a visual effects character to bring a tear or scare a 12-year-old, and whether the character is a digital human, humanoid, talking animal, realistic animal, monster or fantasy creature, the principal goal is the same: Convince the audience the character is alive and fits within its world. And have fun creating it.
“What drives me is seeing a character come alive,” says Klaus Skovbo, who leads character design teams at MPC.
But long before a character is fully alive, character designers work with their clients to understand how visuals can support a story.
THE BRIEF
“It depends on what stage the movie is when it comes to us, but most of the time we get a creative brief from the director or producer,” says MPC Art Director Léandre Lagrange. And sometimes, the client hasn’t gotten that far. Men director Alex Garland came to Framestore wondering if the movie he had in mind of a man giving birth could even be made. “It’s pretty crazy stuff,” says Sam Rowan, Framestore Senior Concept Artist. “Very literal. We did several sequences of this male character giving birth and showed it to him. He said, ‘OK, that’s brilliant. I now know the film can work.’ That was a great experience for us. We greenlit the idea in his head.”
By contrast, Rowan’s work on Niffler, a fan-favorite platypus-like creature, is a more typical example. When he began working on Niffler for the first Fantastic Beasts film, he received descriptions and ideas. “They wanted to keep the Niffler grounded in the real world so muggles could see it and not realize it isn’t just a duck-billed platypus,” Rowan says. Working with production VFX Supervisor Christian Manz, Framestore Senior Animation Supervisor Pablo Grillo and other animators and artists including Ben Kovar, the creature took shape. “It was a nonlinear process,” Rowan says. “He is such a key character in the film and in so many shots, we spent a lot of time on this guy.”
His brief for the character Chupacabra in the second Fantastic Beasts film was simply four lines from the book and a Wiki link to the legendary creature from the Americas whose name literally means “goat-sucker.” “That creature was coyote-sized,” Rowan says. “We didn’t have the script, so the creature I drew was too big, and it turns out in the script that it’s a small pet. I had to go back to the drawing board. They didn’t like the shape, but they liked the translucent skin. So, I made one that was like a lizard with a mane and limbs like trunks. Another had horns instead of arms. One was a combination of a bird and a lizard. I did loads of drawings. We settled on one that looks cute at first and then you see its teeth. At the end, he lost his translucence. Sometimes, you can have too much.”
In that example, Rowan and the artists at Framestore had freedom to design the Chupacabra character until reined in by the script. Often, though, character designers work within an even more predefined box, as did Sr. Concept Artist Casey Straka of Industrial Light & Magic. Straka designed the characters for the television series Percy Jackson and the Olympians, including Chimera and Cerberus. “We had an idea of what they should be from the Greek myths,” Straka says. “But we were given room to come up with ideas. It was like having finger holes in a box that grounded us.”
Percy fights with Chimera in the Gateway Arch in St. Louis, so Straka needed to find a way to have a mythical yet realistic creature, half-lion and half-goat, work in that environment. The character had to reference the past yet be something new. “I tried different kinds of mixtures of lion, goat and snake,” she says. “What if the goat is a wart on the side of a lion’s body? What if the tail is a snake? We tried horns and different snake patterns. The final design had a cool, spiky head. We wanted it to be imposing and scary to a 12-year-old.”
For Cerberus reference, Straka started with her own reference library, looking for dogs she might use for the three-headed creature. “I looked for dogs that are native to Italy and Greece, Italian Greyhounds, Cane Corsos. I felt lucky getting to look at dog pictures all day.” Eventually, with feedback from the show’s writers and author Rick Riordan, Cerberus became a three-headed Rottweiler. “I tried to give each head a personality,” Straka says. “I love how it turned out. One is drooling, one has an ear cocked up, one is not paying attention. Making stuff up is my favorite thing.”
TOOLS
Framestore artist Daren Horley began working on dinosaurs for a series of television shows and then moved on to visdev teams for film projects. Like many of the artists who design characters, he uses ZBrush and Photoshop. “I do a bit of ZBrush modeling, but the key is to get things done really fast,” he says, echoing statements from other character and concept artists. “We have to turn over ideas rapidly, try out things without getting too involved in building anything in 3D. When a client doesn’t know what they want until they see it, that’s when I bring a lot of design skills into play and explore ideas. I can paint in Photoshop quite quickly. I don’t want to labor on something that might not be what they want.” For example, given an open brief to create a vulture-like character for The School for Good and Evil, Horley needed to come up with something new from scratch. Among his iterations was a bird-like creature that had wings made from modified finger joints with branches on each side. “I couldn’t know what they wanted until they saw something,” he says. “They’d like some elements and offer more ideas. It was a collaborative process.”
GENERATING IDEAS
Most designers have personal reference files that they augment with online searches to begin generating ideas that might result in a character no one has seen before. Framestore’s Sam Rowan added another idea. “Researching on Google is great, but you are restricted by your search ideas,” he says. “So, I went on Amazon and bought some old, second-hand encyclopedias for kids that have loads of pictures in them.” Rowan picks up a book titled Animals, riffles through the pages and stops on one. “I open pages randomly,” he says. “Maybe I need to do a tiger character, so I’ll pick a random page and wonder if I could do a tiger character colored like a pigeon.” He opens to another page. “What if the tiger had amphibious feet? Or,” he says, getting another book, “maybe I could make the tiger out of stone.” For his part, Framestore’s Horley likes to leave his desk and look at materials in the world outside. But what about AI for reference? “Clients sometimes come to us with AI images and that can help start conversations,” Horley says. “But these images don’t have finesse. They aren’t usable.” ILM’s Straka is firmly against AI for other reasons. “I don’t like a machine that takes from other artists and does the fun parts of my job,” she says.
MAKING THE CHARACTER WORK
“Clients always want us to give them something they’ve never seen before,” Horley says, “and so many designs have been done over the years, it’s a challenge to do something new.” But the characters must be more than unique to the eye. Designers need to consider that a creature or character depicted in 2D artwork will eventually become a rigged, animated 3D character.
TOP: The mythical three-headed dog Cerberus in Percy Jackson and the Olympians gave ILM Concept Artist Casey Straka an opportunity to experiment with a variety of canines, from Greyhounds to Cane Corsos, before she settled on a design based on Rottweilers. (Image courtesy of Disney+)
BOTTOM TWO: Framestore Character Designer Daren Horley wanted something other than a mummified vulture for the character Stymph in The School for Good and Evil, so he created wings from modified finger joints. (Images courtesy of Netflix)
“A lot of times, a visual effects supervisor might present a beautiful artwork to us for a creature that looks fantastic,” says Gino Acevedo, Creative Art Director at Wētā FX. “And the artwork has already been bought off by the director who thinks it’s really cool. But often we can see potential issues from an anatomical standpoint.” For example, he remembers receiving artwork for a character that had spikes on its shoulder. Acevedo could immediately see that if the character raised its arm, it would poke itself in the ear.
“When you’re designing a character, you have to bear in mind it will move and give a performance,” Horley says. “No matter how outlandish, you have to always refer back to nature, to animal anatomy. We generally do neutral poses. How it moves comes later, but it has to be anatomically feasible. You have to get reference from real-world animals.” Drawing from reality is true for textures as well as anatomy. “For textures, I also like to find things that are unusual but organic, like lichen on trees, rust on metal,” Horley says. “Maybe a satellite photo of a continent. If you’re creating a creature’s skin, you can go to unexpected sources to get inspiration.”
“When we send an ‘art ref pack,’ that goes to the models and lookdev department. It contains everything needed to create the creature,” Wētā’s Acevedo says. “I’ll imagine what I’d need if I were doing the texture. But many times, we create our own textures.”
In terms of creating a creature’s skin and fur, Valentina Rosselli, MPC Look Development and Texturing Lead, sometimes takes real-world reference to an extreme, especially when designing a creature’s fur. “Animals have extraordinary fur colors,” she says. “Some even have stripes along individual hairs. We can design the look of each fur layer and mix them together to get more accurate and realistic color patterns.” She used this workflow when working on the gorilla in The One and Only Ivan, and on Honest John, the fox in Robert Zemeckis’s Pinocchio. She also paid attention to the characters’ eyes. “Eyes are normally the first thing we start to explore,” Rosselli says. “We study eyes and expressions to make sure that even in a still, the character has a soul. Ivan needed to be emotionally engaging. In the case of the fox Honest John, we needed to design a character with human eyes and eyebrows and keep the actor’s eye color. We started from a fox animal eye and then, trying to find the right balance between iris size and amount of white sclera, explored how human the eyes could become.”
INTO THE PIPELINE
Once a character leaves the hands of the designers, it begins its journey through the VFX pipeline, then through final modeling, texturing, rigging, animation, lighting and rendering. “Supervising a character design build can sometimes be designing by proxy,” MPC’s Skovbo says. “You help guide a talented group of artists within our MPC Character Lab, trying to get someone else’s idea through your mind and into the minds of individual artists. Modelers do one part, texture another. You hope it all comes out good and is on design.”
The designers hand off as much information as they can to the next artists in the pipeline: their drawings; the reference they’ve used; sometimes, perhaps color and lighting keys; and keyframes to show exaggerated poses in key moments. And sometimes they help visualize the backgrounds as well: the character’s context.
“For Jordan Peele’s Nope, because all the skies are CG and the creature is in the clouds, we wanted to control the mood of the skies,” MPC’s Lagrange says. “We broke down every scene of the script and designed every sky for the movie. That was fun. But what we do most are characters and creatures. We have conversations with Klaus [Skovbo] and the other artists as well to let them know things we want to draw attention to and to avoid. We don’t want to lose those meaningful design moments.”
Despite spending hours, perhaps days and weeks designing a character, the artists take letting go of their creations in stride. “Although sometimes we might hand off a ZBrush file with mattes, mostly we hand off images,” ILM’s Straka says. “We send our ideas out into the world and don’t follow them through. I think, OK, this is my part. It’s cool if it ends up the same on screen, but it’s the nature of the game. There are so many people involved in visual effects. You just have to let something go and be what it will be.” Framestore’s Rowan remembers a furry character he had worked on for a long time, and that after he finished working on the project, it turned into what he describes as a lizardy, fishy thing. “In a dream world, you could follow your design through the process, but that’s not always available,” he says.
A CHANGING WORLD
As the tools advance, photorealistic and fantasy characters have become more believable, and that in turn has inspired writers and directors to want more. “We’re doing many characters now,” says MPC’s Skovbo. “The build list is bigger, and it can be because the technology and pipeline allow it. Facial performance is something we’re always trying to push. One of the biggest challenges we have, especially on Disney movies, is landing in the middle between real and anthropomorphic. We use the word ‘appeal’ a lot.”
As for types of designs, Framestore’s Horley sees a trend toward less flashy visual effects. “If a magical creature is doing something with its magical powers,” he says, “it’s not lightning bolts now, it’s maybe magnetic fields. Something more grounded. More believable and less spectacular.”
With more demand for characters and creatures, new opportunities are likely to open for artists to consider character design as a career. ILM’s Straka urges aspiring designers to challenge themselves to make something new. “Study your fundamentals, but on top of that, figure out what you like to see in creature design. Make your cool stuff. Make the self-indulgent art that gets your ears burning. Challenge yourself to make stuff outside what you like to do. And have fun with it. The work that stands out the most is the art you can tell the artist had a good time making. You think, ‘Oh man, this is really cool. The artist really loved making that.’”
MPC’s Lagrange adds a bit of advice: “When you get bogged down in technicalities, storytelling can get forgotten. But, the motivation behind color and shape is the storytelling.” That’s the one thing that hasn’t changed and isn’t likely to: the germ of an idea that will become a character starts with a story. “It’s all about the story,” Skovbo says.
Creating Bloater for The Last of Us
The 2024 VES Award for Outstanding Animated Character in an Episode, Commercial, Game Cinematic or Real-Time Project went to Gino Acevedo, Max Telfer, Pascal Raimbault and Fabio Leporelli for creating the character Bloater in the episode “Endure & Survive” for the series The Last of Us. These award winners from Wētā FX’s art department started with a huge rubber suit designed and fabricated by special effects makeup artist Barrie Gower. But the large performer fitted inside the suit couldn’t move fast enough or with enough agility for the director and visual effects supervisor. So, Wētā artists sent the design to the gym.
“We started with a scan of the prosthetic suit and then widened the shoulders and lengthened the legs,” Gino Acevedo, Wētā FX’s Creative Art Director, says. “Then we added the cordyceps, a particular type of fungus that grows inside other organisms. For reference, we had artwork done for the video game and pre-designs from Barrie.”
Cordyceps? Acevedo gives an example of a cordyceps that grows inside an ant, turning it into a zombie, living but under the control of the fungus. “This is real stuff,” he says. “It’s the premise of the show, that these cordyceps grow inside and burst out from the characters’ heads. The Bloater was a lot of fun to create. I love my monsters.”
VFX IN EUROPE: BUILDING MOMENTUM
By OLIVER WEBB
OPPOSITE TOP AND MIDDLE: The Yard won the 2023 César Award for Best Visual Effects for its work on Notre Dame on Fire. (Images courtesy of Pathé and The Yard VFX)
OPPOSITE BOTTOM: UPP supplied VFX for Gran Turismo. (Image courtesy of UPP and Columbia Pictures/Sony)
The European visual effects industry has evolved into a global VFX leader after a notable resurgence in the last decade. European studios have also been dominating awards ceremonies across the globe, and Hollywood now relies on the impeccable work of European studios. Studios such as UPP, RISE, The Yard, Important Looking Pirates and DNEG ReDefine are in strong demand. Cities including Paris, Prague, Stockholm, Copenhagen, Munich and Berlin form the hubs of Europe’s visual effects industry. Events such as The View, Annecy and FMX are also held across Europe each year. Despite setbacks caused by the COVID-19 pandemic and the actors’ and writers’ strikes, there is currently a high demand for visual effects. It should also be noted that the implications of Brexit have prompted many VFX artists to leave the United Kingdom, further boosting the industry across continental Europe.
One of the world’s leading VFX studios, Scanline was founded in Munich in 1989. Known for having developed the proprietary fluid simulation software Flowline, Scanline was the recipient of a Scientific and Technical Achievement Academy Award in 2008. Another leading VFX studio founded in Germany is RISE. Currently employing 420 artists across five studios, RISE has contributed to many Marvel Studios projects such as Avengers: Infinity War and Avengers: Endgame. “I’m also terribly proud of our more recent work on Fantastic Beasts 3 [The Secrets of Dumbledore], The Last Voyage of the Demeter and Guardians of the Galaxy Vol. 3. We’re currently working with Francis Ford Coppola on his movie, Megalopolis. To be working with him is a dream come true,” says Florian Gellinger, Owner and Executive VFX Producer at RISE.
“We’re not trying to position ourselves to be specialized in a certain area of VFX work,” Gellinger adds. “Our main USP is that we’ve been swimming among the big VFX houses for more than 15 years,
“We would have expected more shows to be back filming already in January, but there is certainly momentum building and Q3 should be completely bonkers. Currently, we’re working with a couple of German production companies on their upcoming features. Not being fully dependent on the international market is a huge benefit in times like these.”
—Florian Gellinger, Owner and Executive VFX Producer, RISE
years, providing designs and ideas on eye-level in terms of quality and delivering work in scope you would usually not expect from a vendor of our size. When we opened 17 years ago, we would have never dreamed of co-designing the Avengers’ Thanos “Snap and Blip,” and we certainly know that we still can’t compete in sheer quantity – but we’re delivering on every promise we make, no matter if it’s FX design, creature animation, CG water or environments. That enables us to constantly evolve while offering our teams variety and our clients a very good alternative.”
One of the major players in Europe is UPP, which has just celebrated its 30th anniversary. With nearly 400 employees across offices in Prague, Budapest and Slovakia, UPP won Outstanding Special Visual Effects in a Single Episode at the 2023 Emmys for its work on the Five Days at Memorial episode “Day Two,” and at the 95th Academy Awards in 2023 it was nominated for Best Visual Effects for All Quiet on the Western Front. UPP’s other recent projects include Gran Turismo, Barbie and Extraction 2. “It’s a very traditional VFX company with a lot of traditional background from filmmaking and special effects,” says UPP CEO Viktor Müller. “We are in a very strange place for visual effects, being that we are in the center of Europe. When we started in the ‘90s, we were always trying to get the technology and get the people, and it was always expensive for us. I love this business, and I started when I was 16. UPP started when I was 18. I can’t imagine doing anything else. So, for me, it’s important: There are people who have been with UPP for years. We have new people, too, that we love as well. I’m proud of that. That’s the reason I never wanted to sell the company, because it is a community. Being the best company isn’t my goal; I just want to get nice work for the team and do the best work we can.”
“Through strong relationships we forged with filmmakers early on and, of course, based on the quality of our artists’ work, we began to get a foothold in the U.S. film and television industry pretty quickly,” Müller continues. “Winning an Emmy in 2003 for our work on the Children of Dune miniseries made the industry as a whole sit up and take notice. We’re also not strictly a VFX house. We have a post house and an advertising department; we have virtual production, mocap, pre and postvis departments and 3D scanning. Most of our work comes out of the U.S. or U.K. – or even Germany on occasion. In terms of 2024, we’re bidding on some very exciting projects, but if the projects coming through our door, in film and television or streaming, are any indication, I think our 30th year is going to be a very exciting one with a lot of creative challenges and opportunities once the business really starts to ramp up again.”
The Yard is a French independent creative visual effects studio founded in 2014 and dedicated to feature films and episodic content. The Yard has worked on a number of recent international projects, including Ford v Ferrari, WandaVision, Nomadland, The Gray Man, Enola Holmes 2, John Wick: Chapter 4, Indiana Jones and the Dial of Destiny and Halo, and is currently working on The Lord of the Rings: The Rings of Power Season 2. “These are great examples of the diversity of VFX projects our teams have been entrusted with by our clients, often as the sole independent VFX vendor based in France,” says The Yard Founder and VFX Supervisor Laurens Ehrmann. “We’ve also worked on various French productions, such as Les Indésirables and Notre Dame on Fire, for which we won the César Award for Best Visual Effects. All these projects presented new challenges, facilitating our growth and the fortification of our infrastructure and pipeline. We are proud of all our projects, but I would single out Ford v Ferrari as a pivotal milestone in our journey, propelling us to new heights.”
Important Looking Pirates dived into VFX for Indiana Jones and the Dial of Destiny (Image courtesy of Walt Disney Studios)
RISE contributed to The Last of Us. (Image courtesy of RISE and HBO)
“For the past 10 years, we have worked on approximately 60 projects for both international and French productions across various formats, including theatrical releases and streaming fiction,” Ehrmann continues. “Today, we are proud to be one of the very few studios to have worked on the most ambitious projects, with the largest VFX budgets in France. This year, two of the largest-scope international productions have chosen The Yard to craft their VFX. We are also proud to bring together the top visual effects artists who have proven their skills in the most renowned international studios and are now looking to return to France. All of our heads of departments have spent many years working around the world – at DNEG, Framestore, Rodeo, ILM and MPC, among others. Recently, The Yard expanded its presence in France with new offices in Montpellier and Lille. This expansion allows us to provide different locations for our new artists. These new offices also enable us to be in close proximity to ARTFX, a world-class VFX school. In 2023, ARTFX was recognized by The Rookies artists network as the top school in the Special Effects category for the fifth consecutive year. Their students won a VES Award in February for Outstanding Visual Effects in a Student Project for ‘Silhouette.’ This unique school/vendor partnership is crucial in supporting our growth with the best French VFX talent.”
Part of the DNEG group, ReDefine is a global team working across 16 studios from North America to Europe and India. In continental Europe, ReDefine has three studios: ReDefine Barcelona, Spain (opened in November 2022 and led by Heads of Studio Jordi Cardus and Daniel Buhigas and Creative Director Patric Roos), ReDefine Sofia, Bulgaria (opened in April 2023 and led by VFX Supervisor Peter Dimitrov and VFX Producer Elena Rapondzhieva) and ReDefine Budapest, Hungary (opened in June 2023 and led by VFX Supervisor Ashraf Elsayed Hassan). ReDefine’s latest VFX projects include Borderlands, Those About To Die, Dune: Part Two and Renegade Nell. “We have worked alongside DNEG on a number of large shows as well as on a wide range of other projects. The company brings a fresh and dynamic approach to visual effects and animation for features and episodic series. It leverages DNEG’s legacy of creative and technical innovation to cater to projects that benefit from its agile, boutique approach,” says Rohan Desai, Managing Director of ReDefine.
Important Looking Pirates was launched in Stockholm, Sweden, in 2007 by artists Niklas Jacobson and Yafei Wu, who had been working in L.A. and London. “They wanted to bring back all the cool work that was done over there [in L.A. and London] to Sweden when they decided to move back to be close to family and friends. We see ourselves as a high-end, medium-sized boutique VFX facility mainly doing CG-heavy work. We have around 200 full-time employees, going up to 250 including freelancers at times, across Stockholm [main office], London and Hamburg,” says ILP Senior Executive Producer Måns Björklund. “We are very proud of all of our projects. Some of our latest projects include Avatar: The Last Airbender, Shōgun, Indiana Jones and the Dial of Destiny, The Hunger Games: The Ballad of Songbirds & Snakes and Fallout.”
Several European countries support the film industry with numerous government incentives. The French government and the CNC (the French National Centre of Cinema) have worked hard to promote and support the VFX industry as well as the broader film industry. “The standard tax rebate for International Production for incoming productions is 30%, with a minimum expenditure of 250K euros ($270K),” Ehrmann explains. “Since 2020, international productions can benefit from a 10% extra on top of the normal 30% when the VFX-related expenses with a French VFX studio reach at least 2 million euros ($2.1 million). You do not necessarily have to shoot in France to benefit from this scheme; you can partner with a French vendor only for your VFX needs. According to the latest figures from CNC, since 2020, over 116 million euros in expenditure commitments have been registered thanks to this scheme, and the share of VFX-only projects is constantly increasing.”
TOP TO BOTTOM: Important Looking Pirates contributed to Indiana Jones and the Dial of Destiny. (Image courtesy of Walt Disney Studios)
The Yard was tasked with recreating old London for Enola Holmes 2, paying close attention to the architecture of the period. (Images courtesy of The Yard VFX and Netflix)
Second Tour is one of The Yard’s most recent French productions. (Image courtesy of The Yard VFX and Pathé)
Germany also offers government incentives that support its VFX industry. The German Federal Film Fund’s DFFF 1 is a tax rebate of 20%, rising to 25% for projects with German production costs of more than €8m ($8.4m). DFFF 2, meanwhile, is a 25% tax rebate with a per-project cap of €25m ($26.25m), and it can be used for entire productions as well as for VFX-only work.
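The arithmetic of these schemes is simple enough to sketch in code. A minimal illustration, assuming only the headline rates and thresholds quoted above and deliberately ignoring the eligibility fine print (which costs qualify, combinability, annual caps and so on):

```python
def french_trip_rebate(qualifying_spend_eur: float, french_vfx_spend_eur: float) -> float:
    """France's international production rebate as quoted above: 30% of
    qualifying spend (minimum EUR 250K), rising to 40% once VFX expenses
    with a French studio reach EUR 2M. Fine print deliberately omitted."""
    if qualifying_spend_eur < 250_000:
        return 0.0
    rate = 0.40 if french_vfx_spend_eur >= 2_000_000 else 0.30
    return qualifying_spend_eur * rate

def dfff_rebate(german_costs_eur: float, scheme: int = 1) -> float:
    """DFFF 1: 20%, rising to 25% above EUR 8M of German production costs.
    DFFF 2: a flat 25%, capped at EUR 25M per project."""
    if scheme == 1:
        rate = 0.25 if german_costs_eur > 8_000_000 else 0.20
        return german_costs_eur * rate
    return min(german_costs_eur * 0.25, 25_000_000)

# A production spending EUR 5M on VFX with a French vendor clears the
# 2M uplift threshold, so the rebate jumps from 30% to 40%:
print(french_trip_rebate(5_000_000, 5_000_000))  # 2000000.0
```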
UPP’s Müller admits that the Hungarian government is much better at supporting the film business overall than the Czech government. “We have two offices, and one is in Hungary,” he says. “Tax incentives are 35%-37% in our Hungary office. There are incentives [in Prague, too], though, to be candid, they could be much more competitive. We’re fortunate to be able to avail ourselves of the incentives in Hungary because we have artists and technicians there working on all of our projects, which I think our clients appreciate.” Similarly, government incentives in Sweden could be improved. “There is a fund, but it isn’t very big, unfortunately, and it has been hard to get access to it,” Björklund adds. “I think there needs to be more done on that front to make Sweden more competitive and in line with the rest of Europe.”
In terms of future industry trends, Müller remains optimistic about AI and sees it as a great tool. “I’m not worried about AI like some people are,” he notes. “AI will never replace truly talented artists. I’m not scared that the film industry will change so much that there will be no visual effects; they will always be part of it and have been since the very first films. It won’t be an easy time for a few years. For example, we will see how the film and gaming industry will merge. I believe there is one point where they will become much closer. Now, though, it will be a tough time because companies have to think about their products a little bit more. The other thing is that children and younger audiences are watching shorter and shorter programs and films, while Hollywood is generating longer and longer films. I think, at this moment, the industry as a whole is kind of amid a revolution or a reset, and a lot of people aren’t sure where things are going to land. Personally, I think it’s ultimately going to be a good thing. I also think that cinema in particular is primed for a major comeback. All of that said, I think the pendulum is going to swing for a while longer before it lands on whatever the new normal is going to be.”
Gellinger is more cautious about the impacts of AI. “The use of AI tools based on other people’s work is a huge issue, and I’d like to see both our competitors and our clients condemn their use as long as the artistic source materials used for ML have not been disclosed,” he argues. “We’re not against the use of AI; we use it in many different areas. But we can’t allow it to mass-harvest the ideas of other people on the internet, to be selectively exploited through a remix machine. It seems we’re currently in the ‘Wild West’ age again, like in the late ‘90s when the music industry almost collapsed, and everyone is either watching or contributing to its demise. The term ‘Generative AI’ is highly misleading because it’s not generating anything new or fresh by itself.” Gellinger further notes that RISE is working through the aftermath of the actors’ and writers’ strikes without, knock on wood, major casualties. “We would have expected more shows to be back filming already in January, but there is certainly momentum building and Q3 should be completely bonkers. Currently, we’re working with a couple of German production companies on their upcoming features. Not being fully dependent on the international market is a huge benefit in times like these.”
For Björklund, the most pressing issue is getting back to a more normal workload, from being super busy after the pandemic to now, post-strikes, being a bit less busy. ReDefine, however, is focused on opportunities rather than challenges. “ReDefine operates a technology group from its studio in Barcelona, the Advanced Development Group or ADG, which leverages real-time technology and AI tools to find innovative and impactful solutions to filmmaking challenges in collaboration with DNEG’s virtual production and R&D teams,” Desai explains. “Adrien Saint Girons heads up the team. Adrien’s work with the ADG is further expanding ReDefine’s suite of offerings in Europe to meet the continuously growing demand for the company’s services.”
For The Yard, the COVID-19 crisis led to the number of projects increasing drastically as streaming platforms experienced explosive growth. “This led to the emergence of many new vendors while existing ones expanded to accommodate the increased workload,” Ehrmann says. “However, with the onset of strikes last year, the industry paradigm has shifted. Studios are now aiming to undertake fewer projects, with an expected reduction of over 30%, impacting the volume of work for the global VFX industry. With fewer projects and the proliferation of vendors, I do hope we don’t get into a price war, which would inevitably favor the studios over the vendors. We are facing a downturn in global VFX activity. Still, there is a shared optimism that things will begin to pick up around Q3-Q4 this year. From my point of view, our primary challenge is to maintain our crew size to ensure a swift restart once activity resumes.”
Ehrmann, however, remains positive about the outlook of the French VFX market for the rest of 2024. “For a couple of years now, international studios have chosen to partner with French companies for multiple film services such as shooting, production and VFX, as they can benefit from attractive government incentives. Additionally, the high quality of French art schools and the pool of local talent have reinforced France’s appeal to numerous global creative companies. One Of Us, MILK VFX and, most recently, Rodeo FX have settled in our territory, joining French vendors who already operate on the international market, like The Yard, Light, BUF and MPC Paris. From my perspective, this is a positive trend that demonstrates the attractiveness of France’s VFX industry on multiple levels, presenting a great opportunity to attract larger-scale projects.”
The Yard has worked on large-scale Hollywood productions such as John Wick: Chapter 4 (Image courtesy of Lionsgate Films)
Ford v Ferrari was a pivotal milestone in the evolution of The Yard VFX. (Image courtesy of The Yard VFX and Twentieth Century Studios)
Netflix’s The Gray Man was a recent international project for The Yard. (Image courtesy of The Yard VFX and Netflix)
EXPANDING THE MIND FOR INSIDE OUT 2
By TREVOR HOGG
Images courtesy of Disney/Pixar.
TOP: The red button labeled puberty, which was introduced in the original film, gets activated in Inside Out 2
OPPOSITE TOP: The Real World has a more muted color palette.
OPPOSITE BOTTOM: Joy interacts with a memory sphere found in Riley's Belief System.
Inside Out explored the psychological implications of a tween forced to deal with her family moving to another part of the country and having to leave her childhood friends behind. Filmmaker Pete Docter observed that sadness is as essential to the human experience as other emotions such as joy, anger, fear and disgust. Nine years later, Pixar releases Inside Out 2 under the direction of Kelsey Mann. “Inside Out ended with such a great line, which was Joy saying, ‘After all, Riley’s 12 now. What can happen?’” director Mann notes. “They had set up the new console, and there’s a little thing on the console that says ‘puberty,’ and they had no idea what it meant. I had a great opportunity to see what I wanted to see next as an audience member, and what I wanted to see was that puberty alarm going off!”
Expansion was important to the cast and character design. Joy has to deal with Anxiety, Envy, Embarrassment and Ennui. “The original Emotions were simple shapes – Anger is a block and Sadness is a teardrop,” Mann observes. “I wanted to expand the cast and vocabulary of shapes. Figuring out what those shapes were was a fun challenge. I wanted a tiny and giant Emotion!” A common factor was discovered when making a list of sequels that Mann admired. “They open doors of the world that were just off-camera that I didn’t know were there. The Stream of Consciousness was something interesting that we didn’t get to see in the first film. Early on, [Animator] Ralph Eggleston did some beautiful paintings of what the Stream of Consciousness could look like; he called it the Northern Lights in water. Then there’s the fun of what Riley is thinking about appearing and floating by, so you can literally see what she is thinking! We have a whole beat where she is suddenly very hungry and all of this food goes floating by the characters. Successful sequels don’t repeat but grow and change. The concept of puberty went perfectly with this because it’s all about change.”
Technology is all about change, too. “The Emotions are made of particles, and there were a lot of weird tricks that they had to do to get that look back in the past, which didn’t work anymore,” Mann remarks. “They had to reinvent new ways to get that look back to the original film.” Story is paramount at Pixar. “Everything we create here is a visual effect because there’s nothing there. I don’t think people fully appreciate the amount of work that it takes to make these movies because you don’t get anything for free,” Mann adds. One environment pushed the technology when it came to integrating effects, simulations and lighting. “The Belief System lives underneath Headquarters in the Mind World,” explains VFX Supervisor Sudeep Rangaswamy. “It’s this cavernous space with waterfalls and Memory Spheres floating around in it. There are these Belief Strings which can be plucked, and when you pluck them it says a belief that Riley has. We had to figure out what these strings look like, the plucking behavior and how the light changes when you pluck these strings. It was many different elements coming together that would have been difficult to achieve in the past. We had to develop new procedural ways to do the waveforms that happen when you pluck the strings, and also the interactive lighting effects.”
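Pixar’s tools are proprietary, but the “procedural waveform” Rangaswamy describes can be pictured as a standing wave whose harmonics decay after the pluck. A toy sketch of that idea, with every shape and constant invented purely for illustration:

```python
import math

def pluck_displacement(x: float, t: float, length: float = 1.0,
                       amplitude: float = 0.1, decay: float = 1.5,
                       frequency: float = 440.0, modes: int = 3) -> float:
    """Toy plucked-string waveform: a sum of standing-wave modes whose
    higher harmonics decay faster. x is position along the string in
    [0, length]; t is seconds since the pluck."""
    y = 0.0
    for n in range(1, modes + 1):
        spatial = math.sin(n * math.pi * x / length)          # mode shape
        temporal = math.cos(2 * math.pi * n * frequency * t)  # oscillation
        envelope = math.exp(-decay * n * t)                   # per-mode damping
        y += (amplitude / n) * spatial * temporal * envelope
    return y

# Sample the midpoint of the string shortly after the pluck.
print(pluck_displacement(0.5, 0.01))
```

A production version would drive glow, geometry and the interactive lighting Rangaswamy mentions off a curve like this rather than printing numbers, but the decaying standing wave is the kernel of the effect.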
Hair proved to be the common technological challenge for each of the new Emotions. “Anxiety has got this hair that sprouts out from the top of her head in all different directions, and that hair can also be used for acting,” Rangaswamy states. “When she gets particularly anxious, it will straighten up and look alarmed. The animators had controls to be able to adjust the placement of the hair, like guide controls, then the simulation had to work on top of that. There are also these special hairs that are made of tiny discs because all of the Emotion characters are made of these particles. That was a big challenge for the character, in particular, being able to get a performance out of her hair because it was such a big part of what she is like. Ennui is often leaning over in droopy poses. You are working against the simulation where you’ve got her hanging over a couch but wanting to preserve this swoopy shape at the bottom of her long hair that isn’t totally physically correct. Envy is small but has a larger portion of hair on her than a lot of the other characters. Managing that and making sure the performances where she is moving her hair around looked good was important. Embarrassment is often covered up, so hair was less of an issue with him. He’s got a big hoodie, so with him the challenge was a simulation issue of having the big hoodie garment wrap in a way that was appealing but realistic to the contours of the character.”
Maintaining consistency with the characters is critical, considering the number of animators working on the project. “We make a bunch of documentation at the beginning of a show and update it throughout as we figure out the character more,” remarks Directing Animator Amanda Wagner. “Each character has their own model sheet that tells you who the character is. If Kelsey had given us any sort of notes on what Anxiety means as a character, we’d put that information there. Any little rules that are specific to that character, or even if there’s video reference from the voice actors, we’ll link that so people can easily find all of the on-model information. Then we also built animation libraries for all of the characters so that people can stay on model and only have to worry about their acting. It gets them maybe 50% of the way there.” Even the Emotions exhibit a range of their own emotions. “We’ll be, ‘How would this character look angry but still be their core emotion?’ Ninety percent of the time they are their core emotion. Anger can still be happy, but what would an angry happy be?! The Emotions are interesting in that they still have emotions on top of their emotion. The older characters are a little easier because we can be like, ‘Does this feel like the Joy from the first movie, but elevated a bit more to go with the new story?’ We have to figure out these new characters from scratch. How do they move? How is the animator going to want to move them? All of the different characters have various languages on how they move. Sadness is droopy. Joy is more upright and always has a flowy line of action. What you have to think about with the new characters is, what is their main silhouette and how do they move?” Wagner says.
Separating the Real World from the Mind World is the camera language; the former is imperfect and grounded in physics while the latter is perfect and virtual. “On Inside Out 2, we took some of the language that was developed on the first movie and decided to push it even further,” remarks DP, Camera Adam Habib. “One of the first differences that audiences will probably notice or feel right away is that the movie is in widescreen. When you have this ensemble staging, it’s more fun to check in with a lot of them in the same frame without having to cut. Based on that change, we also decided to push the Human and Mind Worlds’ differences so that the Human World has an anamorphic look. It gives more of that texture of reality, imperfection and physicalness.” Habib adds, “In the Mind World, the characters are so appealing and their shape language is so simple and clean that they look good on a lot of different lenses, but really look nice on wider than you would think. The bread and butter of lenses for the Mind World is a 25mm spherical, and this is on a Super 35 film back. The anamorphic is the same film back but with the 2x squeeze.” Handheld was reserved for a specific emotion. “I wanted you to be able to feel what emotion is driving when you look at Riley in the outside world by the way the camera moves. Anxiety makes you feel that everything has a little bit of an edge to it and is more off-kilter,” Habib notes.
Dust was an important atmospheric in establishing the proper mood for Demolition Day at Headquarters.
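Habib’s lens choices translate directly into field of view. A back-of-envelope sketch, assuming a roughly 24.9mm-wide Super 35 gate and treating the 2x squeeze as simply doubling the horizontal angle captured onto the same gate (a simplification of real anamorphic optics):

```python
import math

def horizontal_fov_deg(focal_mm: float, gate_width_mm: float = 24.89,
                       squeeze: float = 1.0) -> float:
    """Horizontal field of view for a focal length and film back; a 2x
    anamorphic squeeze records twice the horizontal angle onto the gate."""
    effective_width = gate_width_mm * squeeze
    return math.degrees(2 * math.atan(effective_width / (2 * focal_mm)))

print(round(horizontal_fov_deg(25), 1))               # ~52.9 deg: the spherical Mind World lens
print(round(horizontal_fov_deg(25, squeeze=2.0), 1))  # ~89.7 deg: same gate with the 2x anamorphic
```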
“We’re doing some cool stuff with shallow focus in the third act,” Habib reveals. “There are other moments with Anxiety when we tried to make it a deeper focus look. It’s almost like ‘information overload’ was the feeling I was trying to get. We want to do that in a way that doesn’t distract from the main story, but hopefully, on a subliminal level the information level increases when Anxiety is driving.” There were other photographic opportunities. “What is fun about the new Emotions like Envy and Embarrassment is the scale gets pushed beyond the original film. Embarrassment is huge. Kelsey is sometimes saying that he is the Big Bird of this group of Emotions. Then, Envy is this tiny thing. It was tempting sometimes to go, ‘Should we scale her down or up a little bit?’ But we did a good job of sticking to, ‘He’s supposed to be big in the frame,’” Habib says. Characters are front and center in the Vault. “You’re going to laugh your ass off!” chuckles Production Designer Jason Deamer. “All of Riley’s secrets have been locked up. She still watches a 2D character called Bloofy who is on a show for three-year-olds and has a crush on a 3D video game character, Lance Slashblade. The characters in the Vault are the fun part because it was outside of what we normally do.” The clouds have a more prominent role in the Mind World. Deamer explains, “We have volumetric clouds in this one. What do clouds look like in the mind? If you do the same cumulus clouds in Florida, it’s going to look real, so we spent a lot of time designing almost playful clouds. We leaned into what a kid would draw, so they’re fun and simplified, but light-penetrating volumes in the sky.”
“I didn’t want to spend another entire movie inside of the Memory Banks because a lot of the first movie is [characters] lost in them,” Deamer remarks. “I was looking for ways to not do that, to do refreshing novel takes on the same world, like being on top of those Memory Banks. In this case, a lot of construction because Riley is changing as a human being. She’s a teen becoming an adult; she’s separating herself from her parents. Construction is a major theme in the design of the film. In the Sar-chasm sequence, her mind is expanding so they’re building more Memory Banks. There’s a lot of unfinished shelving that can collapse.” The color palette reflects Riley’s state of mind. “There is zero orange in the film until the first shot of Anxiety, which I can tell you meeting after meeting like, ‘No orange!’ But they were like, ‘Why?’ And I said, ‘No orange, damnit!’ From that point forward, whenever Anxiety has an effect on Riley in the Real World as well as in the Mind World, orange shows up,” Deamer notes.
An extremely complex sequence to execute was the Vault because of the integration of different animation styles. “We had a card based in 3D space to represent where the 2D animation would be and blocked the animation to that in the CG world,” Rangaswamy explains. “We completed a CG version of the shot, then had the 2D animators go in and do the hand-drawn work for Bloofy. That went through a process of shading and texturing on Bloofy to make him sit with the rest of the CG environment. Lots of different pieces moving around there.” Like his predecessor, Pete Docter, Mann shares a similar perspective toward emotions. “With these new emotions, we wanted to make sure it was coming not from an evil place because all of our emotions are in us to help us to survive,” Mann says. “They all have a sense that they are here to help us and that their reasoning is for the love of us as individuals. There are times when we got away from that. There are early versions of this movie where Anxiety was much more of the stereotypical evil mustache-twirling kind of villain. We realized that we needed to double down on their motivations and why they’re doing what they’re doing.” Conflict is essential to the narrative. “You don’t have a story unless some sort of dramatic thing happens at the beginning of the film, and as long as Joy is our main character in these stories, she’s going to have bad things happen to her!” Mann observes.
TOP: Director Kelsey Mann wanted to expand the size of the Emotions, which is why he made Embarrassment so big and Envy so tiny.
BOTTOM: Fear, Disgust, Sadness, Anger and Joy reach the Edge of Consciousness and have to decide whether it is worth the risk to carry on.
PLAYING THE LONG GAME ON A WIDER FIELD
By CHRIS McGOWAN
TOP: Diablo IV is an action RPG with much evil to conquer. (Images courtesy of Blizzard)
OPPOSITE TOP: The inside and outside of Hogwarts castle are featured in this open-world action RPG. Hogwarts Legacy was the No. 1 video game of 2023, adding together its releases in various formats. (Images courtesy of Warner Bros. Games)
OPPOSITE MIDDLE AND BOTTOM: First-person shooters Call of Duty: Modern Warfare III and Call of Duty: Modern Warfare II were so popular they both made the 2023 top 10. (Images courtesy of Activision)
Following the post-COVID slowdown of 2022-23, the video game industry is cautiously pressing forward. Games that were further along in the production pipeline are center stage, while big-name games with longer production cycles are still playing catch-up – which bodes well for gaming’s diversified future, if not its present.
The video games market generated about $184 billion globally in 2023, a small growth of 0.6% over the previous year, according to data firm Newzoo. Mobile games were forecast to generate $90.4 billion in 2023, a 1.6% decline year-over-year but still about half of the market. The console segment was expected to reach $53.2 billion, a 1.9% increase. PC was the biggest growth vehicle due to projected sales of $40.4 billion, a 3.9% increase. Newzoo projects that the total global games market will pick up the pace and reach $205.4 billion by 2026, with a world player population of 3.79 billion.
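Newzoo’s segment forecasts are mutually consistent, which is easy to verify. A quick check, assuming the three segments quoted make up essentially the whole market (Newzoo also tracks smaller categories):

```python
# Newzoo's 2023 forecasts by segment, in USD billions.
segments = {"mobile": 90.4, "console": 53.2, "pc": 40.4}
total = sum(segments.values())
print(total)  # 184.0 -- matches the ~$184B global figure
for name, value in segments.items():
    print(f"{name}: {value / total:.0%}")  # mobile ~49% ('about half'), console ~29%, pc ~22%
```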
The consoles PlayStation 5, Nintendo Switch and Xbox Series X/S sold 22.5 million, 16.4 million and 7.6 million units across the globe in 2023, according to Ampere Analysis, as quoted by the Financial Times. The PS5 had a banner year, its unit sales up 65%, while the Switch and Xbox were down 18% and 15%, respectively.
Hogwarts Legacy, Call of Duty: Modern Warfare III, Madden NFL 24, Marvel’s Spider-Man 2, Legend of Zelda: Tears of the Kingdom, Diablo IV, Call of Duty: Modern Warfare II, Mortal Kombat 1, Star Wars Jedi: Survivor, EA Sports FC 24, Starfield, Super Mario Bros. Wonder, Resident Evil 4, MLB The Show 24 and Dead Island 2 were the 15 bestselling video games in the U.S. in 2023, according to research company Circana, formerly known as NPD. Another hot title is NBA 2K24, which launched in September 2023.
“Companies over-invested in 2021 as work-from-home drove revenues higher, and the buzz words of the week were Metaverse, VR and esports. However, the buzz words never materialized in a big way, and people went back to work, so growth stalled. With no meaningful growth, overstaffing gave way to layoffs. We’re probably close to ‘right sized’ now and should see a return to growth,” comments Michael Pachter, Managing Director of Equity Research for Wedbush Securities.
“2023 was a brutal year in game development. While every year has its cycle of layoffs and studio closures, 2023 [was] quite different,” says David Johnson, Founder, CEO and Creative Director of Undertone FX, which specializes in real-time visual effects and has worked on Mortal Kombat 1, Diablo IV, Starfield and Star Wars Jedi: Survivor, among other titles. “There have been massive layoffs at a lot of major studios. My speculation is that this is partially a late-stage COVID backlash. Teams had to ramp up quite a bit during COVID because of the time lost due to going remote. Those games have now shipped, and the studios no longer need teams of that size, so a lot of scaling back has occurred – and continues. Funding for projects has gotten much thinner this last year.”
DISNEY/EPIC
One very positive video game development in 2024 was Disney and Fortnite maker Epic Games announcing that they are teaming up to build what’s being described as an “expansive and open games and entertainment universe,” according to the Walt Disney Company. Disney is investing $1.5 billion in an equity stake in Epic.
“In addition to being a world-class games experience and interoperating with Fortnite, the new persistent universe will offer a multitude of opportunities for consumers to play, watch, shop and engage with content, characters, and stories from Disney, Pixar, Marvel, Star Wars, Avatar and more. This will all be powered by Unreal Engine,” according to the Disney site.
“This marks Disney’s biggest entry ever into the world of games and offers significant opportunities for growth and expansion,” Disney CEO Bob Iger said in a statement. “We can’t wait for fans to experience the Disney stories and worlds they love in groundbreaking new ways.”
MOBILE
Mobile gaming is still the biggest single piece of the video game pie. Pachter explains, “It’s growing at a low single-digit rate and poised to grow at a high single-digit rate in the next several years. The biggest developments are Apple’s and Google’s legal and regulatory losses, with courts and regulators requiring competition. That will ultimately result in a reduction of store fees from the current 30% to something lower, which is a direct transfer of profit to developers and will lead to much more investment in both game development and user acquisition.”
FTP
Pachter notes, “FTP is still 85% of all gaming revenue and growing. The big opportunity here is to monetize the non-payers by advertising to them, and we are seeing meaningful growth there. That’s also good for developers.” “Free-to-play is also the standard model now for mobile games, which is a massive market. Candy Crush Saga alone made $472 million in 2023,” Johnson says.
BATTLE ROYALE
For several years now, the hottest gaming genre has been “Battle Royale,” starting with PUBG: Battlegrounds. “Then, several other studios followed the formula. Fortnite, Apex Legends, Call of Duty Warzone and others have been massive hits,” Johnson notes.
GAME PASS
“The biggest innovation is games as a platform – Roblox and Game Pass, with Netflix emulating their success,” Pachter observes. “It’s unclear if the subscription model [Game Pass], pay-as-you-go [Roblox] or ‘free’ with another service [Netflix] will win, or if all three can thrive. It’s early days, but really interesting.”
“Microsoft’s Game Pass has changed the economics of game development, but I think it’s still unclear how this bet is going to pay off for them in the end,” Johnson says. “With Game Pass, users no longer buy games outright; they just have a $15/month subscription and can play any game they like within the Game Pass library. It’s been an incredible deal for users.”
Johnson adds, “Microsoft is paying the developers a commission fee for their game to be part of the Game Pass library. There are then metrics to see how much a game is downloaded and played. [Just as] Netflix and Spotify have disrupted their respective industries, Game Pass is disrupting games. The numbers aren’t entirely public, so it’s hard to know what benefit/detriment this is yielding for developers.”
Microsoft’s library will be further strengthened in the long term by the firm’s acquisition of Activision Blizzard.
REMOTE WORK
Remote work and distributed development have more or less become standard in video game development, “to varying degrees depending on the studio,” Johnson says. “Almost all major studios now have an outsourcing manager to handle vendors. The entire game industry went fully remote over COVID. And while a few studios have been moving back to in-office development, many have embraced it and are now fully remote.”
CLOUD
The impact of the Cloud on gaming “is hard to assess,” Pachter says. “The only company with enough content to thrive is Microsoft, and even their position is difficult because of the meddling of regulators.” Johnson notes, “Cloud gaming has been a promising and exciting area of development in games for quite a long time now, but it seems we’re not there yet. OnLive was the first big attempt to go mass market with it that failed. Google Stadia looked like a promising attempt, but it too was shuttered about a year ago.”
Johnson adds, “I’m quite skeptical about the viability of Cloud gaming for the near future for a number of technical reasons. Reliably getting latency to hold firm under 16ms might open up some possibilities, but we’re nowhere near that right now. How the internet itself is architected might have to change to make Cloud gaming viable. Packets, routing and redundancy make for a fairly solid method of data transmission, but they are at odds with the speeds necessary for it to work well.”
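Johnson’s 16ms figure is roughly one frame at 60fps, and the catch is that network transit is only one slice of that budget. A rough sketch of the end-to-end pipeline, with every component timing an invented illustration rather than a measurement:

```python
# Rough end-to-end latency budget for cloud gaming, in milliseconds.
FRAME_BUDGET_MS = 1000 / 60  # ~16.7ms per frame at 60fps

# Illustrative component timings -- assumptions, not measurements.
pipeline_ms = {
    "input capture + upload": 4.0,
    "server-side render": 8.0,
    "video encode": 4.0,
    "network transit": 15.0,   # even a good round trip dwarfs the budget
    "client decode + display": 6.0,
}

total = sum(pipeline_ms.values())
print(f"frame budget: {FRAME_BUDGET_MS:.1f}ms, pipeline total: {total:.1f}ms")
print(f"over budget by {total - FRAME_BUDGET_MS:.1f}ms")
```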
AI/ML
Some emerging technologies are having a big effect on the making of video games. Johnson comments, “AI is already moving the needle. One of the first areas of development I think we’ll see is ChatGPT-like dialog generated on the fly for NPC characters in games, where a writer no longer has to pre-write every single word that everyone in the world can say. Their job might become more about putting bumpers on the domains the NPCs are constrained to. You don’t want a fantasy blacksmith talking about our latest elections.”
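What those “bumpers” might look like in practice: a minimal sketch in which the writer authors a persona and guardrails rather than every line. `llm_generate` is a hypothetical placeholder for whatever text-generation backend a studio would actually wire in, not a real API, and the NPC data is invented:

```python
def llm_generate(system_prompt: str, player_line: str) -> str:
    """Hypothetical stand-in for a real text-generation call; returns a
    canned line so the sketch runs as-is."""
    return "Aye, strange lights over the old mine these past few nights."

def npc_reply(npc: dict, player_line: str) -> str:
    # The writer authors constraints, not dialogue: persona, what the NPC
    # knows, and the topics it must never wander into.
    system_prompt = (
        f"You are {npc['name']}, a {npc['role']} in a medieval fantasy world. "
        f"Stay in character. You only know about: {', '.join(npc['knowledge'])}. "
        f"Never mention: {', '.join(npc['forbidden'])}. "
        "Reply in one or two short sentences."
    )
    return llm_generate(system_prompt, player_line)

blacksmith = {
    "name": "Harl",
    "role": "village blacksmith",
    "knowledge": ["weapon prices", "local rumors", "the old mine"],
    "forbidden": ["real-world events", "elections", "modern politics"],
}
print(npc_reply(blacksmith, "Heard anything strange lately?"))
```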
Continues Johnson, “Certainly AI and machine learning are making an impact in games already. There’s an amazing indie project in development called Bitmagic that allows the player to type prompts into the video game itself and modify the game in ways the developer never built in. I worry for concept artists, as I suspect that’s one of the first areas of displacement that has already started.
“Style transfer is another AI/machine learning tech that I think may have a huge impact on video games. Imagine if we are no longer rendering detailed scenes and characters. Instead, we are rendering blocky approximations, and the characters are ‘deep faked’ and the environments style-transferred to look like filmed footage,” Johnson says.
“The FX in video games are one area where the real-time constraints really hold us back from making film-quality imagery – the kind of stuff that takes a simulation farm,” Johnson states. “I am hopeful that ML models will allow us to generate coarser simulations, and machine learning will take it the rest of the way at reasonable frame rates.”
DRIVE READS
“A tech [advance] that I don’t think many of us saw coming was the super-fast drive reads that the PS5 and Xbox Series X have and what they can allow,” Johnson explains. “Some of the new Unreal 5 tech is reliant on that, and the amount of detail one can get in a UE5 scene today is pretty amazing. It’s freeing us from having to be entirely loaded into RAM, which is a game-changer. The ‘On the Fly’ level swapping in the latest Ratchet and Clank game was a brilliant usage of that tech, and it showed off something that would have been impossible a generation ago.”
VR/AR/XR GAMES
Johnson is excited about the long-term prospects of XR, which gained momentum from the launch of Apple Vision Pro, a mixed-reality headset that already offers more than 250 games.
“I’m hearing very positive feedback,” he comments. “I can imagine a future, five or 10 generations down the line, where XR is so commonplace on one’s glasses that it becomes a seamless and transparent part of our lives. There are some really cool gaming possibilities with XR once you move it from your phone – like Pokémon GO, cool but not immersive – to your eyes, where you can have [a game scenario of] monsters bursting from the walls in your house. It’s definitely a space to keep your eye on.”
“Apple Vision Pro has already impacted XR overall by validating the market. The bigger question is how much of an impact. We’ll know based on the new providers coming to market within the next year,” notes Tuong Nguyen, Director Analyst at research firm Gartner.
AI and the Cloud should give a huge boost to the quality of VR games and VR, in general. Nguyen explains, “AI enables multiple aspects of VR experiences. For example, AI applications such as computer vision to track eye movements for foveated rendering. Other applications such as text-to-speech, text-to-text [and] sentiment analysis can be applied to interfacing with the VR experience. GenAI can be used to automate the creation of CG content [of procedurally generated objects] that would be impossible or impractical to do manually. The Cloud offloads the rendering, compute and storage of content [thus] enabling richer experiences than would be possible on [just a] device.”
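The foveated rendering Nguyen mentions concentrates shading work where the tracked gaze lands. A simplified sketch of the core decision, with invented thresholds (real headsets expose this through their own variable-rate-shading APIs):

```python
import math

def shading_rate(pixel_dir, gaze_dir, fovea_deg=5.0, mid_deg=15.0):
    """Choose a shading rate from the angle between a pixel's view
    direction and the tracked gaze direction (both unit vectors).
    Thresholds are illustrative, not from any particular headset."""
    dot = max(-1.0, min(1.0, sum(p * g for p, g in zip(pixel_dir, gaze_dir))))
    angle = math.degrees(math.acos(dot))
    if angle < fovea_deg:
        return "1x1 (full rate)"      # fovea: full shading resolution
    if angle < mid_deg:
        return "2x2 (quarter rate)"   # near periphery
    return "4x4 (sixteenth rate)"     # far periphery: barely noticed

print(shading_rate((0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))  # dead center -> full rate
print(shading_rate((1.0, 0.0, 0.0), (0.0, 0.0, 1.0)))  # 90 degrees off -> coarse
```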
THE HORIZON
Looking at the near future of video games, Johnson says, “I’d keep an eye on Nvidia and Epic. They have been at the cutting edge of game technology. [And] keep an eye out for ML-based middleware. I think this is a very exciting time to be in video games. I think we’re in for a lot of advancement in the next decade or two. Being on the cutting-edge of visuals and still having a long way to go means there’s a lot of opportunity to innovate and personally be a part of advancing the industry.”
STREAMING PLATFORM DEMANDS KEEP RAISING THE BAR FOR VFX
By CHRIS McGOWAN
In recent years, the increasing demands of streaming platforms for a higher quality and quantity of visual effects have helped raise the bar for VFX studios and fueled the expectations of viewers.
“The demand for high-quality visual effects in streaming content is indeed on the rise,” comments Charlene Eberle, Raynault VFX Executive Producer and Head of Business Development. “As streaming platforms continue to produce original films, series and other content to attract and retain subscribers, they prioritize delivering visually stunning and immersive experiences. This means investing in advanced VFX to create worlds, creatures and effects that rival or surpass those seen in traditional cinema.”
Much of the increased demand is due to new investments in original content. “The resurgence of mid-budget, genre-specific productions on streaming platforms has been one of the major factors,” comments Mark Hammond, Co-Owner and VFX Supervisor at Herne Hill. “It’s been a positive for the entire industry, as the streamers have shown a strong desire to support the creative vision of their storytellers, resulting in the type of VFX requirements and budgets that were typically reserved for studio feature films in the past.”
TOP: The Witcher (Image courtesy of Netflix)
OPPOSITE TOP: One Piece (Image courtesy of Netflix)
OPPOSITE MIDDLE AND BOTTOM: The Crown (Image courtesy of Netflix)
“Stories are getting more ambitious, taking place in entirely fictional worlds or reimagined versions of our own,” according to Glenn Matchett, Managing Director at Grammatik Agency, an international tech PR and marketing firm. “If you look at shows like The Witcher, there’s an enormous amount of world-building that leans on visual effects to be believable. A show like The Crown takes place in our own reality, but it’s set in the past, so there’s a lot of VFX needed to put contemporary settings back to the way they were. High-quality visual effects work is absolutely fundamental to almost every tentpole program across every streaming service.”
GROWING PROMINENCE
“Streaming platforms aren’t just altering how content is consumed, they’re revolutionizing content creation itself,” says Conrad Allan, Co-Founder and CEO at MattePaint. “Giants like Netflix and Amazon are at the forefront of this shift. Their pipelines are built on modern technologies and efficiencies, and their content production is tailored specifically for streaming. This evolution signifies a broader change in the entertainment landscape.”
Matchett adds, “For us in the U.K., most terrestrial TV channels have streaming counterparts, including the BBC. As internet speeds get faster and fiber goes into more homes, there’s been a significant jump in what’s actually possible for people to watch. Shows are getting longer and more complex, and budgets are often increasing to match expectations. In-home entertainment hasn’t just taken off, it’s deeply rooted in the fabric of our society. The big four platforms – Netflix, Amazon Prime, Disney+ and Apple TV – are in a constant state of competition to score the next big hit, which can mean investment but also caution. So, it’s an interesting space, and it’s great to see quality visual effects taking their place in long-form content as well as feature films.”
“There has been a profound shift in the way that entertainment is consumed, and, as a result, we are witnessing a change in how shows are being made,” says Paul Silcox, VFX Director at Lux Aeterna. “Streaming services seem to be more flexible, more creative and therefore more disruptive, and this leads to great television. At Lux Aeterna, we work closely with our streaming partners from the earliest stages of a project, and we have seen incredible ideas taken into production in recent years.”
GREAT EXPECTATIONS
“At Amazon MGM Studios, we are always raising the bar – and the bar for visual effects in streaming has become quite high,” says Chris Del Conte, Global Head of Visual Effects at Amazon MGM Studios. “Our customers are accustomed to seeing the finest quality visual effects streamed into their homes and onto their devices. This has had a positive impact for our storytellers, allowing them to think big in terms of what they put on screen. To help our creators achieve the goal of delighting our customers, the Amazon MGM Studios VFX teams enter the creative process near the beginning, during the initial script breakdowns. This allows us to initiate early VFX discussions, map out sequences and evaluate methodologies. At this early stage, we also gauge the potential for utilization of innovative tools such as virtual production. This process allows us to engage in in-depth ‘how do we make this cool?’ evaluations. We are then able to lock in the methodologies, budget requirements, vendors and technical expertise needed to raise the bar.”
David Lebensfeld, President of Ingenuity Studios and Ghost VFX, comments, “Streaming allows the storyteller to go into great detail over many episodes rather than the length of a typical feature film. It is helpful that streaming schedules tend to allow for longer timelines, which makes room for higher quality. It is no secret that visual effects artistry elevates visual storytelling as far as the imagination can go. That’s one of the best things about this industry.”
Del Conte notes, “The Boys and its spinoff, Gen V, rely on fantastic VFX to demonstrate all varieties of superpowers that push the limits of the grotesque and hilarious. Thanks to the visual effects and stories surrounding them, our customers now come to each season of The Boys expecting it to outdo the previous one. Advancing technology has allowed the showrunners limitless creativity with the expectation that any idea is possible with VFX.”
“As time goes on, viewers’ taste and ability to judge the quality of any content just gets higher and higher,” comments Lebensfeld. “You’re working hand-in-glove with the viewer on refining their taste, and great content elevates everything. People learn from watching great content and expectations grow. When you think back about movies that you thought were scary or incredibly realistic when you were younger and you rewatch them years later, you can see how far this artistry has evolved.”
The success of high-quality drama shows on streaming demonstrates the audience’s desire for premium production values, according to Silcox. “This comes in the form of brilliant writing, great filmmaking and, quite often, great VFX. Great VFX can be subtle, of course. It’s not all about colliding planets or disintegrating superheroes; it can be the cinematic renditions of the V-E Day celebrations at Piccadilly Circus or any number of historical recreations in The Crown. Personally, I have been blown away by the quality of [recent] episodic series. I loved The Last of Us. While a lot of the success of the show comes down to the superb writing, the post-apocalyptic universe that Ellie and Joel struggle to survive in would not be as rich without the work of the talented VFX teams.”
NEW STANDARDS
Some specific projects have been praised for setting new standards for visual effects in streaming content. Lebensfeld comments, “One that really comes to mind is Stranger Things, which helped define the spectacle standard. Adding to that, The Last of Us, House of the Dragon, One Piece and these types of projects keep pushing the standard. Audiences can’t get enough. Another example is The Night Agent, which is less of a spectacle-style show yet has incorporated high-quality VFX on a more standard-style show for maximum visual impact.” Allan adds, “While The Mandalorian is a household name and somewhat of a benchmark for virtual productions in the streaming arena, The Last of Us recently emerged as a groundbreaking project, earning an Emmy for its VFX achievements. These productions highlighted and elevated the bar of what’s expected of streaming content.” “Some projects whose VFX I have really enjoyed are Monarch: Legacy of Monsters on AppleTV+, The Mandalorian on Disney+ and Lost in Space on Netflix,” says Gaurav Gupta, Co-Founder and CEO of FutureWorks. Some of Silcox’s favorite series for VFX include The Crown, Foundation and the Netflix documentary Life on Our Planet. He points out, “The Last of Us [is] showing again how far episodic television has progressed since the arrival of the streaming platforms.”
VIRTUAL PRODUCTION AND REAL-TIME RENDERING
Virtual production techniques and real-time rendering are becoming increasingly essential in meeting the demands of streaming platforms. Virtual production can produce high-quality content with smaller crews, lighter sets, more creative flexibility and more iteration in the early parts of the creative process, where changes are cheap, according to Steve Sullivan, Chief Product Officer for Arcturus. He notes, “This is important to any producer, but especially to streaming platforms where cost and flow of content are critical to growing their subscriber base.” Silcox notes, “Virtual production continues to offer new ways to create and collaborate, and there is constant innovation and refinement of techniques in both VP and real-time. With hardware manufacturers like Nvidia and software platforms like Epic making seismic improvements to the technological landscape, we will continue to see improvements to the quality and efficiency of rendering three-dimensional environments and FX.”
“Virtual production techniques and real-time rendering are becoming indispensable in the streaming content production ecosystem,” Allan says. “These methods significantly shorten feedback between digital environment creation and directorial shot composition, which in turn improves efficiency and flexibility while helping actors to immerse themselves in the worlds in which they’re performing.”
Of course, at any point in time the streamers are working on many different types of projects, all with different needs. “For shows like The Mandalorian, which relied heavily on virtual production, it’s absolutely vital to their success. Others simply don’t need the level of complexity that this kind of shoot entails,” Matchett explains. “There’s a fine balance to strike between practical shooting, traditional visual effects and emerging technologies like virtual production and AI. Genuine real-time rendering is the golden goose of visual effects, but it still needs a huge amount of computing power to make it work. That being said, platforms like Unreal Engine are lowering the barrier to entry for high-end visual effects and CG work. It’s likely we’ll see more projects adopting this approach in the future.”
STREAMLINING
In terms of implementing these technologies to streamline production processes, “previs and virtual sets have the most visible impact today, with interactive changes and real-time viewing at their core,” comments Sullivan. “As speed and quality improve, these approaches will spread further into asset creation, animation, simulation and lighting. The whole production process will feel more like previs, with quicker iteration in a richer context to support better creative decisions.”
“Streamlining asset prep is where most of the streamlining needs to happen,” says Addy Ghani, Vice President of Virtual Production for disguise. “With real-time rendered content, getting assets from a previs quality to production quality is the biggest hurdle.”
SKILL SETS
Eberle notes that the streaming platforms have had a notable impact on the skill sets required in the VFX workforce. “The demand for high-quality VFX in streaming content has pushed studios and VFX companies like Raynault to continually innovate and refine their techniques. As streaming platforms compete for viewership and subscribers, they invest heavily in original content with top-tier production values, including visual effects. This has led to an increasing need for VFX artists.”
TOOLS
The latest technological advancements in VFX tools and software are aiding studios in meeting the demands of streaming platforms. Matchett notes, “Artificial intelligence is a big one – we’re seeing AI-powered tools being integrated into key software like Nuke and the Adobe suite.”
Technological progress that will help create streamer content better and faster includes “advances in capturing the real world and bringing it into the LED volume through more efficient forms of volumetric capture, photogrammetry and Gaussian splatting,” Ghani says. Explains Silcox, “There are a few technologies which are changing the way we work at Lux Aeterna. USD, a universal language for all of our digital content creation packages, is very exciting and heralds the way for real-time rendering in our studio. Artificial intelligence and machine learning offer the prospect of both refining workflows and bringing new creative possibilities to our artists, and while this technology remains contentious, there are a lot of very promising technologies that could change the way artists work. The introduction of these technologies means that we can work better, smarter and more creatively and therefore offer an even higher standard of work to our clients in the streaming industry.”
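The USD interchange Silcox describes is concrete enough to show: the same scene description can be authored or opened by any package that speaks it. A minimal sketch using Pixar’s open-source USD Python bindings (pip-installable as `usd-core`; the scene contents here are arbitrary):

```python
from pxr import Usd, UsdGeom  # pip install usd-core

# Author a tiny scene that Maya, Houdini, Unreal and others can all open.
stage = Usd.Stage.CreateNew("handoff.usda")
root = UsdGeom.Xform.Define(stage, "/Set")
sphere = UsdGeom.Sphere.Define(stage, "/Set/HeroSphere")
sphere.GetRadiusAttr().Set(2.0)
stage.SetDefaultPrim(root.GetPrim())
stage.GetRootLayer().Save()  # writes human-readable handoff.usda
```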
FUTURE TRENDS
“FutureWorks has experienced a notable surge in the demand for VFX services from streaming partners. This includes both global and local platforms in India,” Gupta says. “Since the inception of streaming platforms, the demand for VFX services has naturally increased, [and] VFX continues to play a pivotal role in storytelling. Even when there has been a drawback in the number of scripted shows being produced, we don’t see the platforms scaling down their ambitions. This can be seen with the latest hits like Percy Jackson on Disney+ and Avatar: The Last Airbender on Netflix.”
“As the industry evolves, we are embracing and working with newer innovative technologies and focusing on how they are introduced into the traditional filmmaker workflow,” Del Conte observes. “Filmmakers and their studio partners are faced with greater and greater visual challenges. Creative opportunities have increased, with more new and diverse stories being told than ever before. In this VFX environment, we need to work together to focus on the exciting creative opportunities and embrace technical advancements. Viewers want compelling stories and characters with cool VFX, so we must continue to raise the bar, innovate and exceed expectations.”
SETTING THE STAGE FOR LED VOLUMES
By TREVOR HOGG
TOP: Autodesk’s Paolo Tamburrino believes that there’s room for software innovation when it comes to making LED volume shoots more accessible. (Image courtesy of Autodesk)
OPPOSITE TOP: StageCraft was utilized by NBCUniversal to promote the 2024 Olympic Games in Paris. (Image courtesy of ILM)
OPPOSITE BOTTOM: Brompton Technology offers features like ShutterSync and Frame Remapping, which expand upon what filmmakers can do with virtual production. (Image courtesy of Brompton Technology)
As with any tool in the film and television industry, the LED volume is being shaped by technological advancements, budgetary requirements and artistic demands, which means the application and capabilities of the virtual production methodology are constantly changing.
“From the outside looking in, it seems that the standardization of the previs process with LED volume production will be integral to broadening adoption of LED volumes,” states Tim Walker, Sr. Product Manager at AJA Video Systems. “Most large LED volume studios develop and leverage at least some of their own tools to fit their needs because there isn’t one cookie-cutter solution. However, once you hit ‘cookie-cutter status’ much wider adoption is possible as the costs and expenses come down.”
Some on-set roles, such as the DIT, have become even more critical within the LED volume. “The DIT is responsible for working with the DP and/or director to ensure what is being captured will meet the demands of final color by monitoring the desired creative look on set for dailies, post-production, visual effects and ultimately final color grade,” Walker notes. “The volume introduces another variable in creating and maintaining that look, which the DIT will ultimately have some responsibility for and has the ability to assist in making any on-set color grading decisions that will help the overall production succeed early in the production process.”
When it comes to whether the software is keeping pace with the hardware development, Paolo Tamburrino, Senior Manager, Strategic Business Development at Autodesk, remarks, “Yes and no. Heavy investments have been made by Epic Games in Unreal Engine. Unreal Engine has been used heavily, primarily for in-camera visual effects [ICVFX], but there is room for more innovation across the software community to make LED volume shoots more accessible.” Autodesk works closely with Epic Games. Says Tamburrino, “Our main collaboration is with Epic’s Unreal Engine and Maya Live Link, which allows artists to create assets in Maya and visualize them in real-time in Unreal Engine.” The emergence of Virtual Art Department services has reshaped the visual effects process and the types of stories being told. Explains Tamburrino, “Firstly, it can help screenwriters understand the worlds they’re writing for, how actors will interact with them and enhance the creative ideation of a film. Secondly, visual effects and art departments are involved at much earlier stages of pre-production, basically when the concept and the ideation starts, and that helps filmmakers have visual reference to tell better stories. In cases where content is captured on greenscreen, it’s very difficult even for established screenwriters, directors and DPs, not to mention all the other departments working traditionally on the film set, to imagine what the world might look like. Finally, shooting on an LED volume can give creative control back to the filmmakers, which has been lost to some degree with the traditional process in the 12-plus months of post-production/VFX after principal photography, where a film’s creative process often ends up out of the filmmakers’ hands.”

“We have a teaming agreement where we can use the stage and [Dark Slope studio] can use Houdini. We want to push Houdini to be more real-time through this partnership, and we’re going to learn about production that we’re only touching with our technologies.”
—Kim Davidson, President/CEO, SideFX, and VES Board Chair

Advertising is an enormous growth area for virtual production. (Image courtesy of 80six)

An example of accelerated volumetric video by Arcturus. (Image courtesy of Arcturus)
“The big thing that has changed is how easy it is to do this at all,” observes Adrian Jeakins, Director of Engineering at Brompton Technology. “Back on The Mandalorian Season 1, it was the industry’s brightest minds figuring out how to get all of this to work, and there were no YouTube tutorials to follow. Now, the barrier to entry is much lower, which is amazing. For Brompton, as the LED part of the process, we’ve seen a rapid evolution in the amount of flexibility and creative control, from those early volumes to now. Initially, there was just a narrow range of settings that would work, and anything beyond those would break. Furthermore, you could only have one camera. We now have features like ShutterSync and Frame Remapping, which enable filmmakers to do so much more with virtual production.”
LED volumes excel in capturing certain shooting scenarios. Jeakins adds, “Virtual production is indisputably the best method for driving shots that are challenging on greenscreen because you have reflections everywhere. Again and again, we’ve seen filmmakers try virtual production for driving shots, and they love the experience. Generally, virtual production makes a lot of sense for episodic television because you’re pushing for feature film quality but don’t have a feature film budget. And you can easily reuse the same environments throughout a season or more. I’d say one of the biggest areas for growth is in traditional broadcast studios. Virtual production greatly improves the experience for talent. Advertising is also an enormous growth area. Advertising is a fast-paced business, and the creatives have different requirements in terms of photorealism, so they can take advantage of the flexibility of this technology.”
Among the technological advances is the ability to populate LED walls with volumetric video characters and crowds. “It’s exciting to see people realize what LED walls are capable of on an ongoing basis,” states Piotr Uzarowicz, Head of Partnerships and Marketing at Arcturus. “LED walls were limited in that the only way to put people on them was as two-dimensional characters. If the camera moved, it destroyed the illusion of a person. But now that the characters on that wall can be volumetric and three-dimensional, it allows for the camera to shift and still represent that person on the LED wall in a truthful way. If you put them as secondary, background or distant characters, you won’t be able to tell the difference between a volumetric character and a real person.” To assist with the rendering, not everything is treated with the same level of detail. Uzarowicz says, “The software automatically renders characters that are closer to the camera at a higher resolution than the ones which are further away. Depending on how the camera moves, the characters are reanimated at higher or lower resolutions.” Scenes can be treated more realistically. Uzarowicz adds, “For example, where it might have been complicated to reproduce a city street with a café in the background and our characters on the corner, that can now be reproduced realistically with an LED wall because we can populate that scene as if it were Manhattan. Lots of people running around back and forth, and we don’t need to use extras, which makes it more affordable for productions coming into that space, and more manageable in that you don’t need as large a crew to deal with hundreds of extras. You can manage all of those things in pre-production and have that realism for your principal actors during production so they can see the environment around them and have something to respond to.”
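Arcturus has not published its renderer here, but the distance-based detail selection Uzarowicz describes can be illustrated generically. A minimal Python sketch, with hypothetical class names and distance bands:

    from dataclasses import dataclass

    @dataclass
    class VolumetricCharacter:
        name: str
        distance_to_camera: float  # meters, updated as the camera moves
        resolution: str = "low"

    # Hypothetical bands: closer characters get denser reconstructions.
    LOD_BANDS = [(10.0, "high"), (30.0, "medium"), (float("inf"), "low")]

    def assign_resolutions(characters):
        # Re-evaluated every frame, so characters are "reanimated" at
        # higher or lower resolutions as the camera shifts.
        for c in characters:
            for max_dist, res in LOD_BANDS:
                if c.distance_to_camera <= max_dist:
                    c.resolution = res
                    break

    crowd = [VolumetricCharacter("pedestrian_01", 4.0),
             VolumetricCharacter("pedestrian_02", 55.0)]
    assign_resolutions(crowd)
    for c in crowd:
        print(c.name, c.resolution)  # pedestrian_01 high, pedestrian_02 low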
“AI-generated content will be huge in this area,” believes Ella Kennea, Production Manager at Dimension Studio. “We’re already seeing huge strides in upscaling technology, but we’re also excited about the progress in generative AI in quickly building assets and responding to creative feedback.” All departments need to become familiar with the technology. “It can’t be stressed enough that education and more hands-on experience for production heads of departments is crucial to the success of virtual production,” remarks James Franklin, Virtual Production Supervisor at Dimension Studio. “Also, education specifically around the content creation process. Directors and DPs have an enormous amount of creative opportunity there; they should definitely be getting involved earlier!” Some are already educating themselves. “We’re seeing DPs, DITs and directors using bespoke tools on set, which enables them to directly make the changes they want,” Franklin notes. “Developments to Unreal Engine allow us to use high poly-count assets, which is great for quick LiDAR scans. And as the industry matures, procedures are becoming more established. There is now much more experience on set and less of the unknown.” The spotlight should not be solely on LED volumes, though. “Virtual production is an integrated production pipeline that starts in pre-production and carries through to post,” Franklin states. “We can previsualize scenes, more effectively plan shoots and provide savings with the data we capture. Everything that we create digitally is useful throughout the pipeline. So far, we’ve only scratched the surface. There are many more opportunities waiting to be discovered.”

A test pattern in a Magicbox Superstudio. (Image courtesy of Magicbox)
Virtual production-focused studio Dark Slope and SideFX are collaborating to launch an advanced virtual production smart stage in Toronto that will consist of a 75-inch volume and an AI-enabled pipeline. “We have a teaming agreement where we can use the stage and they can use Houdini. We want to push Houdini to be more real-time through this partnership, and we’re going to learn about production that we’re only touching with our technologies,” remarks Kim Davidson, President and CEO of SideFX and current VES Board Chair. “Epic Games is an investor in SideFX, and we have a great relationship with them. But it’s another thing to have a company in Toronto to go to and see how Houdini performs in real-time or in conjunction with Unreal Engine, because they’re Unreal Engine experts there. Virtual production is not going to be a bit part of the entertainment industry, but we’re always trending on the cutting-edge stuff, so for us it will be a case study for how virtual production plays out, and then you can usually trickle some of those technologies down to the mainstream, and mainstream still means high-end for us.”
“LED walls are only as good as the people and software that run them,” states Raja Khanna, Co-Founder and Executive Chair at Dark Slope. “2023-2024 is a tipping point, as this tech has now been around for a number of years. It has been used, there are experts, new workflows and pipelines, and software like Unreal Engine has released updates focused specifically on virtual production, which is essentially the art of using these types of walls and stages. All of that has combined to bring us to the point where we can finally deliver at a lower cost while producing the same quality.” The talent pool has dramatically increased. Khanna explains, “If we were talking three years ago, everybody was building a wall, but there was no one to run them. Today, almost every media college around has a virtual production program, and they are graduating people who understand this technology, Unreal Engine and the workflows – maybe not as well as they will in a few years; however, version 1.0 of all those academic training programs has been released.”
Autodesk assisted with the LED stage shoot for Season 1 of Star Trek:
With Dimension Studios, Arcturus makes use of an open system that accommodates a variety of cameras, hardware and servers. (Image courtesy of Arcturus)
What remains a major strength of the LED volume is the ability to remove the guesswork for the cast and crew. “One of the biggest things is being able to be fully immersed in an environment as they’re telling the story and being able to be inspired by what that space looks like, rather than walking into a greenscreen environment and working with your actors to imagine things,” observes Shivani Jhaveri, Virtual Production Producer at ILM. “A lot of our filmmakers have come back with the feedback that it helps to enhance the performances of the actors and allows our DPs to be fully immersed in the process as well from the get-go in the early stages. On every show, people are finding more use cases for it, and what makes StageCraft special is that we’ve been doing this for a number of years. One of the key pieces when we’re introducing new filmmakers to the process is, just as much as we tell them how to use the technology, we are also able to show them examples of where it might not have worked as well in the past. It’s never a one-size-fits-all scenario. More often than not, we’re actually seeing filmmakers use virtual production in their pre-production phase now.”
“You want to embrace the things where you’re leaning into the efficiencies of virtual production. So, being able to scout locations and visual effects things, to be able to mock up sets quickly to look at a design plan, then start lensing it with a camera and say, ‘Oh, wait. If we don’t build that side of the building, we’re going to be shooting into the backside of it the entire time’ [is an important benefit],” remarks Chris Bannister, Executive Producer, Virtual Production at ILM. “Sometimes, you’re doing things with virtual production that are pushing the technology to its absolute limits, and other times it’s figuring out what are the efficient tools and how do you tap into them. For all of the tools, the people are the most important part of any successful virtual production – people who understand the tools and are able to collaborate with the creatives to say, ‘Here is how we can be useful.’ The thing is being able to embrace it less as one [solution] and more about what are the pieces that you can plug into. Similarly, what we want to embrace on LED volumes is that you can shoot lots of takes, shoot in several directions, get a lot of freedom in the edit and be able to change things later that you might not have if you shot large sequences on greenscreen. It’s not like we’re not using practical instruments, but it allows a lot of flexibility for the DP to audition ideas and work with those tools.”
Democratization is taking place when it comes to access to virtual production technology. “When I say democratization, I mean taking that technology outside of Hollywood and making it accessible anywhere,” states Brian T. Nowac, Founder and CEO of Magicbox. “The price point is in many cases 1/10 or 1/100 of the price to build or rent these large facilities, making it accessible to almost any level of production. Our starting price point is $10,000 a day. It costs that much to rent a restaurant to do a shoot in. We’re looking to change the usability from the extremely complex system it began as into a more user-friendly and recognizable toolset, so that almost anyone who is working in this industry can come in, sit down and start using our virtual production solution almost instantly. In making this technology mobile and accessible at this price point, we’re opening up new markets. We’re not specific to motion picture or video production anymore. We’re now being used as an experiential, event and educational device. In terms of lingo, that is going to vary for us based on the industry that our solution is being used by. If it’s the motion picture industry, obviously we want to know that vernacular and on-set production etiquette with the people who are operating our products. But it is also important to note that we’re not ILM. We’re not providing the LED volume and VAD [Virtual Art Department]. Our solution is more like a camera solution. We’re a piece of equipment with an operator right now, but eventually that will change. There is a certain learning curve that needs to happen. After that learning curve is achieved, you might go to a lot, pick up a Magicbox yourself and drive off with it, or call up or use the app and have one delivered to you, and you’ll know how to use it.”
Rather than clients having to go to a virtual production studio, Magicbox offers a mobile solution that goes to them. (Image courtesy of Magicbox)
Magicbox is about 1/10th the price to build or 1/100th the price to rent a virtual production studio. (Image courtesy of Magicbox)
KUGALI AND DISNEY: BRINGING AFRICAN STORIES TO THE WORLD WITH IWÁJÚ
By CHRIS McGOWAN
When Walt Disney Animation Studios teamed with Kugali Media earlier this year to launch the animated series Iwájú on Disney+, it debuted the first Nigerian animated TV series. In addition, “Iwájú is important for Disney Animation because it is not only our first-ever long-form series, but it’s also the first time we have collaborated with an outside creative company. It’s giving Africans a voice and a means through animation to tell their own stories on the highest platform across the world,” Producer Christina Chen states.
Iwájú is a sci-fi coming-of-age tale set in a futuristic Lagos, Nigeria. The series centers around a young girl named Tola, who is the daughter of a renowned tech inventor; she lives in a mansion on the island in the more affluent part of Lagos. Tola befriends Kole, a young boy, who is a self-taught tech wizard from the poor mainland. Things get complicated when Bode, a crime boss who runs a kidnapping ring, decides to make Tola his next target. Otin, an overly protective, little orange robot lizard, helps Tola along the way.
The series “is meant to be purely Nigerian but with a sci-fi twist as the story is set in the future,” says director and co-screenwriter Olufikayo “Ziki” Adeola (aka Ziki Nelson). “The main themes of the story explore inequality, innocence [and] the relationship bonds that hold us and break us.” “Unlike Black Panther, which is set in a purely fictionalized location, Iwájú takes place in a realistic place that has been futurized,” Ziki notes.
The three principals of Kugali – director “Ziki” Adeola, production designer Hamid Ibrahim and cultural consultant Toluwalakin “Tolu” Olowofoyeku – take viewers on a vivid journey into a singular world enlivened by visual elements and technological advancements inspired by the spirit of Lagos. The screenplay was co-written by Ziki and Halima Hudson.
How Kugali Media reached this point is a fantastic story in itself. It involves a tiny pan-African upstart firm challenging Disney in a BBC interview and eliciting an unexpected reaction that would propel them to a deal with Disney Animation. It all started with Tolu and Ziki, who had known each other since primary school in Lagos. Ziki comments, “Growing up, I loved all forms of media from music to comic books. I started off watching cartoons based on Marvel and DC characters. Spider-Man was my favorite. As I approached my teens, I discovered anime and that became my favorite form of media, particularly Dragon Ball Z and Naruto.” Later, Ziki moved to London and attended college. His and Tolu’s partner Hamid grew up in Uganda and Kenya and also went to college in Hertfordshire, England. Hamid worked in visual effects and was a rigging technical director at MPC on The Predator, Dolittle, Dumbo and The Lion King.
Kugali’s roots lay in a weekly podcast hosted by Ziki and Tolu called “Tao of Otaku,” which started in 2015 and focused on comics, manga, anime and sci-fi. This led to Ziki and Tolu seeking out African comics and anime. They noticed that something was missing from the media they consumed. “We’re from Nigeria, we’re Nigerian. Everything we watch is American and Japanese. Where’s the Nigerian content?” Tolu recalled in the ABC News documentary Iwájú: A Day Ahead, which was released with Iwájú on February 28.
Ziki attended Lagos Comic Con and “saw a small group of creators making comics that were inspired by African mythology and folklore. It was a lightbulb moment. ‘Okay! This is new, this is original, and here’s an opportunity.’ What was needed was a company that could be a platform,” he said in A Day Ahead. Tolu added, “We decided we’d transition [to] being a publisher. And put [others’] work together with our own work. The quality of the work is world-class. It’s good quality by anybody’s standards anywhere in the world.”

“[I] saw a small group of creators making comics that were inspired by African mythology and folklore. It was a lightbulb moment. 'Okay! This is new, this is original, and here’s an opportunity.' What was needed was a company that could be a platform.”
—Olufikayo “Ziki” Adeola, Co-Screenwriter/Director, Kugali Media
According to Producer Chen, the next year Ziki and Tolu decided to “rebrand and expand their podcast to YouTube and create a website exposing the world to manga, video games, animation and short films from all over Africa. This is when the name Kugali was born. It was derived from the Swahili word ‘kujali,’ meaning ‘to care.’” Ziki explains, “The goal of the podcast was to create a space where we could discuss stories inspired by African culture across multiple mediums. The approach was always from the perspective of a fan, but it was also important to create a space where listeners could learn more about the various projects we covered.”
Ziki continues, “For me, it was a way to overcome my colonial mentality. This is where people from countries that once existed as European colonies perceive their culture to be inferior to that of Western culture. Therefore, I wanted to show that African stories had the potential to be as good as stories from anywhere else in the world.”
One aspect of this was publishing the pan-African comic anthology Kugali Anthology in 2017 with the help of Kickstarter. Ziki comments, “Kugali initially focused on comic books. This was down to my own personal connection to comics as a medium. Furthermore, it is possible to make a comic book with just one person as opposed to a movie which requires dozens of people. Therefore, as a small startup, comics made a lot of sense.”
Meanwhile, Hamid had been researching and speaking to several African visual entertainment companies and he had come across “Tao of Otaku.” He became interested and reached out to Ziki to see if they wanted to work together. Hamid brought his animation and visual effects expertise to the firm and started to work part-time for Kugali, joining it full-time in 2018. That year, Kugali Media was officially founded. Ziki comments, “Kugali is a company based online, employing people who work from different parts of the world; it was first registered in London and then Nigeria.”
According to Ziki, “Our goal is to use art and animation to create world-class African stories and new opportunities for African artists.”
In terms of African storytelling, Ziki says, “There isn’t a singular approach. Africa is a massive continent with dozens of countries, and within those countries, there are a whole host of various cultures. Our main thing is authenticity. Any story that we tell must be authentic to the culture it represents, and there has to be a feeling that it brings something new to the table that hasn’t been seen before. That means we take our research process very seriously and try to work with creatives from all over the continent and the diaspora.”
Kugali’s timing was good – Disney was looking to produce a wider variety of original content for its new streaming service, Disney+. Iwájú Producer Christina Chen recalls, “Jennifer Lee, Chief Creative Officer at Walt Disney Animation Studios, saw a BBC article online about this African comic book startup company called Kugali.” It was 2018. “Hamid Ibrahim, one of the founders, talked about wanting to ‘kick Disney’s ass’ in Africa, and Jenn met the challenge with open arms and reached out to them. That fateful quote launched it all and was the beginning of everything.”
At the time, Hamid was sharing a flat with Ziki in London. The two were sitting in the living room when an executive from Disney Animation reached out. In A Day Ahead, Hamid remembered, “Just imagine you’re sitting there. A young creative, right? You just called out the granddaddy of visual entertainment. You just said you’re going to kick their ass. Then somebody reaches out and says ‘hi’ and wants to talk.” It took a while for Ziki and Hamid to process the event and believe Disney had actually called.
“I was skeptical. At the time we had met with a handful of people claiming to be connected to big names in Hollywood, but nothing ever came of these discussions, so I took it with a pinch of salt,” Ziki recalls. Disney met with Kugali and was intrigued. Chen comments, “It was more than appeal and more than the freshness of their ideas and stories. The founders of Kugali are visionaries. They had a vision for the boundless potential of African stories and African storytellers and built a platform with their dedication, brilliance and profound passion. I was, and still am, inspired by them, and I consistently learn from them both professionally and personally. When I first heard of the project and first met them, I knew that this project and this partnership would be unlike anything our studio has ever done and unlike anything I would ever experience again.”
According to Chen, once Iwájú was greenlit, the founders of Kugali teamed up with Disney Animation to bring the series to life. The work was shared by Disney Animation in Burbank, Kugali in Lagos and London, as well as Cinesite in Montreal and Vancouver. Storyboards and visual development were supervised and created in Burbank.
Teaming up with Disney was an education. “Through our collaboration with Disney, we’ve had an opportunity to learn directly from the best in the business,” Ziki comments. “Apart from the sheer inspiration one gets from working with some of the most talented names in the industry, it has also been very informative from a technical perspective. We’ve learned a lot about Disney’s systems and processes which will allow us to improve our operation.”
Iwájú means ‘the future’ in Yoruba, one of Nigeria’s main languages. “The animation style takes its inspiration from the vibrancy of Lagos. The environment has a painterly aspect, meant to be a canvas to bring out the energy of Lagos and create a palette of colors and movement,” Chen adds. “Lagos is an important cultural and financial center in Africa and therefore plays a significant part as a visual and creative gateway to the rest of the world. Due to the rapid evolution of technology and the resulting juxtaposition of class all within this city, it provides an opportunity to use science fiction as a foundation to tell a deeply human story.”
The Kugali team never lacked self-confidence. “I didn’t know if it would be Disney specifically, but I did believe one day we would get to a point where we would be collaborating with a studio of this caliber. Of course, I didn’t expect it would come so soon! This has been the biggest opportunity of my career; my family is proud and so is my country. It doesn’t get much better than that,” Ziki says.
An IMAX showing of Iwájú, produced by Walt Disney Animation Studios and Kugali Media, at the premiere in Lagos on February 27.
Series Composer Ré Olunuga speaking at the Iwájú premiere with “Tolu” Olowofoyeku, “Ziki” Adeola and others.
Optimization and Delivery
By CHRIS SWIATEK, ICVR
Edited for this publication by Jeffrey A. Okun, VES
Abstracted from The VES Handbook of Virtual Production
Edited by Susan Zwerman, VES and Jeffrey A. Okun, VES
Introduction
The Virtual Art Department (VAD) production period concludes with final optimization, stage testing and delivery of the content to the stage before shooting.
This period begins with delivery of the VAD content to the stage, after which it is thoroughly tested in the target environment. Any remaining issues are identified, resolved, re-tested and approved before shooting begins.
The specifics of this process vary depending on the nature of the production and how closely the VAD and stage operations team are connected. On some productions these responsibilities are handled by the same vendor, while other times they may operate as distinctly separate teams. Regardless of team structure, the delivery and testing period should be a collaborative process between the stage operations and VAD teams and requires efficient two-way exchange of communication and data.
Asset Delivery
Content is first delivered to the stage, then loaded onto the target hardware and tested before shooting begins. Game engine assets are managed using source control software, which stores files on a central server and enables collaboration between large teams without overwriting work.
Because the testing stage should be a collaboration between the VAD team and stage operations team, keeping the project hosted on source control is critical to ensuring a smooth two-way exchange of data. The stage team receives the content, tests it on the wall and makes initial tweaks to fix any obvious issues. The stage team can then push these changes back to the server so they are instantly received by the VAD team. Subsequent changes can then be made by the VAD team without fear of overwriting the initial changes made by the stage team. Without source control, any changes made by either team would have to be manually merged, introducing a higher likelihood of error and slowing down the testing process.
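The handbook does not mandate a particular source control tool, but Perforce is a common choice for game engine projects. A minimal sketch of the two-way exchange in Python (the depot paths and change description are hypothetical):

    import subprocess

    def p4(*args):
        # Thin wrapper around the Perforce command-line client.
        subprocess.run(["p4", *args], check=True)

    # Stage team: pull the delivered VAD content from the central server.
    p4("sync", "//depot/vad_content/...")

    # During testing, check out and tweak a problem asset...
    p4("edit", "//depot/vad_content/Maps/MainStage.umap")
    # ...(fix applied in the editor)...

    # ...then push the fix back so the VAD team receives it instantly and
    # can keep working without manually merging anyone's changes.
    p4("submit", "-d", "Stage test: reduced emissive intensity on wall set")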
Delivery of content to the stage without the use of source control is only workable in situations where the VAD team is not involved at all in testing or optimization and performs a strict linear handoff. Even then, this scenario is not recommended, as it is always more efficient for the team that originally created the content to also perform optimization and technical fixes.
Stage Testing
Once the content is delivered to the stage operations team, it is tested on stage hardware in the final shooting environment. The content is benchmarked and stress-tested to ensure it hits the target frame rate. For an LED shoot, the visual quality of the content is analyzed on the wall and through a camera lens. Any performance issues or visual issues are recorded and documented so they can be fixed before the shoot. In the interest of efficiency, the stage operations team may immediately fix simple issues that come up during testing, while more in-depth issues are relayed back to the VAD team.
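The benchmark itself can be as simple as checking captured frame times against the frame budget. A sketch in Python with illustrative numbers (at 24 fps the budget is 1000/24 ≈ 41.7 ms per frame):

    TARGET_FPS = 24
    BUDGET_MS = 1000.0 / TARGET_FPS  # ~41.7 ms per frame

    # Per-frame render times measured on the wall during a stress test.
    frame_times_ms = [38.2, 40.1, 39.5, 44.9, 39.0]

    worst = max(frame_times_ms)
    over_budget = [t for t in frame_times_ms if t > BUDGET_MS]
    print(f"worst frame: {worst:.1f} ms (budget {BUDGET_MS:.1f} ms)")
    print(f"frames over budget: {len(over_budget)} of {len(frame_times_ms)}")
    if over_budget:
        print("FAIL: document the issue and relay it to the VAD team")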
Optimizing for Target Hardware
By the time content is delivered to the stage, it should already be well optimized for the target hardware (see “Managing Asset Quality and Performance Needs for Virtual Production”). Even with careful planning and testing, unexpected performance issues sometimes appear during stage testing that were not obvious in the VAD test environment.
Many issues can be narrowed down quickly by using Unreal Engine’s optimization View Modes to identify problem areas or using Multi-user Editor on stage to turn scene objects on and off until the culprit object(s) is found. These techniques are helpful for revealing simple problems with straightforward fixes, like overlapping transparent textures, too many emissive light sources or too much foliage. If these techniques do not pinpoint the problem, scene profiling can be used to identify bottlenecks.
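The on/off hunt is effectively a binary search over the scene. A generic sketch of the idea (the scene objects and measurement function here are hypothetical stand-ins for what an operator does by hand in the Multi-user Editor):

    def find_culprit(objects, measure_frame_time_ms, budget_ms):
        # Repeatedly show only half the suspects; keep whichever half
        # still blows the frame budget.
        while len(objects) > 1:
            half = objects[: len(objects) // 2]
            for obj in objects:
                obj.set_visible(obj in half)
            if measure_frame_time_ms() > budget_ms:
                objects = half                  # culprit is visible
            else:
                objects = objects[len(half):]   # culprit is hidden
        return objects[0] if objects else None

    # Tiny fake scene to show the mechanics:
    class FakeObject:
        def __init__(self, name, cost_ms):
            self.name, self.cost_ms, self.visible = name, cost_ms, True
        def set_visible(self, v):
            self.visible = v

    scene = [FakeObject("rocks", 2.0), FakeObject("foliage", 30.0),
             FakeObject("cafe", 3.0), FakeObject("crowd", 4.0)]
    measure = lambda: sum(o.cost_ms for o in scene if o.visible)
    print(find_culprit(list(scene), measure, budget_ms=20.0).name)  # foliage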
Members receive a 20% discount with code VVP23. Order yours today! https://bit.ly/VES_VPHandbook
VES France Revisited: Thriving VFX Hub in the City of Light
By NAOMI GOLDMAN
The Visual Effects Society’s worldwide presence gets stronger every year – and so much of that is because of our regional VFX communities and their work to advance the Society and bring people together. Since it was established as the 13th Section in 2018, VES France has grown into a vibrant regional community, now boasting more than 200 members.
“The arts are an integral part of French culture, cinema in particular, and I’m proud that the region is also becoming a hub for visual effects,” said Sophie Leclerc, Co-Chair of the VES France Section and Executive Producer at Rodeo FX Paris. “We are experiencing this growth because of the presence of a very talented pool of VFX artists in France and across Europe, many extremely good VFX and animation schools training new artists, and also enticing tax rebates for foreign productions, whether filmed in France or VFX-only projects.”
“Amid the recent [actors and writers] strikes, a lot of French talent working abroad decided to come home to France,” said Sebastián Eyherabide, Co-Chair of the VES France Section and VFX Executive Producer at Mathematic Studio. “Because we have a strong local industry, whether advertising or fictional cinema, we are not feeling the slowdown as much as some other places.”
The hub of visual effects work in Paris continues to develop, thanks to schools in both the north and south of France. VFX and animation companies are establishing branches in several cities, including Montpellier and Valence. “The cost of living is cheaper outside of Paris, and because the remote or hybrid work model that was integrated into our lives during COVID is sticking, that revolution is allowing our industry to expand across the region,” said Eyherabide.
In concert with the growth of the local VFX industry, interest in the VES as a professional organization dedicated to practitioners has also increased. The Section’s membership includes 2D and 3D artists, designers and producers, many of whom have built strong ties working for different companies across France. While the Section had a majority of members working in advertising at its inception, there has been a surge of people from film, episodic and animation in the last 3-4 years. Both Co-Chairs stated that it is a priority to recruit more women and diversify their membership, with Leclerc noting, “We hope to reach gender parity in the next few years.”
VES France hosts a robust roster of film screenings and Q&As for members and guests, thanks to the TSF Group, which holds the events in their projection room in the La Plaine Saint-Denis business district. The Section hosts regular pub nights and an annual BBQ to bring together existing and prospective members. This past Fall, they combined the BBQ and recruiting drive at a festive event on a glowing, chartered boat on the Seine River – which is still getting rave reviews. The Section held a celebratory gathering for the first World VFX Day in December, when it presented awards to Head of Studio Paris at Rodeo FX/VFX Supervisor Franck Lambertz, CEO and Founder of Happy Flamingos Studio/VFX Supervisor Josselin Mahot and Vice President France VFX/VFX Supervisor-Producer Roxane Fechner. The group also hosted a recent outing to view a special exhibition of filmmaker James Cameron’s work at La Cinémathèque Française.

“It’s nice to have a community where tech and artistry meet. To be around people with the same passions and interest in a professional field that is always changing and expanding – it’s fantastic to be able to grow together.”
—Sophie Leclerc, Co-Chair, VES France
Every year, VES France is a proud sponsor of the Paris Images Digital Summit (PIDS), an event created by the Centre des arts d’Enghien-les-Bains and dedicated to visual effects, where the creative, technical and economic issues of the constantly changing sector intersect.
“Each year before the PIDS masterclass – which was last taught by award-winning VFX Supervisor Guillaume Rocheron – we have the opportunity to talk about the VES to this large group of VFX professionals and students,” said Eyherabide. “This is always a great opportunity for us to raise awareness about the VES and continue to enrich our community. We want to sponsor more events to heighten our profile with the future goal of ‘when you think about VFX in France, you think about the VES.’”
VES France has a highly engaged membership and group of leaders who contribute to many programs and events. The Section has shot three video interviews for the VES Luminaries Series with VFX Supervisors Christian Guillon, Rodolphe Chabrier and Eve Ramboz. Section Board of Managers member Alain Boutillier worked on a resource project outlining the definitions of VFX jobs (in French and English), as a tool to help increase understanding and respect for the craft and build bridges with directors, cinematographers and other allied professionals. The Section also hosted an in-person VES Awards nominations event, which further cemented camaraderie among the members.
“I’m really attached to the VES Awards,” said Eyherabide. “It is a privileged and intimate moment to see the breakdowns of the work and understand that even big companies and experienced artists have those challenges to find solutions and innovate in every domain.”
Said Leclerc, “It’s nice to have a community where tech and artistry meet. To be around people with the same passions and interest in a professional field that is always changing and expanding – it’s fantastic to be able to grow together.”
“I still consider us a young Section, and I’m happy that we are getting recognized by our peers for all that we are doing to enhance our community,” Eyherabide added. “Across COVID, strikes and the AI revolution, the VES fills a space and connects people to talk and share support and ideas so that no one feels isolated. It is a beautiful dynamic that I’m proud to be a part of.”
VES Takes the Stage at FMX
By NAOMI GOLDMAN
VES Board Chair Kim Davidson and Executive Director Nancy Ward co-hosted an interactive audience discussion at FMX with Jonas Ussing, VFX Supervisor at Space Office VFX, on “‘No CGI’ is Really Just Invisible CGI.”
In April, the VES continued its long partnership with FMX by participating in the 28th edition of FMX [Film & Media Exchange] in Stuttgart, Germany, one of the premier international conferences dedicated to animation, effects, interactive and immersive media.
VES Board Chair Kim Davidson and Executive Director Nancy Ward co-hosted an interactive audience discussion with Jonas Ussing, VFX Supervisor at Space Office VFX on “‘No CGI’ is Really Just Invisible CGI” – named for Ussing’s popular video exposé series that highlights the industry practice of minimizing the pivotal contributions of visual effects artistry. With the intention of giving VFX pros the recognition and credit they deserve, the discussion delved into the root causes of today’s CGI backlash and steps that the VFX industry can take to elevate visibility and respect for the craft and its practitioners.
The second VES panel featured The VES Handbook of Virtual Production, the acclaimed guide to virtual production techniques and best practices, in a discussion on “The State of VFX Virtual Production.” Using the comprehensive reference book as a base, a panel of virtual production experts brought to life several chapters with real-life production examples. Thank you to Christina Caspers-Roemer, General Manager/Managing Director, TRIXTER GmbH; Nils Pauwels, Director of Virtual Production, Ready Set Studios; Markus Trautmann, Managing Director, Sehsucht; and Tobias Stärk, Creative Technologist on VFX/XR/Virtual Production for this dynamic forum.
VES Hosts Global Webcast on the Potential Impact of AI
Machine Learning has been used for years to enable new capabilities, make VFX artists’ lives easier and achieve better visuals. Now that Generative Artificial Intelligence (GenAI) is quickly becoming more powerful, what will the future hold for visual effects practitioners?
The Visual Effects Society’s Technology Committee delved into the implications of GenAI in the VFX industry in the dynamic webcast, “How Generative AI Might Affect Visual Effects Now and In the Future.”
A group of forward-thinking professionals convened for a discussion on how to navigate the landscape of Machine Learning and AI solutions, how the industry is likely to change in the coming years and how to adapt and grow in your career. This interactive forum was the first in a planned series of conversations addressing this rapidly evolving field and framing the discourse around its projected impact.
Thank you to our esteemed panel of experts for sharing your insights: VES 1st Vice Chair and VFX, Post & Technology Recruiter Susan O’Neal; author and Distinguished Research Scientist in DL/ML & CG at Wētā FX Dr. Andrew Glassner; VES Technology Committee member and CTO at Cinesite Group Michele Sciolette; shareholder & Co-Chair of Buchalter’s Entertainment Industry Group and Adjunct Professor at Southwestern Law School Stephen Strauss; and forum moderator, VES Technology Committee member and Media & Entertainment Executive, CTO & Industry Advisor Barbara Ford Grant.
VFX Evolution Swings Into Action on Planet of the Apes
The current issue of VFX Voice takes a deep dive into the new Kingdom of the Planet of the Apes, directed by Wes Ball, where the VFX progression of the ape characters takes an astonishing leap. The recent film also conjures up memories of the original 1968 Planet of the Apes featuring Charlton Heston and Roddy McDowall, adapted from the 1963 science fiction novel by French author Pierre Boulle. While that movie may look somewhat primitive through 2024 eyes, it was groundbreaking in its day. The apes in the early film were all actors in prosthetic makeup, thanks to the work of another of cinema’s major unsung heroes, John Chambers. The makeup work was so dramatic at the time that Chambers received an Honorary Academy Award from the Academy of Motion Picture Arts and Sciences in 1969 – before there was a category for makeup. Born and raised in Chicago, Chambers was first a commercial artist, earning a living crafting carpets and jewelry. During World War II, he became a dental technician and found himself making prosthetic limbs for wounded soldiers at a veterans’ hospital. He also helped reconstruct facial scarring, work through which he became a master of the human face. Later, he found himself at NBC-TV as a makeup man. That led to filmmaking and a truly unsung role in makeup and prosthetics. He created the iconic ‘ears’ of the Vulcan Spock (Leonard Nimoy) in the original Star Trek TV series. Chambers’ other work included The Munsters, The Outer Limits, Cat Ballou and A Man Called Horse. He is also credited with the development of the Cornelius and Dr. Zaius characters in Planet of the Apes. Chambers was awarded the CIA’s Intelligence Medal of Merit for his involvement in the Canadian Caper, in which six American diplomats who had evaded capture during the 1979 Iran hostage crisis were smuggled out of the country. The incident was the basis of the film Argo, which won the 2012 Academy Award for Best Picture, and in which Chambers was played by John Goodman.