VFX OSCAR PREVIEW • AI & VFX • GLADIATOR II PROFILES: PAUL LAMBERT & LAURA PEDRO
WICKED TALES
WINNER OF THE 2024 FOLIO: OZZIE AWARD
Best Cover Design VFXV Winter 2024 Issue (Association/Nonprofit/Professional/Membership)
HONORABLE MENTIONS FOR THE 2024 FOLIO: EDDIE & OZZIE AWARD
Best Cover Design VFXV Fall 2023 Issue and Best Full Issue for Fall 2023 Issue
The Folio: Awards are one of the most prestigious national awards programs in the publishing industry. Congratulations to the VFXV creative, editorial and publishing team!
Thank you, Folio: judges, for making VFXV a multiple Folio: Award winner.
Welcome to the Winter 2025 issue of VFX Voice!
Thank you for being a part of the global VFX Voice community. We’re proud to be the definitive authority on all things VFX and winner of the prestigious 2024 FOLIO: Ozzie & Eddie Award in Publishing for Best Cover Design. In this issue, VFX Voice previews likely contenders for the Academy Award for Visual Effects and delves into the VFX wizardry of epic musical fantasy Wicked in our cover story. We step inside the Colosseum for Gladiator II and celebrate the animation work of studios in Latvia, France and Belgium in feline-friendly Flow. We also shine a bright light on VFX-driven live events, the game-changing industry impacts of AI, how streamers continue to expand the global demand for VFX, and the dramatic rise of large-format cinematic experiences. We deliver personal profiles of Goya Award-winning Spanish Visual Effects Supervisor Laura Pedro and multiple Oscar-winning Visual Effects Supervisor Paul Lambert, and highlight our dynamic VES Georgia Section. Dive in and meet the innovators and risk-takers who push the boundaries of what’s possible and advance the field of visual effects.
Cheers!
P.S. You can continue to catch exclusive stories between issues only available at VFXVoice.com. You can also get VFX Voice and VES updates by following us on X at @VFXSociety.
Kim Davidson, Chair, VES Board of Directors
Nancy Ward, VES Executive Director
FEATURES
8 THE VFX OSCAR: PREVIEW
Meet the contenders seeking to make VFX Oscar history.
26 VFX TRENDS: PREMIUM CINEMA
Inside the growing appetite for large-format experiences.
38 FILM: GLADIATOR II
Ridley Scott’s VFX/SFX teams bring new tech to the arena.
48 PROFILE: PAUL LAMBERT
Combining art and technology keys Oscar winner’s success.
54 SPECIAL FOCUS: AI & VFX
Expert insight into AI’s transformation of the VFX industry.
68 TELEVISION/STREAMING: VFX IN DEMAND
Streamers continue to reshape the global VFX pipeline.
76 COVER: WICKED
Reviving The Wizard of Oz with immersive, touchable VFX.
86 PROFILE: LAURA PEDRO
With success, VFX Supervisor evolves from protégé to mentor.
92 VR/AR/MR TRENDS: VFX ON STAGE
Visual tech powers next-generation, VFX-fueled live events.
100 ANIMATION: FLOW
Original all-European animated feature emerges as Oscar entry.
DEPARTMENTS
2 EXECUTIVE NOTE
106 THE VES HANDBOOK
108 VES SECTION SPOTLIGHT – GEORGIA
110 VES NEWS
112 FINAL FRAME – SPARTACUS
ON THE COVER: Glinda (Ariana Grande) and Elphaba (Cynthia Erivo) in Wicked (Image courtesy of Universal Pictures)
VFXVOICE
Visit us online at vfxvoice.com
PUBLISHER
Jim McCullaugh publisher@vfxvoice.com
EDITOR
Ed Ochs editor@vfxvoice.com
CREATIVE
Alpanian Design Group alan@alpanian.com
ADVERTISING
Arlene Hansen Arlene-VFX@outlook.com
SUPERVISOR
Ross Auerbach
CONTRIBUTING WRITERS
Naomi Goldman
Trevor Hogg
Chris McGowan
Barbara Robertson
Oliver Webb
ADVISORY COMMITTEE
David Bloom
Andrew Bly
Rob Bredow
Mike Chambers, VES
Lisa Cooke, VES
Neil Corbould, VES
Irena Cronin
Kim Davidson
Paul Debevec, VES
Debbie Denise
Karen Dufilho
Paul Franklin
Barbara Ford Grant
David Johnson, VES
Jim Morris, VES
Dennis Muren, ASC, VES
Sam Nicholson, ASC
Lori H. Schwartz
Eric Roth
Tom Atkin, Founder
Allen Battino, VES Logo Design
VISUAL EFFECTS SOCIETY
Nancy Ward, Executive Director
VES BOARD OF DIRECTORS
OFFICERS
Kim Davidson, Chair
Susan O’Neal, 1st Vice Chair
Janet Muswell Hamilton, VES, 2nd Vice Chair
Rita Cahill, Secretary
Jeffrey A. Okun, VES, Treasurer
DIRECTORS
Neishaw Ali, Alan Boucek, Kathryn Brillhart
Colin Campbell, Nicolas Casanova
Mike Chambers, VES, Emma Clifton Perry
Rose Duignan, Gavin Graham
Dennis Hoffman, Brooke Lyndon-Stanford
Quentin Martin, Maggie Oh, Robin Prybil
Jim Rygiel, Suhit Saha, Lopsie Schwartz
Lisa Sepp-Wilson, David Tanaka, VES
David Valentin, Bill Villarreal
Rebecca West, Sam Winkler
Philipp Wolf, Susan Zwerman, VES
ALTERNATES
Andrew Bly, Fred Chapman
John Decker, Tim McLaughlin
Ariele Podreider Lenzi, Daniel Rosen
Visual Effects Society
5805 Sepulveda Blvd., Suite 620
Sherman Oaks, CA 91411
Phone: (818) 981-7861 vesglobal.org
VES STAFF
Elvia Gonzalez, Associate Director
Jim Sullivan, Director of Operations
Ben Schneider, Director of Membership Services
Charles Mesa, Media & Content Manager
Eric Bass, MarCom Manager
Ross Auerbach, Program Manager
Colleen Kelly, Office Manager
Mark Mulkerron, Administrative Assistant
Shannon Cassidy, Global Manager
P.J. Schumacher, Controller
Naomi Goldman, Public Relations
OSCAR PREVIEW: NEXT-LEVEL VFX ELEVATES STORYTELLING TO NEW HEIGHTS
By OLIVER WEBB
Godzilla Minus One made history at last year’s 96th Academy Awards when it became the first Japanese film to be nominated for and win the Oscar for Best Visual Effects, and the first film in the Godzilla franchise’s 70-year history to be nominated for an Oscar. Will the 97th Academy Awards produce more VFX Oscar history? Certainly, VFX will again take center stage, with a number of pedigree franchises and dazzling sequels hitting movie screens in the past year. From collapsing dunes to vast wastelands, battling primates and America at war with itself, visual effects played a leading role in making 2024 a memorable, mesmerizing year for global audiences.
TOP: Dune: Part Two has significantly more action and effects than Dune: Part One, totaling 2,147 VFX shots. (Image courtesy of Warner Bros. Pictures)
OPPOSITE TOP TO BOTTOM: Editorial and postvis collaborated with the VFX team to create a truly unique George Miller action sequence for Furiosa: A Mad Max Saga. (Image courtesy of Warner Bros. Pictures)
The VFX team at Framestore delivered 420 shots for Deadpool & Wolverine, while Framestore’s pre-production services (FPS) delivered 900-plus shots spanning previs, techvis and postvis. (Image courtesy of Marvel Studios)
Wētā FX delivered 1,521 VFX shots for Kingdom of the Planet of the Apes. Remarkably, there are only 38 non-VFX shots in the film. (Image courtesy of Walt Disney Studios Motion Pictures)
Dune: Part One won six Academy Awards in 2022, including Best Achievement in Visual Effects, marking Visual Effects Supervisor Paul Lambert’s third Oscar. Released in March, Dune: Part Two is an outstanding sequel and has significantly more action and effects than the first installment, totaling a staggering 2,147 visual effects shots. The film is a strong contender at this year’s Awards. “It was all the same people from Part One, so our familiarity with Denis’s [Villeneuve] vision and his direction allowed us to push the boundaries of visual storytelling even further,” Lambert says.
The production spent a lot more time in the desert on Dune: Part Two than on Dune: Part One. Cranes were brought in and production built roads into the deep deserts of Jordan and Abu Dhabi. Concrete slabs were also built under the sand so that the team could hold cranes in place for the big action sequences. “A lot of meticulous planning was done by Cinematographer Greig Fraser to work out where the sun was going to be relative to particular dunes,”
Lambert explains. “We had an interactive view in the desert via an iPad that gave us a virtual view of these enormous machines at any time of day. This allowed us, for example, to figure out the shadows for the characters running underneath the spice crawler legs and the main body of the machine. VFX was then able to extend the CG out realistically, making it all fit in the same environment. Dune: Part One was a collaborative experience, but Dune: Part Two was even more so as we went for a much bigger scale with lots more action.”
The first topic discussed during pre-production among department heads and Villeneuve was the worm-riding scenes. Villeneuve envisaged Paul Atreides mounting the worm from a collapsing dune – an idea that immediately struck the team as visually stunning and unique. The challenge lay in making this concept and the rest of the worm-riding appear believable. Filming for the worm sequences took place in both Budapest and the UAE. A dedicated ‘worm unit’ was established in Budapest for the months-long shoot. The art department built a section of the worm on an SFX gimbal surrounded by a massive 270-degree sand-colored cone. “This setup allowed the sun to bounce sand-colored light onto the actors and stunt riders who were constantly blasted with dust and sand,” Lambert describes. “Shooting only occurred on sunny days to maintain the desert atmosphere. Most of the actual worm-riding shots were captured here, except for the widest shots, which were later augmented with CG. In post-production, the sand-colored cone was replaced with extended, sped-up, low and high-flying helicopter footage of the desert.”
For the collapsing dune scene, an area was scouted in the desert, and then a 10-foot-high proxy dune crest was created on flat desert. Three concrete tubes attached to industrial tractors were buried in this proxy dune and were used to create the collapsing effect while a stunt performer, secured by a safety line, ran across and descended into the collapsing sand as the tubes were pulled out. “We could only attempt this once a day because of the need to match the light to the real dune, and the re-set to rebuild the crest took a few hours. On the fourth day, Denis had the shot he wanted. Post-production work extended the dune’s apparent height to match the real dune landscape. The sequence was completed with extensive CG sand simulations of the worm moving through dunes, all contributing to the believability of this extraordinary scene.”
Mad Max: Fury Road was nominated for Best Visual Effects at the 2016 Academy Awards. Spin-off prequel/origin story Furiosa:
A Mad Max Saga, the fifth installment in the Mad Max franchise, is the first of the films not to focus on the eponymous Max Rockatansky. DNEG completed 867 visual effects shots for the finished film. When DNEG came onboard with the project, main conversations were focused on the scope of the film and the variety of terrains and environments. “Furiosa covers much more of the Wasteland than Fury Road did and details a lot of places that had only been touched on previously,” notes DNEG VFX Supervisor Dan Bethell. “It was really important that each environment have its own look, so as we travel through the Wasteland with these characters, the look is constantly changing and unique; in effect, each environment is its own character.”
TOP: Earlier footage from Gladiator was blended into Gladiator II flashbacks and live-action, especially original Gladiator crowd footage and in the arenas. The Colosseum set for Gladiator II was detailed as closely as possible to the first film.
(Photo: Aidan Monaghan. Courtesy of Paramount Pictures)
BOTTOM: Blowing up the Lincoln Memorial for Civil War was shot in a parking lot in Atlanta. The single-story set was extended with VFX and the explosion grounded in real footage. Soldiers fired at a bluescreen with a giant hole in the middle. (Image courtesy of A24)
The Stowaway sequence was particularly challenging for the visual effects team to complete. “Apart from being 240 shots long and lasting 16 minutes, it had a lot of complex moving parts; vehicles that drive, vehicles that fly, dozens of digi-doubles, plenty of explosions and, of course, the Octoboss Kite!” says Bethell. “Underneath it all, a lot of effort also went into the overall crafting of the sequence, with editorial and postvis collaborating with our VFX team to create a truly unique George Miller action piece. The Bullet Farm Ambush was also a big challenge, although one of my favorites. Choreographing the action to flow from the gates of Bullet Farm down into the quarry as we follow Jack, then culminating with the destruction of, well, everything was very complex. We work often on individual shots, but to have over a hundred of them work together to create a seamless sequence is tough.”
Working on a George Miller project is always a unique experience for Bethell. “Everything is story-driven, so the VFX has to be about serving the characters, their stories and the world they inhabit. It’s also a collaboration; the use of VFX to support and enhance work from the other film departments such as stunts, SFX, action vehicles, etc. I enjoy that approach to our craft. Then, for me, it’s all about the variety and scope of the work. It’s rare to get to work on a film with such a vast amount of fresh and interesting creative and technical challenges. On Furiosa, every day was something new, from insane environments and FX to the crazy vehicles of the Wasteland – this movie had it all!”
Alex Garland’s Civil War required over 1,000 visual effects shots as Garland pushed the importance of realism. “The more grounded and believable we could make Civil War, the scarier it would be,” notes Production VFX Supervisor David Simpson. “We deliberately avoided Hollywood conventions and set a rule that all
TOP AND BOTTOM: Twisters features six tornadoes for which ILM built 10 models. (Images courtesy of Universal Pictures)
inspiration should be sourced from the real world. Every element VFX brought to the film had a real-world reference attached to it –drawing from documentaries, news footage, ammunition tests and war photography.”
Due to the strict rules about shooting from the skies above Washington D.C., capturing the aerial shots of the Capitol would have been impossible to do for real. This resulted in full CG aerial angles over D.C. and the visual effects team building their own digital version, which covered 13 square miles and 75 distinct landmarks, thousands of trees, buildings, lampposts and a fully functioning system of traffic lights spread over 800 miles of roads. “Plus, there are roadworks, buildings covered in scaffolding, parked cars, tennis courts and golf courses,” Simpson adds. “One of my favorite touches is that our city has cranes – because all major cities are constantly under construction!”
The visual effects team went even further, building a procedural system to populate the inside of offices. “When the camera sees inside a building, you can make out desks, computers, potted plants, emergency exit signs, water coolers. The buildings even have different ceiling-tile configurations and lightbulbs with slight tint variations. We literally built inside and out! Once the city was complete, it was then turned into a war zone with mocap soldier skirmishes, tanks, police cars, explosions, gunfire, helicopters, debris, shattered windows and barricades.”
Robert Zemeckis’ Here follows multiple generations of couples and families that
TOP AND BOTTOM: Robert Zemeckis’ Here follows multiple generations of couples and families that have inhabited the same home for over a century. The movie required de-aging Tom Hanks and Robin Wright. Nearly the entire movie was touched by VFX in some form or another. (Images courtesy of TriStar Pictures/Sony)
BOTTOM: Dune: Part Two was even more of a collaborative experience than Dune: Part One, on a bigger scale with more action. (Image courtesy of Warner Bros. Pictures)
have inhabited the same home for over a century. Three sequences in the film were particularly CG-dominant, the first being the neighborhood reveal, which was the last shot in the movie. “It was challenging mainly because it was subject to several interpretations, compositions and lighting scenarios, and the build was vast,” says DNEG VFX Supervisor John Gibson. “The sequence surrounding the house’s destruction was also incredibly complex due to the interdependence of multiple simulations and elements, which made making changes difficult and time-consuming.”
The biggest challenge was the ‘grand montage,’ which required seamless transitions through various time periods and environments. “The Jurassic Era beat was especially challenging in that we needed to flesh out a brand-new world that had real-time elements mixed with accelerated time elements, and they all had to be set up to transition smoothly into the superheated environment and maintain a consistent layout,” Gibson details. “By far the most challenging aspect of the grand montage was the tree and plant growth. As it would have been very difficult to modify existing plant growth systems to match our cornucopia of plant species using the existing software available for foliage animation and rendering, we had to develop a host of new techniques to achieve the realistic results we were after.”
Gibson lauds the collaborative spirit of the team. He cites their willingness to experiment, learn new techniques and support each other as instrumental in overcoming the challenges of the condensed production schedule. “Boundaries between departments dissolved, folks seized work to which they thought they could contribute, there was little hesitation to bring in and learn
“I love Marvel’s collaborative approach to VFX – things are often hectic at the end, but that is because stuff is still being figured out, largely because it’s complicated! In this melting pot, the filmmakers look to the artists for answers, so your ideas can end up in the film. For hard-working VFX artists, nothing is better than that.”
new software or techniques, and we brainstormed together, constantly looking for better and better ways to get results. That’s what stood out to me: the cohesion within the team.”
Wētā FX delivered 1,521 VFX shots for Kingdom of the Planet of the Apes. Remarkably, there are only 38 non-VFX shots in the film. VFX Supervisor Erik Winquist ran through a gauntlet of challenges, “from a cast of 12 new high-res characters whose facial animation needed to support spoken dialogue, to a minute-long ‘oner’ set in an FX extravaganza with 175 apes and 24 horses to choreograph,” he notes. “The scenes that I’d say were the most challenging were those that featured large water simulations integrating with on-set practical water, digital apes and a human actor. The bar for reality was incredibly high, not only for the water itself but also in having to sell that water’s interaction with hairy apes, often in close-ups. It was an incredibly satisfying creative partnership for me and the whole team, working with [director] Wes Ball. From the start, he had a clear vision of what we were trying to achieve together and the challenge was about executing that vision. It gave us unshifting goal posts that we could plan to, and
TOP: Cassandra inserting her hand through Mr. Paradox’s head was one of the many challenging VFX shots required for Deadpool & Wolverine. (Image courtesy of Marvel Studios)
MIDDLE AND BOTTOM: Framestore VFX Supervisor Robert Allman praises Marvel’s collaborative approach to VFX on Deadpool & Wolverine, which he describes as a “melting pot” for filmmakers and artists. (Images courtesy of Marvel Studios)
BOTTOM: Strict rules about shooting from the skies above Washington D.C. prevented capturing aerial shots of the Capitol for Civil War, which resulted in full CG aerial angles over D.C. and the VFX team building a digital version covering 13 square miles and 75 distinct landmarks. (Image courtesy of A24)
we knew that we were in safe hands working on something special together. That knowledge created a great vibe among the crew.”
Deadpool & Wolverine has grossed more than $1.264 billion at the box office, a staggering feat. The VFX team at Framestore delivered 420 shots, while Framestore’s pre-production services (FPS) delivered 900-plus shots spanning previs, techvis and postvis. Robert Allman served as Framestore VFX Supervisor on the film. “I love Deadpool, so it was tremendously exciting to be involved in making one,” he explains. “However, more than this, I love Marvel’s collaborative approach to VFX – things are often hectic at the end, but that is because stuff is still being figured out, largely because it’s complicated! In this melting pot, the filmmakers look to the artists for answers, so your ideas really can end up in the film. For hard-working VFX artists, nothing is better than that.”
The atomizing of Cassandra in the final sequence was technically tough to achieve. Making a completely convincing digital human – and the atomizing effects as detailed and dynamic as the shots demanded – was a huge challenge. “Most problematic was creating an effect within the borders of good taste when the brief – ‘disintegrate the face and body of a human’ – seems to call for gory and horrifying. Many takes of this now lie on the digital cutting-room floor. An early wrong turn was to reference sandblasted meat and fruit, for which there are a surprisingly large number of videos on YouTube. However, this real-world physics gave rise to some stomach-churning simulations for which there was little appetite among filmmakers and artists alike. In the end, the added element of searingly hot, glowing embers sufficiently covered the more
TOP: More shooting time was spent in the desert on Dune: Part Two than on Dune: Part One. Cranes were brought in and production built roads deep into the deserts of Jordan and Abu Dhabi, UAE. (Image courtesy of Warner Bros. Pictures)
TOP AND BOTTOM: Traveling through the Wasteland with the characters of Furiosa: A Mad Max Saga, the look is constantly changing and unique. Each environment had to have its own look and, in effect, became its own character. (Images courtesy of Warner Bros. Pictures)
visceral elements of the gore to make the whole thing, while still violent, more palatable to all concerned.”
Ridley Scott’s Gladiator was met with critical acclaim upon its release in 2000. It won five awards at the 73rd Academy Awards, including Best Visual Effects. Nearly 25 years later, Gladiator II hits screens as one of the most anticipated releases of the year. Last year, Scott’s highly anticipated Napoleon was also nominated for Best Visual Effects, and Scott’s films are, more often than not, strong contenders at the Awards.
Work for Gladiator II was split between Industrial Light & Magic, Framestore, Ombrium, Screen Scene, Exceptional Minds and Cheap Shot, with 1,154 visual effects shots required for the film. For Visual Effects Supervisor Mark Bakowski, the baboon fight sequence was particularly daunting. “Conceptually, this was a tough one,” he explains. “Very early on, Ridley saw a picture of a hairless baboon with alopecia. It looked amazing and terrifying but also somewhat unnatural. Most people know what a baboon looks like, but a baboon with alopecia looks a bit like a dog. Framestore did a great job and built a baboon that looked and moved just like the reference, but viewed from certain angles and in action, unfortunately, it didn’t immediately sell ‘baboon.’ It’s one thing to see
one in a nature documentary, but to have one in an action sequence with no introduction or explanation was a visual challenge.”
Bakowski explains that working with Ridley Scott was a crazy and unique experience. “So many cameras and such scale, it’s a real circus – and Ridley’s very entertaining. He talks to everyone on Channel 1 on the radio, so you can follow along with his thought process, which is by turns educational, inspirational and hilarious. A lovely man. I enjoyed working with him. The VFX team was all fantastic and so capable both on our production side and vendor side. I’ve never worked with such an amazing bunch on both sides. Our production team was a well-oiled machine – sometimes in both senses but mainly in terms of efficiency and, vendor side, it’s great just being served up these beautiful images by such talented people. Both made my job so much easier. The locations were stunning, both scouting and shooting – 99% of the film was shot in Malta and Morocco, so you’re there for a long time; you get to immerse yourself in it. That was multiplied by the fact we got impacted by the strikes, so we ended up going back to Malta multiple times. I felt I got to know the island quite well and loved it and the people. That said, I won’t be going back to Malta or Morocco for a holiday soon. I feel like I’ve had my fill for a while!”
Other outstanding releases that could potentially compete for Best Visual Effects include Twisters, which took everyone by storm earlier in 2024 (with ILM as the main vendor), Godzilla x Kong: The New Empire featuring compelling work by Wētā, Scanline VFX, DNEG and Luma Pictures, among others, and A Quiet Place: Day One, a fresh, frightening addition to the Quiet Place series.
TOP AND BOTTOM: One of the biggest challenges facing the VFX team on Kingdom of the Planet of the Apes was the cast of 12 new high-res characters whose facial animation needed to support spoken dialogue.
(Images courtesy of Walt Disney Studios Motion Pictures)
NEXT-GENERATION CINEMA: THE NEW STANDARD IS ‘PREMIUM’ – AND IT’S WORKING
By CHRIS McGOWAN
Cinema audiences are increasingly showing an appetite for higher-resolution, higher-quality movies, often with large-format presentations and/or 4D effects. Soon, they will also be exploring AR movie augmentations and an increasing number of film-related experiences.
“When audiences began returning to theaters after the pandemic, they wanted experiences that they couldn’t get in their homes. Now, in a post-pandemic world, moviegoers want something premium and special for their time, and audiences seek out IMAX because it truly is a premium experience,” says Bruce Markoe, Senior Vice President and Head of Post & Image Capture for IMAX.
IMAX is a pioneer and leader in premium cinema. Markoe notes, “As of June 30, 2024, there were 1,780 IMAX systems (1,705 commercial multiplexes, 12 commercial destinations, 63 institutional) operating in 89 countries and territories. The numbers speak for themselves. In 2023, IMAX delivered one of the best years in our history, with nearly $1.1 billion in global box office. And while last year’s Hollywood strikes dealt the entire entertainment business a temporary setback, it was just that: temporary. We’ve had an incredible 2024 to-date at IMAX, marked by several recent box-office successes, including Inside Out 2, Twisters and Deadpool & Wolverine, and as we look ahead, this trend shows no signs of slowing down.”
Markoe explains, “Every IMAX location in the world is built to our precise standards – they are designed, built and carefully maintained with meticulous care, and every location is customized for optimal viewing experiences. Only IMAX uses acousticians, laser alignment and custom theater geometry, combined with our Academy Award-winning projection technology, precision audio and optimized seating layouts, to ensure every element is immersive by design.”
TOP: Gladiator II was given the IMAX “Maximum Image” treatment in November. (Image courtesy of Paramount Pictures)
OPPOSITE TOP TO BOTTOM: Joker: Folie à Deux launched on IMAX in October. (Image courtesy of Warner Bros. Pictures)
Dolby Cinema is a premium cinema experience created by Dolby Laboratories that combines proprietary visual and audio technologies such as Dolby Vision and Dolby Atmos. (Image courtesy of Dolby)
The world’s largest 4DX theater is the Regal Times Square located at 247 West 42nd St. in New York City. (Image courtesy of Full Blue Productions and 4DX)
Markoe continues, “Our theaters are calibrated daily to ensure audiences get perfectly tuned sound and image every time, at every location, regardless of where in the world it is. We also have incredible partnerships with filmmakers and studios. We’re seeing a dramatic shift to IMAX among filmmakers and studios, who are increasingly creating specifically for the IMAX platform. [And,] we have more ‘Filmed for’ IMAX titles in production than at any time in our history,” Markoe says. “We are dramatically expanding our ‘Filmed for’ IMAX program to feature many of the world’s most iconic filmmakers and directors alongside rising talents in the industry. To date, we have 15 ‘Filmed for’ IMAX titles set for release this year – more than double any previous year – as filmmakers and studios from Hollywood and international territories increasingly create uniquely optimized versions for the IMAX platform.”
Markoe notes, “To meet growing demand among filmmakers to shoot in IMAX, the company is developing and finalizing the roll-out of four next-generation IMAX 15/65mm film cameras. IMAX tapped such prolific filmmakers and cinematographers as Christopher Nolan, Jordan Peele and Hoyte van Hoytema, among others, to identify new specs and features for the prototype. The new cameras recently entered production.”
IMAX launched the “Filmed for IMAX” program in 2020 to certify digital cameras that were officially approved to create IMAX-format films. Markoe explains, “The program is a partnership between IMAX and the world’s leading filmmakers to meet their demands for the ultimate IMAX experience. Working directly with IMAX, the program allows filmmakers the ability to fully leverage the immersive IMAX theatrical experience, including expanded aspect ratio and higher resolution. Through the program, IMAX certifies best-in-class digital cameras from leading brands, including ARRI, Panavision, RED Digital Cinema and
BOTTOM: The E3LH QuarterView Dolby Vision Cinema Projection System. Dolby Vision is part of the company’s Dolby Cinema package, which includes the Dolby Atmos sound system and special theater treatments to reduce ambient light and enhance comfort. (Image courtesy of Dolby)
Sony, to provide filmmakers with the best guidance to optimize creatively how they shoot to best work in the IMAX format when paired with IMAX’s proprietary post-production process.”
“IMAX continues to innovate on the cutting edge of entertainment technology,” Markoe states. “Our IMAX with Laser system was recently recognized with a Scientific and Technical Academy Award. Combining our industry-leading technology and new tools with the enthusiastic embrace by filmmakers to specifically and creatively design their movies to be the most immersive, high-quality presentation, we continue to find new and innovative ways to expand the IMAX canvas moving forward. We see an opportunity for our platform to serve as a conduit for sports leagues to expand their global reach and provide a launchpad for projects from some of the world’s most iconic music acts. The future of cinema is multi-pronged, combining visionary works from Hollywood and local language blockbusters, original documentaries and exclusive events.”
DOLBY
“The appetite for premium cinema is huge, and it’s a clear factor in what’s drawing people to see movies in theaters,” says Jed Harmsen, Dolby Vice President and General Manager of Cinema & Group Entertainment. “2023 marked Dolby Cinema’s strongest year in history at the box office, with U.S. Dolby Cinema ticket sales eclipsing pre-pandemic levels, up 7% from 2019. Furthermore, Dolby boasts the highest average per-screen box office among all premium large-format offerings, which is a testament to the consumers’ recognition and value of enjoying their films in Dolby.”
According to Comscore data, the domestic “large-format” gross box office was up 10.1% in 2023 vs. 2019, illustrating how the popularity of premium cinema is growing and overtaking pre-pandemic levels. Also, per Comscore, market share of the domestic large-format gross box office (in relation to the entire domestic gross box office) grew from 9.7% in 2019 to 13.3% in
TOP: The Wild Robot landed on IMAX in September. (Image courtesy of Universal Pictures)
2023. According to Harmsen, “Premium cinema experiences are becoming a larger share of all cinema experiences. It’s one of the reasons we continue to work with our cinema partners to make Dolby Vision and Dolby Atmos available to as many moviegoers around the world as possible. We created Dolby Cinema to be the best way for audiences to see a movie, featuring the awe-inspiring picture quality of Dolby Vision together with the immersive sound of Dolby Atmos, all in a fully Dolby-designed theater environment. There are around 275 Dolby Cinemas globally.”
Harmsen adds, “Our global footprint for Dolby Cinema spans 28 exhibitor partners and 14 countries, with the first Dolby Cinema opening in 2014 in the Netherlands. We’re excited to have true collaborations with multiple trusted partners and advocates like AMC, who have been a huge proponent in bringing the magic of the Dolby Cinema experience to moviegoers.”
Harmsen underscores the value of Dolby sound and vision to the viewing experience. “Dolby Vision allows viewers to see subtle details and ultra-vivid colors with increased contrast ratio and blacker blacks, delivering the best version of the picture that the filmmaker intended. Dolby Atmos offers powerful, immersive audio, allowing audiences to feel a deeper connection to the story with sound that moves all around them. And Dolby’s unique theater design allows audiences to experience both technologies in the best possible way – by limiting ambient light, ensuring an optimal view from every premium seat and more.”
“Dolby Vision and Dolby Atmos have revolutionized premium movie-going and have been embraced widely by creators and exhibitors, allowing us to bring Dolby-powered media and entertainment to more and more audiences,” Harmsen says. “To date, more than 600 theatrical features have been released or are confirmed to be released in Dolby Vision and Dolby Atmos, including recent box-office hits like Inside Out 2, Dune: Part Two, Deadpool & Wolverine and more.” Harmsen concludes, “We see exhibitors continuing to outfit their auditoriums to support more premium cinema experiences to meet the demand we’re seeing from moviegoers. At Dolby, we see premium as the new standard in cinema. It’s clear audiences worldwide do as well.”
TOP TO BOTTOM: Inside Out 2 hit IMAX giant screens in 2024. (Image courtesy of Pixar/Disney)
Twisters was unleashed on IMAX in 2024. (Image courtesy of Universal Pictures)
The IMAX 70mm Film Camera. One of the IMAX film cameras used by Christopher Nolan to shoot Oppenheimer. (Image courtesy of IMAX)
4DX
4D cinema adds motion seats and multi-sensory effects to blockbuster movies. The experience adds about $8 to each ticket. South Korea’s CJ 4DPLEX is the leader in this area. Globally, there are some 750 4DX screens affiliated with the company, which has teamed up with partners like Regal Cinemas. According to the Regal site, “4D movies utilize physical senses to transport viewers into a whole new viewing experience. Regal’s 4DX theaters are equipped with motion-enabled chairs, which create strong vibrations and sensations, as well as other environmental controls for simulated weather or other conditions such as lightning, rain, flashing (strobe) lights, fog and strong scents.”
SPHERE
On the outskirts of the Las Vegas Strip, Sphere is a literal expansion of what cinema is. When not hosting concerts or events, Sphere shows high-resolution films on a wraparound screen that is 240 feet tall and covers 160,000 square feet, with 16K by 16K resolution. Digital Domain worked on the visual effects of Darren Aronofsky’s movie Postcard from Earth, shown in the gigantic spherical venue. “Working on Postcard from Earth for the Sphere was an extraordinary experience, marking our debut in such an impressive venue. Collaborating with Darren Aronofsky, a filmmaker whose work we’ve long admired, added an extra layer of excitement as we brought his vision to life in this dramatic setting,” comments Matt Dougan, Digital Domain VFX Supervisor.
TOP: On the outskirts of the Las Vegas Strip, Sphere is a literal expansion of cinema. (Image courtesy of Sphere Entertainment)
BOTTOM: The 4DX Cinema Sunshine Heiwajima movie theater in BIG FUN Heiwajima, an entertainment complex in Tokyo. (Image courtesy of 4DX)
TOP TO BOTTOM: Deadpool & Wolverine made a historic global IMAX debut last July. (Image courtesy of Walt Disney Studios Motion Pictures)
With the Apple Vision Pro, stunning panorama photos shot on the iPhone expand and wrap around the user, creating the sensation that they are standing where the photo was taken. (Image courtesy of Apple Inc.)
The IMAX Commercial Laser Projector designed specifically for multiplexes. (Image courtesy of IMAX)
NETFLIX HOUSE
Netflix House is another next-generation cinema addition. The experiential entertainment venue will bring beloved Netflix titles to life, beginning with locations in malls in Dallas, Texas, and King of Prussia, Pennsylvania, in 2025. “Building on previous Netflix live experiences for Bridgerton, Money Heist, Stranger Things, Squid Game and Netflix Bites, Netflix House will go one step further and create an unforgettable venue to explore your favorite Netflix stories and characters beyond the screen year-round,” according to Henry Goldblatt, Netflix Executive Editor, on the Netflix website.
“At Netflix House, you can enjoy regularly updated immersive experiences, indulge in retail therapy and get a taste – literally – of your favorite Netflix series and films through unique food and drink offerings,” says Marian Lee, Netflix’s Chief Marketing Officer, on the Netflix site. “We’ve launched more than 50 experiences in 25 cities, and Netflix House represents the next generation of our distinctive offerings. The venues will bring our beloved stories to life in new, ever-changing and unexpected ways.”
AR
Augmented reality is expected to expand the experience of moviegoing, adding interactive and immersive elements to movie posters and trailers, interaction with characters, personalized narrative and cinematic installations. “Apple Vision Pro undoubtedly brings new immersive opportunities to the table with its advanced mixed-reality capabilities offering unique ways to engage with stories,” comments Rob Bredow, ILM Senior Vice President, Creative Innovation and Chief Creative Officer. He explains that cinema is a highly mature art form “with well-established storytelling traditions and audience expectations. While Apple Vision Pro can be used to help create compelling new experiences, it’s not about replacing these mediums [such as film] but rather complementing them. The device opens doors to hybrid forms of entertainment that blend interactivity and immersion in ways that are uniquely suited to its technology. [Cinema] will continue to grow and thrive, enriched by these new possibilities, but certainly not overshadowed by them.”
Looking forward, one of the changes in next-generation cinema may be one of content. IMAX’s Markoe says, “Audiences are increasingly interested in must-see events – things that cannot be experienced the same way at home on a TV. Recent events, such as our broadcast of the 2024 Paris Olympics Opening Ceremony or broadcasting the 2024 NBA Finals to select theaters in China, bring these larger-than-life experiences to audiences in a way that can’t be replicated elsewhere. Increasingly, concert films like Taylor Swift: The Eras Tour have appealed to audiences who want to feel fully immersed in the action.”
Indeed, the future of cinema looks to lie more and more in premium cinema as well as immersive experiences that expand what movies are today.
BANDING TOGETHER ONCE AGAIN FOR GLADIATOR II
By TREVOR HOGG
Not often does a film crew get to reunite two decades later to make a sequel that makes swords and sandals cool again, but that is exactly the case with Gladiator II, where Ridley Scott collaborates once again with Production Designer Arthur Max and Special Effects Supervisor Neil Corbould, VES. Russell Crowe as Maximus is not returning to the Colosseum to wreak havoc on the Roman Empire; instead, the task has been given to his equally determined son Lucius Verus (Paul Mescal).
“It was an amazing experience to see the Colosseum back up again,” states Corbould. “It was like stepping back in time 20-odd years because it was an exact replica of what we did before. I felt that the first one was damn good. To revisit this period again and take it a step further was quite an incredible and daunting task.” The scope has been expanded. “We were using the same old tools, like physical builds and handmade craftsmanship, that we always did,” remarks Max. “Only this time around, the digital technologies have come into that world as well, and that enlarged and increased the scope of what we could do in the time and on budget. It has been a gigantic shift from the first one to the sequel.”
OPPOSITE TOP TO BOTTOM: From left: Stunt Coordinator Nikki Berwick; VFX Supervisor Mark Bakowski; DP John Mathieson; Prosthetics Designer Conor O’Sullivan; Director Ridley Scott and Head Stunt Rigger Christopher Manger (ground) discuss the gladiator battle featuring the animatronic rhino created by Neil Corbould and his special effects team.
The advancements in technology allowed for more of the Rome set to be built physically for Gladiator II than for the original film.
Visual ties still exist between the original and the sequel. “We wanted people to be able to recognize the [different] world from the first to the second,” Max notes. “It was also opportunistic of us to try to use some of the earlier footage to blend in. We did that in flashbacks and in the live-action, where we produced some of the Gladiator’s original crowd footage. We tried to match the sets in actual detail, particularly in the arenas, both provincial and in the capital – like the Colosseum set – as closely as possible to the first one. There were changes, but they were subtle. That was a nod to economical filmmaking. Why waste the time shooting crowds cheering when you have it in the can already? We did a few of those kinds of things.”
Images courtesy of Paramount Pictures.
TOP: Lucius Verus (Paul Mescal) seeks vengeance against Roman General Marcus Acacius (Pedro Pascal).
“Ridley said, ‘I want to have a rhino there.’ I spoke to Mark Bakowski [Visual Effects Supervisor] about it. I said, ‘We can create a remote-drive gimbal rig underneath, which is completely wireless, with a little six-axis motion base, and a muscle suit that we put a textured skin on with the head and body.’ Then Ridley said, ‘I want it to do 40 miles per hour and turn on a dime.’ That was like, ‘Oh, Christ. Another thing!’ But we did it. It was powered by flow-cell batteries and used electric car motors. This thing was lethal.”
—Neil Corbould, Special Effects Supervisor
Also coming in handy was the Jerusalem set from Kingdom of Heaven, which was repurposed as a Numidian coastal fort attacked by the Roman fleet. “The technology of water software – thank you, James Cameron and Avatar and other productions – had evolved to such a degree of sophistication that it made sense. Also, a credit to Neil Corbould, who found an incredible all-wheel-drive remote-control platform that was used for transporting enormous pieces of industrial technology great distances, like cooling chambers of nuclear power stations. We had a couple of those to put our ships on. This is where we were innovative,” Max states.
Corbould was inadvertently responsible for a cut sequence appearing in the sequel. He recalls, “I was going through some of my old archive stuff of the original Gladiator and found the storyboards of the rhino. After the meeting finished, I said, ‘By the way, Ridley, I found these.’ I put them on the desk and he went, ‘Wow! This is amazing. We’ve got to do this.’ And that’s how the rhino came about. It was like, ‘Oh, Christ, I didn’t think he would do that!’ Then Ridley said, ‘I want to have a rhino there.’ I spoke to Mark Bakowski [Visual Effects Supervisor] about it. I said, ‘We can create a remote-drive gimbal rig underneath, which is completely wireless, with a little six-axis motion base, and a muscle suit that we put a textured skin on with the head and body.’ Ridley said, ‘I want it to do 40 miles per hour and turn on a dime.’ That was like, ‘Oh, Christ. Another thing!’ But we did it. It was powered by flow-cell batteries and used electric car motors. This thing was lethal. It was good and could move around. We didn’t do it like a conventional buggy. We did it like the two-drive wheels were on the side, and we had the front and back wheels in the middle, which were stabilizing wheels. We were driving it like a JCB excavator around the arena; that, in conjunction with the movement of the muscle suit and the six axes underneath, gave some good riding shots of the guy standing on top of it.”
TOP TO BOTTOM: Pedro Pascal, Ridley Scott and Paul Mescal share a light moment in between takes.
The entrance arch of the Colosseum had to be enlarged to allow the ships to pass through.
The Colosseum had to be constructed higher than the original to accommodate the CG water needed for naval battles.
“We tried to match the sets in actual detail, particularly in the arenas, both provincial and in the capital – like the Colosseum set – as closely as possible to the first [Gladiator]. There were changes, but they were subtle. That was a nod to economical filmmaking. Why waste the time shooting crowds cheering when you have it in the can already?”
—Arthur Max, Production Designer
Not everything went according to plan, in particular the naval battle in the Colosseum. “Life got in the way because of the strikes,” remarks Visual Effects Supervisor Mark Bakowski, who was a newcomer to the project. “The Colosseum was originally to be more wet-for-wet and less dry-for-wet. But it works well in the end. There is a speed of working that suits Ridley Scott; he shoots quickly and likes to move quickly. That worked, shooting it dry-for-wet because Ridley could get his cameras where he wanted, reset quickly and get the shots; whereas, there are more restrictions being in the proper wet kind of thing. When it comes to integration, I was wary of having too much of a 50/50 split where you have to constantly match one to the other. We had a certain style of shot that was wet-for-wet, as in someone falling into the water, or Neil did some amazing impacts of the boats where the camera is skimming along the surface. Those made sense to do wet-for-wet because there are lots of interactions close to the camera.” The water went through a major design evolution. Bakowski adds, “We started off looking at the canals of Venice as our reference for the Colosseum, and then we started to drift. Ridley was showing pictures of his swimming pool in Los Angeles and saying, ‘Can you move it that way?’ It took us a while to find the look of the Colosseum water, but we got there in the end.” Technological advances allowed for the expansion of Rome.
TOP: The gladiator battle with the rhino was revived for the sequel when Neil Corbould showed Ridley Scott the original storyboards.
BOTTOM: There were times when DP John Mathieson had to coordinate as many as 11 cameras for action sequences.
“We built the boats in the U.K., shipped them out and then assembled them out there, which was the right thing to do because it was almost impossible to get that material in Morocco or the sheer quantity of steel and timber we needed. We put them in 30 40-foot trucks going across the Atlas Mountains, and as they were arriving, we were assembling them. It was like clockwork. On the day when we were shooting, we were still painting bits. It was that close.”
—Arthur Max, Production Designer
TOP: The Colosseum naval battle was a combination of dry-for-wet and wet-for-wet photography.
BOTTOM: The attack on the Numidian coastal fort was shot using the landlocked Jerusalem set from Kingdom of Heaven, with boats moved around on self-propelled modular transporters (SPMT) and CG water added in post-production.
“We built much more than we did on the first one in terms of the amount of site we covered,” Max explains. “We went from one end to the other. CNC sculptures and casting techniques were expedited greatly on the sequel because we had the technology. The timeframe was compressed from getting a finished drawing or working drawing to the workshop floor and also being able to farm out digital files, not only to one workshop but to multiple workshops simultaneously, which increased the speed of production. To a large extent, we met the demands of Ridley’s storyboards, but there was still a large amount of [digital] set extensions.” The accomplishment was impressive. “It was an amazing set to wander around,” Bakowski states. “We had like a kit of parts that we could dress in the background. Technically, a certain hill should be in a particular place. We established it in this shot, or a certain building should be at a specific angle. But if it didn’t look good, of course, it moved because Ridley is a visual director; his work is like a moving painting every time, and we responded to that by trying to make everything beautiful, which was the main thing.”
Visual effects took over some tasks previously looked after by special effects. “In Gladiator, we did a lot of launching arrows, but in this one, we didn’t do any of that,” Corbould reveals. “It was all Mark [Bakowski]. That allowed Ridley to shoot at the speed he did, which was good. I concentrated on the fires, dust and
explosions in the city. But we only shot that once, with 11 cameras.” A major contribution was the practical black smoke. Corbould describes, “We were burning vegetable oil, and when you atomize it at high pressure, it vaporizes, ignites and gives you this amazing black smoke. Everyone smells like a chip shop! We had six of these massive burners that were dotted around the set, and then we had to chase the wind. We would have some wind socks up or look at the flags. You had to try to anticipate it because of the speed at which Ridley works. We must have had 16 people just doing black smoke. We built a beach as well. Ridley said, ‘It’s supposed to be on the coast, and I want an 80-meter stretch of beach with waves washing the bodies onto the shore.’ We constructed an 80 x 80-meter set of beach. I made this wave wall, which was basically one section of the wall, but the whole 80 meters of it pushed in. It’s a bit like a wave tank. We put a big liner in it and sprayed sand over the liner; that gave it a nice, majestic wash-up against the beach.”
TOP: Visual ties still exist between the original and the sequel, such as the Colosseum.
BOTTOM: The baboons fighting the gladiators was a complicated sequence that required multiple passes.
ILM led the charge in creating the 1,154 visual effects shots, followed by Framestore, Ombrium VFX, Screen Scene, Exceptional Minds and Cheap Shot VFX. “The baboons were a fun ride and tough,” Bakowski remarks. “The speed that Ridley likes to shoot is fantastic, but someone interacting and fighting with a troop of baboons does take some planning and thought to go into it. It’s complicated business.” Bakowski adds, “He was generous in terms of letting us shoot the passes we wanted to shoot. In general, we kept to the logic that there were a couple of hero guys in the middle and a bunch of supporting baboons on the edge. We did one pass where we put all of the baboon stunt performers in there. Everyone would run around acting like baboons. After that, we pulled out the baboons that weren’t interacting with people because it was a nightmare with the amount of dust being kicked up. We did a pass with only the contact points and then a clean pass afterwards. It was a challenge to have it all come together.”
Nothing was achieved easily on Gladiator II. “This is the most challenging project that I’ve ever done given the scale and scope of it and the conditions under which we worked,” Max states. “We had the sandstorms in Morocco, and the idea of doing a naval battle in the desert had its problems. We had to keep the dust down, and the physical effects team was always out there with water hoses, and they were clever. They had water cannons to replicate physical waves coming over the bow of the ships. It was a lot of technology on an enormous scale.” The boat scenes were the most complicated. “I was probably one of the first people on the show with Arthur, and our prep period was quite short. We built the boats in the U.K., shipped them out and then assembled them out there, which was the right thing to do because it was almost impossible to get that material in Morocco or the sheer quantity of steel and timber we needed. We put them in 30 40-foot trucks going across the Atlas Mountains, and as they were arriving, we were assembling them. It was like clockwork. On the day when we were shooting, we were still painting bits. It was that close.”
The visual effects work was as vast as the imagination of Scott. “We’re doing extensions in Rome, crowds in the Colosseum, creatures and water,” Bakowski notes. “For the final battle, we were adding vast CG armies in the backgrounds of virtually every shot. We did some little pickups as well, so it’s integrating these pickups that came back to the U.K. with the stuff that was shot in Malta. It’s not groundbreaking stuff, but the volume of it is quite high because it’s one of those things that adjusts and adapts as the edit develops. The Colosseum naval battle encapsulated both what I’m looking forward to people seeing and also a big challenge. The baboons were a fun challenge, and the rhino just worked, which was fantastic. By the end, we knew how we were doing in the Colosseum, and our crowds look beautiful. I can’t wait for you to see all of it.”
TOP TWO: A character in its own right is the capital city of Rome.
BOTTOM TWO: One of the major creative challenges was developing the look of the CG water.
PAUL LAMBERT CROSSES THE ARTISTIC AND TECHNICAL DIVIDE
By TREVOR HOGG
Nowadays, Paul Lambert is at the forefront of Hollywood productions as a visual effects supervisor, with memorable accomplishments including the dystopian Los Angeles cityscapes and the lead hologram character from Blade Runner 2049, the transition to the Moon’s surface in IMAX in First Man and the realism of the worlds of the Dune franchise. Ironically, the ability to combine art and technology, which has been the key to his success, originally made him an anomaly in the British education system. Forced to choose between the two, he initially decided to earn a degree in Aeronautical Engineering at the University of London. Upon graduating, Lambert realized that engineering was not his calling, so he took a job as a courier in London and studied sculpture as a part-time art school student. Frequently, deliveries for Salon Productions led to visits to Shepperton and Pinewood Studios, and he was eventually hired by the company, which provided editing equipment to the film industry.
“At Salon, I learned how to put together and fix Steenbecks, KEMs and Moviolas,” Lambert recalls. “I even had to go over to Moscow to fix a Steenbeck being used by [Editor] Terry Rawlings for The Saint.” It was during this time that Lambert became aware of the digital transition in the film industry. “Avid and Lightworks non-linear editing systems were starting to disrupt the industry. It was this digital transition that made me more aware of something called visual effects.” The discovery was worth exploring further. “SGI had a big old building in Soho Square and were running week-long courses under the name of Silicon Studios, where you could play with Monet, Flint [the “baby” version of Flame] and Houdini. I left Salon and did this course, which was amazing.” A six-month odyssey of looking for employment came to an end when a runner at Cinesite went on a two-week vacation. “They kept me because I was so enthusiastic and hungry for knowledge. It was at a time when you could jump onto the graphics workstations, whether it be the Flames or Infernos or Cineon machines, at night in your own time. I taught myself. I was so hungry and focused. I had finally found what I wanted to do. It was a good balance of creativity and technical know-how. When I started at Cinesite, they had two Flames, and by the time I left I was the head of that department and we had seven.”
OPPOSITE TOP TO BOTTOM: Director/writer/producer Denis Villeneuve, left, and Lambert on the set of Dune: Part Two. (Image courtesy of Warner Bros. Pictures. Photo: Niko Tavernise)
Lambert celebrates winning an Oscar for Dune: Part One with his wife, Mags Sarnowska.
Fascinated by a proprietary compositing software developed by Digital Domain, Lambert had a job interview with the visual effects company founded by James Cameron, Stan Winston and Scott Ross. “I added substantial pieces of technology to Nuke because by that time I had figured out the ins and outs of compositing,” Lambert reveals. “It was an obsession of mine of how an image comes together. Digital Domain was on the verge of commercializing Nuke but didn’t have a keyer. I spent six months playing around with this idea of keying, came back to them and showed them this algorithm. It was the IBK keyer, and that’s still in Nuke.” Simplicity drove the programming process. “What I can’t stand as a compositor is when there is a node and it’s got 50,000 sliders in there. Nobody knows what those sliders do! It’s trial and error. What I tried to develop is something simple but a process where, if you can combine these things in a particular way, you can work with bluescreens and greenscreens, which are uneven, and it gets you to a good place quickly. The irony is, now I tend to try not to rely on bluescreens or greenscreens!”
Images courtesy of Paul Lambert, except where noted.
TOP: Paul Lambert, Visual Effects Supervisor. A proud accomplishment for Lambert was creating the IBK Keyer, which is still used today in Nuke to deal with bluescreen and greenscreen plates.
TOP TO BOTTOM: A portion of a seawall was constructed for Blade Runner 2049, with the action shot in a water tank in Hungary. (Image courtesy of Warner Bros. Pictures)
“Why would you put up with these crazy deadlines or having to move around the world if you didn’t truly love it? If you truly love something, you’re going to come up with creative ways of doing things and participate in some of these beautiful movies.”
—Paul Lambert, Visual Effects Supervisor
After 12 years at Digital Domain, Lambert joined DNEG’s facility in Vancouver in 2015, where he began his transition to production visual effects supervisor starting with The Huntsman: Winter’s War. The size of the visual effects budget is only part of the equation for success. “By the time we had finished First Man it was a $7 million visual effects budget, which is relatively tiny, but we came up with some incredibly creative ways to do stuff,” Lambert remarks. “We used a number of great techniques for the visuals. Doing a bigature and miniature for space work is ideal because you can control the light so that shadows are really hard. We used real 1960s footage for the launch, but we repurposed that footage with CG to make it more cinematic. Also, we utilized one of the first LED screens, but we had it up for six weeks with operators for a fraction of the cost of what it costs now.” Ninety minutes of LED screen content had to be created. “This is where my gray hair has come from! We did not take the visor off one single shot. We even got reflections in the eyes!”
Two fundamental elements have to be respected for a visual effects shot to be believable. “I’m going to try not to change the actor’s performance or the light because I know that changing the light with our current tools always looks a bit artificial,” Lambert explains. “Your eye will pick up on something which takes you out, and in our current environment people will say, ‘It’s bad CGI.’ No, it’s the fact that you’ve taken the natural balance of the original image and gone too far by changing the background to a completely different luminance or trying to add a different light on the character. You see it all the time. I’m sure you will be able to do it with generative AI soon enough where you’re relighting or regenerating the image based on some form of transformer and diffusion model, but using current tools I try to avoid it. I would rather the continuity of a background be off rather than have a composite feel wrong. If I shoot something knowing that a background is going to be a certain background in post, then I try to have that screen be of a tone of luminance that I’m going to put the background in. Hence the sand-colored backing screens on Dune: Part One and Two.” Never underestimate the significance of having a clear vision. “With Denis Villeneuve there is such a clarity of vision as to what he wants, so it’s a pleasure to work with him, and you don’t do crazy hours and overtime,” Lambert states. “There isn’t a mad rush. It’s a sensible approach to things. There are hiccups along the way, but it’s not like you have to ramp up to 1,000 people towards the end because you’re 5,000 shots short. For Dune, the concepts were the basis of what we built and photographed and what I ultimately created in visual effects.”
Blade Runner 2049 was a special project, with Lambert working on behalf of DNEG. It was special “to come into this world and see pure professionalism at work with Denis and [Director of Photography] Roger Deakins, and witness them shooting with a single camera all the time.” He is also proud of his collaboration with Cinematographer Greig Fraser on Dune: Part One and Two. “Greig uses a multitude of lenses and some were old Russian lenses. He’s totally into degrading and giving character to the image. Then, of course, I have to try to match these things! We have a good understanding of the way we work. Greig is given untold freedom in how he wants to do things, but when I need something, he listens and will adapt,” he says.
TOP: Over 90 minutes of footage had to be created for the LED screens used for First Man. (Image courtesy of Universal Pictures)
BOTTOM: A major benefit of using the LED screens for the space and aerial scenes in First Man was the ability to capture reflections on the visor and in the eyes of Ryan Gosling, which are extremely difficult to achieve in post-production. (Image courtesy of Universal Pictures)
Moviemaking is becoming more accessible to the masses. “You’ll see the cream rise to the top like you always do in whatever industry,” Lambert notes. “You will have directors who have a vision and bring that forward. I keep reading and seeing this whole idea of democratizing our industry, and it will happen. It depends on whether we put guardrails up or not to help with the transition. You’ll have different ways to visualize things. You’ll have the ability to put your VR goggles on and enjoy the movie that you just created.” Great films are built upon solid collaborations. “I’ve been lucky with my path so far in that I’ve never had a bad experience with another HOD [head of department]. In the end, I’m only successful if the photography that we have shot works and people have put their heart into it. If I get the best foundation that I can, then I can add to that and bring it to the final where the director will hopefully love it.”
TOP TO BOTTOM: Lambert in the Mojave Desert near Edwards Air Force Base for the landing of the X-15 in First Man.
Blade Runner 2049 marked the first time that Lambert collaborated with Denis Villeneuve as a facility supervisor at DNEG, and it resulted in him receiving his first Oscar. (Image courtesy of Warner Bros. Pictures)
Lambert joined Wylie Co. in 2021 as the Executive Creative Director and is currently working on Project Hail Mary with directors Phil Lord and Chris Miller as well as Cinematographer Greig Fraser. “I’m thinking on my feet on Project Hail Mary more than I’ve ever done before because of trying to keep the camerawork and everything fluid,” Lambert remarks. “That means you’re not clinically breaking up the shot into layers because what tends to happen is you lose some of the organic feel of a shot if you do this and that element. I’m a big believer in having a harder comp which will always give you a better visual.” Even with a trio of Oscars, his enthusiasm remains undiminished. “Why would you put up with these crazy deadlines or having to move around the world if you didn’t truly love it? If you truly love something, you’re going to come up with creative ways of doing things and participate in some of these beautiful movies.”
TOP: From left: Rebecca Ferguson (Lady Jessica), Director/Writer/ Producer Denis Villeneuve, Lambert and Production Designer Patrice Vermette on the set of Dune: Part Two. (Image courtesy of Warner Bros. Pictures. Photo: Niko Tavernise)
BOTTOM: First Man resulted in Lambert winning his second Oscar and his first as a production visual effects supervisor.
AI/VFX ROUNDTABLE: REVOLUTIONIZING IMAGERY – THE FUTURE OF AI AND NEWER TECH IN VFX
By JIM McCULLAUGH
The VFX industry is still in the formative stage of a revolutionary transformation, driven by rapid advances in artificial intelligence (AI) and related technologies such as VR, AR, virtual production and immersive media. As we begin 2025, AI promises to redefine both the creative and technical workflows of this dynamic field. To explore the potential impacts and necessary preparations, we convened a roundtable of leading experts from diverse corners of the global VFX industry, who bring insights from their experience, and their visions for the future, to the critical questions.
Q. VFX VOICE: How do you foresee AI transforming the creative and technical workflows of the visual effects industry by 2025, and what steps should professionals take today to prepare for these changes? Are we entering AI and Film 3.0, the phase in which filmmakers figure out workflows that chain specialized AI tools together to generate an actual project? There is still plenty of fear (era 1.0) and cautious experimentation (era 2.0), but the most forward-looking filmmakers are working out actual production processes.
A. Ed Ulbrich, Chief Content Officer & President of Production, Metaphysic
By 2025, AI will profoundly reshape the visual effects industry, enabling creators to achieve what was once deemed impossible. AI-powered tools are unlocking new levels of creativity, allowing artists to produce highly complex imagery and effects that were previously out of reach. These innovations are not only pushing the boundaries of visual storytelling but also drastically cutting costs by automating labor-intensive tasks and streamlining workflows.
TOP: Here features a de-aged Tom Hanks and Robin Wright. Their transformations were accomplished using a new generative AI-driven tool called Metaphysic Live. (Image courtesy of Metaphysic and TriStar Pictures/Sony)
BOTTOM: Blue Beetle marked the first feature film where Digital Domain used its proprietary ML Cloth tool, which captures how Blue Beetle’s rubber-like suit stretches and forms folds and wrinkles in response to Blue Beetle’s movements. (Image courtesy of Digital Domain and Warner Bros. Pictures)
Moreover, AI will accelerate production and post-production schedules, transforming the entire filmmaking process. With AI handling time-consuming tasks, teams can focus more on the creative elements, leading to faster, more dynamic productions. To stay ahead, professionals should embrace AI, continuously learning and adapting to rapid advancements, ensuring they are prepared to harness these tools to their fullest potential. AI-powered filmmaking tools are like jet fuel for creativity.
A. Lala Gavgavian, Global President & COO, Digital Domain AI tools are already making strides in automating rotoscoping, keying and motion capture cleanup, which are traditionally labor-intensive and time-consuming tasks. In 2025, these tools will be more sophisticated, making post-production processes quicker and more accurate. The time saved here can be redirected to refining the quality of the visual effects and pushing the boundaries of what’s possible in storytelling. AI has the possibility of being added to the artist’s palette, allowing expansion to experiment with different styles in a rapid prototyping way. By harnessing the power of AI, VFX professionals can unlock new levels of creativity and efficiency, leading to more immersive and personalized storytelling experiences.
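The rotoscoping and keying automation described above ultimately comes down to per-pixel foreground/background decisions. As a minimal sketch of that decision, here is a naive color-distance threshold against a known backing color; production tools use learned segmentation models, and every name and value below is illustrative rather than any vendor's API.

```python
import numpy as np

def naive_matte(frame, bg_color, tol=30.0):
    """Toy matte extraction: flag pixels far from a known backing color.

    Production auto-roto tools use learned segmentation models; this
    stand-in just thresholds color distance, which is the kind of
    per-pixel labor those tools automate.
    """
    dist = np.linalg.norm(frame.astype(float) - np.asarray(bg_color, dtype=float), axis=-1)
    return (dist > tol).astype(np.uint8)  # 1 = foreground, 0 = background

# Synthetic 4x4 frame: green backing with one red "subject" pixel.
frame = np.full((4, 4, 3), (0, 255, 0), dtype=np.uint8)
frame[1, 2] = (255, 0, 0)
matte = naive_matte(frame, bg_color=(0, 255, 0))
```

The ML version replaces the hand-tuned color rule with a model trained on example mattes, which is what lets it cope with motion blur, hair and spill that defeat a simple threshold.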
We are indeed moving into what could be considered the AI and Film 3.0 era. This phase is characterized by transitioning from fear (1.0) and cautious experimentation (2.0) to practical application.
TOP: With the help of Metaphysic AI, Eminem’s music video “Houdini” created a version of Eminem from 20 years ago. Metaphysic offers tools that allow artists to create and manage digital versions of themselves that can be manipulated. (Images courtesy of Metaphysic and Interscope Records)
Filmmakers and VFX professionals are now figuring out workflows that integrate specialized AI tools to create full-fledged projects. These tools can handle everything from pre-visualization and script breakdowns to real-time rendering and post-production enhancements. However, this transition is not without its challenges. There will be concerns about job displacement and the ethical implications of AI-generated content. To address these issues, the industry must adopt a balanced approach in which AI augments human creativity rather than replacing it. Transparent discussions about the role of AI and its ethical implications should be held, ensuring that the technology is used responsibly.
A. Brandon Fayette, Co-Founder & Chief Product Officer, Fuzzy Door Tech
By 2025, AI is poised to significantly transform both creative and technical workflows in the visual effects industry. AI’s impact is already evident in the entertainment sector, and it is set to become the standard for automating repetitive tasks such as shot creation and rendering. This automation is not limited to VFX; we can see AI’s efficiency in code generation, optimization, testing and de-noising audio, images and video. Technical workflows will become more flow-driven, utilizing AI to dynamically adapt and drive the desired creative results. This means AI will assist in creating templates for workflows and provide contextual cues that help automate and enhance various stages of the creative process. AI is advancing rapidly, with new tools and techniques emerging almost daily. To stay ahead of these changes, VFX professionals should remain aware of new trends in AI and generative content.
TOP: Fuzzy Door Tech’s ViewScreen in action from the Ted TV series. ViewScreen Studio is a visualization tool that enables real-time simulcam of visual effects while ViewScreen Scout is an app for iPhone. ViewScreen Studio visualizes and animates a complete scene, including digital assets, in real-time and for multiple cameras simultaneously. (Image courtesy of Fuzzy Door Tech)
BOTTOM: Harrison Ford transforms into Red Hulk for Captain America: Brave New World. (Image courtesy of Marvel Studios)
Continuous learning and adaptation will be crucial. However, the industry needs to establish standards and guidelines to ensure AI complements rather than compromises the artistic process. Our focus with the ViewScreen family of ProVis™ tools is on using AI to support and enhance human creativity, not replace it. By improving processes across production workflows, AI can make jobs easier while respecting and preserving the craft and expertise of entertainment professionals.
A. Nick Hayes, ZEISS Director of Cinema Sales, U.S. & Canada
This past year, we have already seen “fingerprints” left by AI on both the technical and creative sides of the film industry. Companies like Strada are building AI-enabled production and post-production toolsets to complete tasks widely considered mundane, or “that nobody wants to do.” In turn, this new technology will give VFX artists and post-production supervisors more freedom to focus on the finer details and create “out of this world” visuals never seen before. I see this resulting in a higher grade of content, more imagination and even better storytelling. Recently, Cinema Synthetica held an AI-generated film contest. The competition founders argued that generative AI empowers filmmakers to bring their stories to life at a much lower cost, and faster, than traditional filmmaking methods. Now, creatives can use software tools from companies like Adobe and OpenAI to create content from their “mind’s eye” by simply describing their vision in a few sentences. In a way, the use of AI can be inspiring, especially for filmmakers with lower budgets and less experience. In fact, in the next 12-24 months, we will see a surge of highly entertaining, imaginative content created by humans, assisted by AI.
TOP: With GPU-accelerated NVIDIA-Certified Systems combined with NVIDIA RTX Virtual Workstation (vWS) software, professionals can work with advanced graphics capabilities from anywhere, tackling workloads ranging from interactive rendering to graphics-rich design and visualization applications and game development. (Image courtesy of NVIDIA)
BOTTOM THREE: Examples of joint deformations before and after AI training shapes. (Image courtesy of SideFX)
Character poses created in Houdini and used for AI training of joints. (Image courtesy of SideFX)
Final result of posed character after AI training of joints, created and rendered in Houdini by artist Bogdan Lazar. (Image courtesy of SideFX)
“To stay ahead, professionals should embrace AI, continuously learning and adapting to rapid advancements, ensuring they are prepared to harness these tools to their fullest potential. AI-powered filmmaking tools are like jet fuel for creativity.”
—Ed Ulbrich, Chief Content Officer & President of Production, Metaphysic
“There will be concerns about job displacement and the ethical implications of AI-generated content. To address these issues, the industry must adopt a balanced approach where AI augments human creativity rather than replacing it. Transparent discussions about the role of AI and its ethical implications should be held, ensuring that the technology is used responsibly.”
—Lala Gavgavian, Global President & COO, Digital Domain
A. Neishaw Ali, Founder, President, Executive Producer, Spin VFX
AI is set to transform the VFX industry by automating repetitive tasks, enhancing creativity and enabling real-time rendering. By staying up-to-date with AI tools, collaborating across disciplines, experimenting with new technologies and focusing on creative skills, professionals can effectively prepare for and leverage these advancements to enhance their workflows and deliver more innovative and compelling visual effects.
We have been working with AI in VFX for many years; only now is it becoming available at a consumer level and poised to significantly transform both creative and technical workflows in the visual effects industry, in several key areas such as: Concept Development – Allows for visual ideation among the director, creative team and VFX to solidify a vision in hours rather than weeks. It enables real-time alignment of the creative vision through text-to-image generation, a process not unlike a Google image search but far more targeted and effective.
Automation of Repetitive Tasks – Automating repetitive, non-creative tasks such as rotoscoping and tracking will significantly reduce the time and effort these laborious processes require, allowing our artists to concentrate more on the creative aspects of the scene, which is both energizing and inspiring for them.
Face Replacement – AI is revolutionizing face replacement by enhancing accuracy and realism, increasing speed and efficiency, and improving accessibility and cost-effectiveness, allowing for high-quality face replacement for a wide range of applications. Proper authorization and clearance are necessary to ensure we ‘do no harm’ to any likeness or person.
Real-time Rendering – Though not solely AI-driven, real-time rendering is certainly changing the VFX workflow. As the quality of final renders becomes more photorealistic, and AI-enabled technologies like denoising and up-resing allow more complex scenes to scale in software like Unreal Engine, the design and iteration process will accelerate. Changes can be instantly viewed and assessed by everyone.
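The denoising mentioned here exploits the fact that render noise is roughly zero-mean: averaging enough aligned samples cancels it out. Learned denoisers approximate that clean-up from far fewer samples; the sketch below shows only the classical averaging baseline, with purely illustrative names and values.

```python
import numpy as np

def temporal_denoise(frames):
    """Average a stack of aligned frames to suppress zero-mean noise.

    Plain averaging over many aligned frames is the classical baseline
    that learned denoisers approximate from far fewer samples. This is
    illustrative only, not any renderer's actual denoiser.
    """
    stack = np.stack([np.asarray(f, dtype=float) for f in frames])
    return stack.mean(axis=0)

rng = np.random.default_rng(0)
clean = np.full((8, 8), 100.0)                                    # the "converged" render
noisy = [clean + rng.normal(0.0, 10.0, clean.shape) for _ in range(64)]
denoised = temporal_denoise(noisy)                                # noise std drops ~8x
```

Averaging 64 frames cuts the noise standard deviation by a factor of 8 (the square root of the sample count), which is exactly the cost a trained model sidesteps by denoising from one or a few samples.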
Steps for Professionals to Prepare: I believe one of the biggest challenges for some VFX artists and professionals is understanding that embracing AI does not mean sacrificing anything. Instead, it allows you to work smarter and more effectively, dedicating more time to creative tasks rather than monotonous, repetitive ones.
A. Antoine Moulineau, CEO & Creative Director, Light Visual Effects
AI feels like the beginning of CGI 30 years ago, when a new software tool was out every week. There is a lot of different tech available, and it’s very hard to focus on one thing or invest in specific workflows. At LIGHT, we are focusing on better training artists with Nuke’s CopyCat and new tools such as ComfyUI. Up-res and frame interpolation are already huge time-savers in producing high-res renders and textures. Tools like Midjourney and FLUX have already massively disrupted concept art and art direction; they now play a major part in the workflow. 2025 will be about animating concepts with tools such as Runway Gen-3, and possibly postvis, if the tools mature enough to offer the control required.
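The frame interpolation cited as a time-saver synthesizes in-between frames from existing ones. The naive version is a per-pixel cross-dissolve, sketched below with illustrative names; real tools instead warp along estimated motion vectors, which is what avoids ghosting on moving objects.

```python
import numpy as np

def blend_interpolate(frame_a, frame_b, t=0.5):
    """Naive in-between frame: per-pixel linear blend at time t.

    Real frame-interpolation tools warp along estimated motion vectors;
    a plain cross-dissolve like this ghosts on moving objects, which is
    exactly the gap the ML-based tools close. Illustrative only.
    """
    a = frame_a.astype(float)
    b = frame_b.astype(float)
    return ((1.0 - t) * a + t * b).astype(frame_a.dtype)

frame_a = np.zeros((4, 4), dtype=np.uint8)       # dark frame
frame_b = np.full((4, 4), 200, dtype=np.uint8)   # bright frame
mid = blend_interpolate(frame_a, frame_b)        # halfway frame
```

Doubling a frame rate this way needs one synthesized frame per original pair; the learned versions add motion estimation on top of the same blend-at-time-t idea.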
A major blocker for final use remains controlling the AI, along with the tools’ lack of consistency. As noted earlier, there is so much happening now that it is hard to keep up, or to rely on the tools being stable enough to integrate into a pipeline.
I don’t know if it will be for 2025, but I can see AI disrupting the CGI pipelines in the very short term; generative AI could replace traditional rendering in many scenarios and reduce the need for texturing or high-resolution modeling in the near future, specifically for wide environments. Lip-sync is also a process where AI is really shining and will disrupt traditional workflows in 2025.
We will start seeing directors prepare an AI version of their films, an edit of animated concepts with music, during the pitching/concept phase, especially for advertising. This is such a helpful process for understanding and communicating their vision. It’s kind of a Moodboard 3.0, and I can certainly imagine this process becoming the norm very quickly. For very short-form social content, it will probably replace traditional workflows entirely. That being said, I think long-form remains an art form where actors and performance are central, and I don’t see AI taking over anytime soon. It is hard for me to see the point of that. We need real people to identify with so we can connect to the content. Art is about the vision; it captures society and the world as they are at the time it is made. In other words, AI remains a gigantic database of the past, but we still need the human creative process to make new art. A good example: AI wouldn’t be able to generate a cartoon version of a character if someone hadn’t invented “cartoon” previously. It will accelerate processes, for sure, but not replace them.
A. Christian Nielsen, Creative Director, The Mill
Predicting the future is challenging, especially given AI’s rapid advancement. However, I anticipate an increasing integration of AI tools into the VFX pipeline. We’re already seeing this to some degree with AI-powered rotoscoping and paint tools, which address some of the most common repetitive tasks in VFX. Additionally, inpainting and outpainting techniques are emerging as powerful tools for removing elements from shots and creating set extensions. ComfyUI has already become an integral part of many AI pipelines, and I foresee its integration expanding across most VFX studios.
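The inpainting mentioned above fills the hole left when an element is removed from a shot. A crude, non-learned baseline is to diffuse the surrounding colors into the hole, as sketched below with illustrative names and sizes; learned inpainting models pursue the same goal but can hallucinate texture and semantics, not just smooth color.

```python
import numpy as np

def diffuse_inpaint(img, mask, iters=200):
    """Fill masked pixels by repeatedly averaging their four neighbors.

    A crude stand-in for learned inpainting: same goal (a plausible
    fill where an element was removed), but diffusion only propagates
    smooth color. np.roll wraps at the image border, which is
    acceptable here because the hole is interior.
    """
    out = np.asarray(img, dtype=float).copy()
    hole = np.asarray(mask, dtype=bool)
    for _ in range(iters):
        up    = np.roll(out,  1, axis=0)
        down  = np.roll(out, -1, axis=0)
        left  = np.roll(out,  1, axis=1)
        right = np.roll(out, -1, axis=1)
        out[hole] = ((up + down + left + right) / 4.0)[hole]
    return out

img = np.full((6, 6), 50.0)          # flat plate
img[2:4, 2:4] = 0.0                  # removed element leaves a dark hole
mask = np.zeros((6, 6), dtype=bool)
mask[2:4, 2:4] = True                # region to fill
filled = diffuse_inpaint(img, mask)  # hole converges to the surround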
I strongly recommend that everyone in the VFX industry familiarize themselves with AI to better understand its capabilities and implications. The integration of AI into VFX is both inevitable and unstoppable.
There’s still progress to be made before full text-to-video tools like Runwayml Gen-3 or Sora can be used to create complete AI commercials or movies. The main challenge is the lack of precise control with AI. If a director dislikes a specific element in a shot or wants to make changes, there’s currently no way to control that. As a result, AI tools are generally not very director-friendly. At present, these tools work best for ideation and concept
“AI is advancing rapidly, with new tools and techniques emerging almost daily. To stay ahead of these changes, VFX professionals should remain aware of new trends in AI and generative content. Continuous learning and adaptation will be crucial. However, the industry needs to establish standards and guidelines to ensure AI complements rather than compromises the artistic process.”
—Brandon Fayette, Co-Founder
&
Chief Product Officer,
Fuzzy Door Tech
“In a way, the use of AI can be inspiring, especially for filmmakers with lower budgets and less experience. In fact, in the next 12-24 months, we will see a surge of highly entertaining, imaginative content created by humans, assisted by AI.”
—Nick Hayes, ZEISS Director of Cinema Sales, U.S. & Canada
“[O]ne of the biggest challenges for some VFX artists and professionals is understanding that embracing AI does not mean sacrificing anything. Instead, it allows you to work smarter and more effectively, dedicating more time to creative tasks rather than monotonous, repetitive ones.”
can see AI disrupting the CGI pipelines in the very short term; generative AI could replace traditional rendering in many scenarios and reduce the need for texturing or high-resolution modeling in the near future, specifically for wide environments. Lip-sync is also a process where AI is really shining and will disrupt traditional workflows in 2025.”
—Antoine Moulineau, CEO & Creative Director, Light Visual Effects
development, like how we use Midjourney or Stable Diffusion for still concepts. Initially, AI could be used for creating stock elements, but I’m confident that OpenAI and others are working on giving users more control.
Over the past 12 months, we’ve used AI for several commercials and experiences, learning as we go. This technology is so new in the VFX industry that there’s little experience to draw from, which can lead to some long workdays.
A.
Mark Finch, Chief Technology Officer, Vicon
The industry is going through considerable change as audience preferences and consumer habits have evolved significantly in recent years. More people are staying in than going out, tentpole IPs are reporting decreased excitement and financial returns, and we’ve seen a period of continuous layoffs. As a result, there’s a lot of caution and anticipation as to what’s next.
In a transitional period like this, people are looking at the industry around them with a degree of trepidation, but I think there’s also a significant amount of opportunity waiting to be exploited. Consumer hunger for new worlds and stories powered by VFX and new technologies is there, along with plenty of companies wanting to meet that demand.
For the immediate future, I predict we’re going to see a spike in experimentation as people search for the most effective ways of utilizing these technologies to serve an audience whose appetite knows no bounds. Vicon is fueling that experimentation with our work in ML/AI, for example, which is the foundation of our markerless technology. Our markerless solution is lowering the barriers to entry to motion capture, paving the way for new non-technical experts to leverage motion capture in their industries.
An example we’ve come to recognize is giving animators direct access to motion capture who historically would have only had access to it through mocap professionals on the performance capture stage, which is expensive and in high demand. This unfettered access reduces the creativity iteration loop, which ultimately leads to a faster final product that is representative of their creative dream.
There’s a lot of excitement and noise surrounding the rapid growth of AI and ML-powered tech. It’s impossible to look anywhere without seeing tools that encourage new workflows or provide enhancements to existing ones. A consequence of this is that you can fall into the mindset of, “This is the way everything is going to be done, so I need to know about it all.” When technology is moving so fast, you risk spreading yourself thin across a wealth of tools that are still finding their feet and may themselves be redundant, replaced or improved beyond recognition in the future.
The best preparation comes from understanding the problem before the solution, in other words, identifying the obstacle you need to overcome first. You get this by focusing on people –speaking to them about their challenges, researching those that exist across their industry in general, and gaining an understanding of why a certain tool, workflow or enhancement might exist.
A. Paul Salvini, Global CTO, DNEG
AI, especially machine learning, is poised to significantly impact the visual effects industry, transforming both creative and technical workflows. At DNEG, we are investing in the development of new AI-enabled tools and workflows to empower artists and enhance the creative process. For us, storytelling remains paramount – so our use of AI is directed towards activities that provide better feedback for artists and deeper creative control.
In terms of artist-facing tools, some of the areas likely to see early adoption of AI and ML techniques throughout 2025 include: Improving rendering performance (providing faster artist feedback); automating repetitive tasks; procedurally creating content; generating variations; and processing, manipulating and generating 2D images.
AI techniques and tools are being increasingly used to generate ideas, explore creative alternatives and build early stand-ins for various locations, characters and props. As with all new tools, professionals can prepare by learning the basics of AI, and seeing how these tools are already being explored, developed and deployed in existing industry-standard packages.
Some AI and ML tools work invisibly, while others require direct user involvement. An abundance of publicly available and user-friendly websites has emerged, allowing artists – and the general public – to experiment with various ML models to better understand their current capabilities and limitations.
These new tools, while impressive, further emphasize the importance of human creativity, communication and collaboration. Our collective job of developing and bringing great stories to life remains unchanged. However, as our tools improve, we can dedicate more time to creative endeavors and less on mundane tasks. This is truly a better way to create better content.
A. Christopher Nichols, Director, Chaos Labs
Machine learning has been transforming the industry for years, so it’s nothing new to VFX artists. Especially when it comes to digital humans, rotoscoping, fluid sims and analyzing data/camera tracking information. AI will continue to take on a bigger piece of the workflow and replace a lot of traditional VFX techniques in time. The industry will just continue to adapt.
Creating high-level content is going to become much more accessible, though. Soon, independent filmmakers will create shots that would have been the sole domain of high-end VFX houses. This will free the latter to experiment with more ambitious work. Currently, Chaos is trying to help artists get to LED screens faster via Project Arena and NVIDIA AI technology; you’ll likely see AI solutions become commonplace in the years ahead. You’ll also probably see fewer artists per project and more projects in general, too, as AI makes things more affordable. So instead of 10 movies a year with 1,000 VFX artists on each movie, it’ll be more like 1,000 films with 100 names per project.
The elephant in the room is generative AI. However, the big movie studios are reluctant to use it due to copyright issues. Right now, the matter of where the data is coming from is being worked out through the court system, and those decisions will influence what happens next. That said, I don’t think an artist will be
“I
strongly recommend that everyone in the VFX industry familiarize themselves with AI to better understand its capabilities and implications. The integration of AI into VFX is both inevitable and unstoppable.”
—Christian Nielsen, Creative Director, The Mill
“A
consequence of [the rapid growth of AI] is that you can fall into the mindset of, “This is the way everything is going to be done, so I need to know about it all.” When technology is moving so fast, you risk spreading yourself thin across a wealth of tools that are still finding their feet and may themselves be redundant, replaced or improved beyond recognition in the future.”
—Mark Finch, Chief Technology Officer,
Vicon
“Our collective job of developing and bringing great stories to life remains unchanged. However, as our tools improve, we can dedicate more time to creative endeavors and less on mundane tasks. This is truly a better way to create better content.”
—Paul Salvini, Global CTO, DNEG
“[Y]ou’ll likely see AI solutions become commonplace in the years ahead. You’ll also probably see fewer artists per project and more projects in general, too, as AI makes things more affordable. So instead of 10 movies a year with 1,000 VFX artists on each movie, it’ll be more like 1,000 films with 100 names per project.”
—Christopher Nichols, Director, Chaos Labs
replaced by a prompt engineer anytime soon. The best work you see coming out of the generative AI world is being done by artists who add it to their toolsets. You still must know what to feed these tools and artists know that better than anyone.
A. Greg Anderson, COO, Scanline VFX and Eyeline Studios
In 2025, AI tools and technology are poised to significantly transform how visual effects are created, from automating the most mundane of tasks to expanding the possibilities of the most complex visual effects sequences. Several compositing packages already incorporate AI-based features that greatly improve rotoscoping, tracking, cleanup speed and quality. These features will continue to improve in 2025, allowing artists to spend more time on the final quality of shot production. The ongoing and fast-moving development of generative AI tools and features will change the process, efficiency and quality of everything from digital environments to effects and character animation.
From a technical and production workflow standpoint, AI will continue to optimize render processes, allowing for more iterations and leading to more convincing imagery that is faster and cost-effective. New tools will assist VFX teams in organizing, managing and accessing vast libraries of digital assets, making it easier for artists to find and reuse elements across different projects. Data-driven insights will also allow AI tools to predict which assets might be needed based on project requirements.
Overall, AI technology is poised to revolutionize the VFX industry next year and beyond, as we’ve only yet to scratch the surface of what will be possible. In preparation, anyone working in the VFX industry should lean heavily toward curiosity, continuous learning and skill development. Time spent experimenting with AI tools and technologies in current workflows will heighten the understanding of AI’s capabilities and limitations. Additionally, while AI can enhance many technical aspects, creativity remains a human domain. Artists should focus on developing artistic vision, storytelling skills and creative problem-solving abilities.
A. David Lebensfeld, President and VFX Supervisor, Ingenuity Studios and Ghost VFX
In 2025, we will see a continuation of idea genesis happening by leveraging generative AI tools. We will also find that our clients use generative AI tools to communicate their ideas by leveraging easy-to-use tools they have never had before. The sacrifice being controllability, but the benefit is ease of communication.
Most of our studio clients have a real sensitivity to how AI is being used on their projects, and they want it to be additive to the projects versus a threat to the ecosystem. In the short term, generative AI will be used more as a tool for communication than it is for execution. We’ll continue to see AI-based tools in our existing software packages, giving both in-house and vendor tool developers and software developers room to expand their offerings. While AI advancements will continue to improve existing toolsets, they won’t replace team members at scale, especially in the high-end part of the market. Looking ahead, I think the best professionals in our industry are already dialed in to developing toolsets and new technologies. It’s always been the case that you have to be agile and stay aware of continual software and hardware developments. VFX is the
intersection of technology and art; you must know and constantly improve both to stay competitive. Also on a professional level, I don’t think we’ll see meaningful changes in 2025 to how VFX final pixels get made at the studio side, for a multitude of reasons, two being a lack of granular control and sour optics.
How people are talking about AI can often feel like a marketing trick. Everyone is using the same basic technology layer, and that always gets better as all boats rise. Like anything else, the people who know and leverage advanced technology the best and the most creatively will continue to win.
A. Mathieu Raynault, Founder, Raynault VFX
When I first thought about how AI might affect the visual effects industry, I felt both skeptical and anxious. But since I started in computer graphics in 1996, I haven’t seen anything with this much potential for exciting transformation.
At Raynault VFX, AI is set to significantly boost our efficiency by automating routine tasks and letting our team focus more on the creative parts of our projects. We’re a small team of 55 and creativity is at the heart of what we do. We’ve started using AI to increase our productivity without sacrificing our artistic integrity. With seven full-time developers, we’re heavily invested in research and development, including AI, to improve our workflows.
Looking ahead, I see AI enhancing our current tools, helping us keep control over the creative process and refine our work with client feedback. This blend of AI and human creativity is crucial because filmmakers will still rely on creative teams to bring their visions to life. Although there’s some worry about AI’s ability to create entire films or TV shows on its own, I think these tools won’t replace human-driven filmmaking anytime soon.
AI will certainly transform our workflows and could lead to shifts in employment within our industry. VFX artists will become more productive, able to deliver more work in less time, which might lead to a reduction in job numbers compared to pre-strike highs. For VFX professionals, integrating AI into their workflows is essential, yet it’s crucial to preserve and enhance our existing skills. In the field of concept art, for example, AI can assist in drafting initial designs, but the intricate process of refining these concepts to align with a director’s vision will still require human expertise. Artists who can both direct AI and iterate while creating concept art themselves will be invaluable.
In summary, I’m quite optimistic. As we move toward 2025, adopting AI requires us to change our skills and approaches to stay competitive and innovative. As a business owner in the VFX industry, it’s incredibly motivating!
A. Viktor Müller, CEO, Universal Production Partners (UPP)
To some extent, AI has already begun to transform the industry. We see demonstrations of its growing capabilities almost on a weekly basis, and there seems to be a lot of fear around that. Honestly, I’m not worried about it at all. I could sense it coming long before it started turning up in the media, which is why UPP has been quietly building out our VP and AI departments for the last six years.
I know some people look at AI and its use as being somehow catastrophic for our business but, at the end of the day, I think it’ll be just another tool in our arsenal and, used wisely, a great one. The faster artists and companies embrace it and learn to use it in their workflows, the better, and we’re already seeing that adaptation now.

“AI technology is poised to revolutionize the VFX industry next year and beyond, as we’ve only begun to scratch the surface of what will be possible. In preparation, anyone working in the VFX industry should lean heavily toward curiosity, continuous learning and skill development.”
—Greg Anderson, COO, Scanline VFX and Eyeline Studios

“How people are talking about AI can often feel like a marketing trick. Everyone is using the same basic technology layer, and that always gets better as all boats rise. Like anything else, the people who know and leverage advanced technology the best and the most creatively will continue to win.”
—David Lebensfeld, President and VFX Supervisor, Ingenuity Studios and Ghost VFX

“When I first thought about how AI might affect the visual effects industry, I felt both skeptical and anxious. But since I started in computer graphics in 1996, I haven’t seen anything with this much potential for exciting transformation.”
—Mathieu Raynault, Founder, Raynault VFX
A. Kim Davidson, President & CEO, SideFX
Over the past year, we have seen several advancements in AI in the visual effects industry and we expect this to continue in 2025. So far, the advancements have been more evolutionary than revolutionary. AI is not replacing creatives or the production pipeline but is greatly speeding up many of the more mundane tasks while not fully eliminating them – yet. Tracking and rotoscoping are key examples of tasks that have been improved and sped up. We predict that 2025 will see more AI-based tools being used throughout the pipeline, with improved AI implementations and some brand-new tools. These AI-enhanced workflows will include concept design, asset (model and texture) creation, motion stabilization, improved character animation and deformation (e.g. clothing, hair, skin), matching real-world lights, style transfer, temporal denoising and compositing.
Of course, there will be improvements (and more releases) of prompt-based generative video applications. But – for a variety of reasons – we don’t see this as the best workflow for creative professionals, certainly not the be-all and end-all for art-directed content creators. We believe in providing artists with AI/ML-enhanced toolsets to bring their creative visions to life more quickly and efficiently, allowing for more iterations that should lead to higher quality. We are at an exciting stage in the confluence of powerful hardware and AI-enhanced software – where creative talent will be more important than ever and able to harness creative platforms to tell stories in truly extraordinary new ways.
A. Dade Orgeron, Vice President of Innovation, Shutterstock
2025 is here, but with generative AI technology moving so quickly, I think we can expect to see AI continue to transform the visual effects industry, particularly through advancements in generative video and 3D tools. As AI models continue to improve, we can expect notable enhancements in temporal consistency and reduced distortion, along with compositing tools to help seamlessly integrate AI-generated content into live-action footage or easily remove/replace unwanted people or objects. In the next wave of generative video models, complex mechanical devices and other intricate details will be represented with unprecedented precision, and advanced dynamics and fluid simulations will start to become achievable with generative video rather than traditional, time-consuming simulation engines. Will it be perfect? Maybe not in the next six months, but perhaps within the next year.
To prepare for these advancements, VFX professionals should invest in upskilling themselves in AI and machine learning technologies. Understanding the capabilities, and particularly the limitations of AI-driven tools, will be essential. They should experiment with generative image and video technologies as well as 3D tools that leverage AI to streamline their workflows and enhance their creative skills. That’s something at Shutterstock that we are actively enabling through partnerships with NVIDIA and Databricks. For instance, we’ve developed our own GenAI models to accelerate authentic creative output, all with ethically sourced data. Early adoption and a shift towards embracing new technologies and methodologies will enable artists and technicians to remain competitive and innovative in these rapidly evolving times.
A. Gary Mundell, CEO, Tippett Studio
The big question is: What will AI mean to us in 2025? As we move through the Gartner Hype Cycle, AI seems to be transitioning from the Trough of Disillusionment into the Slope of Enlightenment, much like the early days of the dot-com era. AI is poised to bring a suite of tools that handle obvious tasks – roto, matchmove, res-up, FX – but that’s just the tip of the iceberg. Any task described by a massive database can use AI. If you can articulate your prompts, and there’s a database to train the answers, you’re set. Forget influencers – soon, “prompters” will drive production with AI-generated insights.
By 2025, AI will fundamentally change VFX production. Imagine a system capable of generating an entire schedule and budget through prompts. AI could create a VFX schedule for a 1,200-shot project, complete with budgets, storyboards, 3D layouts and animatic blocking, all tailored to a director’s style and the level of complexity. However, where today’s AI falls short is in the temporal dimension – it struggles with believable, complex animation. Current engines tend to produce flowy, slow visuals lacking continuity, and while many tools claim to address this, it will take time before AI excels at high-quality animation.
At Tippett Studio, we leverage AI for previsualization, conceptualization and project management. Using TACTIC Resource, we integrate AI into planning and resource management, handling vast production data to predict outcomes and streamline workflows. As we move into 2025 and beyond, AI’s data management capabilities will be key to future productivity and financial success, even as we await more advanced animation tools. As AI continues along the Slope of Enlightenment and towards the Plateau of Productivity, its role in VFX production will become increasingly significant.
“We are at an exciting stage in the confluence of powerful hardware and AI-enhanced software – where creative talent will be more important than ever and able to harness creative platforms to tell stories in truly extraordinary new ways.”
—Kim Davidson, President & CEO, SideFX
“Early adoption and a shift towards embracing new technologies and methodologies will enable artists and technicians to remain competitive and innovative in these rapidly evolving times.”
—Dade Orgeron, Vice President of Innovation, Shutterstock
“[W]here today’s AI falls short is in the temporal dimension – it struggles with believable, complex animation. Current engines tend to produce flowy, slow visuals lacking continuity, and while many tools claim to address this, it will take time before AI excels at high-quality animation.”
—Gary Mundell, CEO, Tippett Studio
STREAMING AND VFX: CULTIVATING THE ABILITY TO ADAPT TO CONSTANT CHANGE
By CHRIS McGOWAN
Despite the lingering effects of 2023’s writers’ and actors’ strikes, the streamers continue to disrupt the industry. “Streaming has increased the demand for VFX work and accelerated the growth of all parts of the production and post-production industries,” says Tom Williams, Managing Director of DNEG Episodic.
Among the leading streamers, Netflix had 277.65 million paid subscribers worldwide as of the second quarter of 2024, according to Statista research, an increase of over eight million subscribers compared with the previous quarter, and Netflix’s expenditures on content were expected to stabilize at roughly 17 billion U.S. dollars by 2024. Also, by 2024, the number of Amazon Prime members in the United States was projected to reach more than 180 million users. In Q2 2024, the number of Disney+ subscribers stood at around 153.6 million, according to Statista, while the combined number of subscribers to Warner Bros. Discovery’s Max (formerly HBO) and Discovery+ services surpassed 103 million. Apple TV+, Hulu, Paramount+ and Peacock are among the others with significant viewers.
TOP: Shōgun (Image courtesy of FX Network)
OPPOSITE TOP TO BOTTOM: The Last of Us (Image courtesy of HBO)
3 Body Problem (Image courtesy of Netflix) House of the Dragon (Image courtesy of HBO)
Such subscriber numbers have bankrolled a lot of visual effects and animation. “Streaming has been a game-changer for the VFX industry. It has significantly increased demand. With platforms constantly producing new content, visual effects studios have more opportunities than ever before,” comments Valérie Clément, VFX Producer at Raynault VFX Visual Effects & Environments. “The rise of streaming has also shifted the focus from traditional films to high-budget series, which has diversified the types of projects we work on at Raynault.” Jennie Zeiher, President of Rising Sun Pictures (RSP), remarks, “The advent of streaming had a huge impact that we’re still feeling today, not only for global consumers,
but studios, production companies, TV channels, post houses, VFX studios; the entire industry was impacted. [It was a major disruption in the industry] that changed how content was consumed.”
BUDGETS & MODELS
Streaming changed the way the industry was divided up and took away market share from broadcast and theatrical, according to Zeiher. She explains, “In 2017, RSP’s work was still wholly theatrical. We predicted that over the course of that year, we would be progressively taking on more streaming projects and that the year following, our work would be distributed 50/50. This indeed played out, and it tells the story of how a disruptive change can affect a business model. Fast forward to today, the industry is more complex than ever, made more so by the fact that streaming opened up distribution to a global, multi-generational audience, which is more diverse than ever.”
“Everyone is more budget-conscious at the moment, which is not a bad thing for VFX as it encourages more planning and the use of previs and postvis, which helps everyone deliver the best possible end product,” Williams says. “We are a technology-driven industry that is always moving forward, combined with incredible artists, so I think we will always see improvements in quality.” Zeiher adds, “I think studios are still trying to settle on their model. There are fewer big hits due to diversity in taste, and there are more risks around greenlighting productions at a higher price point. What made a hit five or 10 years ago isn’t the same as it is today. There is more diverse product in the pipeline to attract more diverse audiences. The streamers are producing high-end series, but they are more concentrated to a handful of studios.”
SHARING WORK
“Productions normally split work between multiple vendors,” Zeiher notes. “This work can be sensitive to timing and schedule changes. Therefore, VFX vendors need to have a plan on how they manage and mitigate any changes in schedule or type of work. Besides capability and the quality of the creative, this is the biggest singular challenge for VFX vendors and is the secret to a successful studio!” Zeiher adds, “Studios have always split work between multiple vendors, and only in limited scenarios kept whole shows with single vendors, and this continues to be the trend. The studios are splitting work among their trusted vendors who have the capability in terms of crew and pipeline to hit schedules and manage risks.”
“The increase in work has meant that more shows than ever before are being shared between different VFX houses, so that will add to the cooperation. Being a relatively young industry, it doesn’t take long to find a mutual connection – or 10 – when you meet someone else from VFX at an event,” Williams says. Comments Wayne Stables, Wētā FX’s VFX Supervisor on House of the Dragon Season 2, “I’m not sure that I’ve seen a big change [in business and production models]. We bring the same level of creativity and quality to everything we do, be it for feature film or streaming, and use the same tools and processes. I approach it the same way as I would working on a film. I think episodic television has always pushed boundaries. I remember when Babylon 5 came out [and] being amazed at what they were doing, and then seeing that ripple through to other work such as Star Trek: Deep Space Nine.”
HIGHER EPISODIC QUALITY
Working with the VFX studios, the streamers have set the visual effects bar high by bringing feature film quality to episodic television. “Game of Thrones comes to mind despite starting before the streaming boom. It revolutionized what viewers could expect from a series in terms of production value and storytelling. Later seasons had blockbuster-level budgets and cinematic visuals that rivaled anything you’d see in theaters,” Clément says. “Netflix has also made significant strides with shows like Stranger Things, which combines appealing aesthetics and compelling storytelling, and The Crown, known for its luxurious production design and attention to detail. Also, series like Westworld and Chernobyl both deliver sophisticated narratives with stunning visuals that feel more like feature films than traditional TV. These are just a few examples, of course. The range of projects that have made a significant impact in the streaming world is vast.”

TOP TO BOTTOM: Foundation (Image courtesy of AppleTV+)

The Lord of the Rings: The Rings of Power (Image courtesy of Prime Video)

The Boys (Image courtesy of Prime Video. Photo: Jan Thijs)

TOP TO BOTTOM: Fallout (Image courtesy of Prime Video)

In Your Dreams. Coming in 2025. (Image courtesy of Netflix)

The Wheel of Time (Image courtesy of Prime Video)
Zeiher also points to streaming titles The Rings of Power, Avatar: The Last Airbender, Shōgun, Monarch: Legacy of Monsters, Loki Season 2, Fallout [and] the Star Wars universe, with recent series such as Andor, Ahsoka and The Acolyte, as having brought feature-film quality to episodic. Stables comments, “As the techniques used on big visual effects films have become more common, we have seen more high-end work appear everywhere. Looking at work in Game of Thrones and then, more recently, Foundation and through to shows like Shōgun. And, of course, I am proud of our recent work on House of the Dragon Season 2, Ripley and The Last of Us.”
EXPECTATIONS
“The expectation of quality never changes – showrunners, writers and directors can spend years getting their visions greenlit – no one is looking to cut corners. We all want to do our best work, regardless of the end platform,” Williams says. Regarding the delivery dates for series episodes, Stables comments, “I haven’t ever found the timeframes to be short. The shows tend to be very structured with the fact that you have to deliver for each episode, but that just brings about a practicality as to what is important. As with everything, the key is good planning and working with the studio to work out the best solution to problems.” Clément says,
“While the compressed timelines can be challenging, the push for high-quality content from streaming platforms means that we are constantly striving to deliver top-notch visuals, even within tighter schedules. This is always exciting for our team.”
CHANGES IN THE STREAMER/VFX RELATIONSHIP
“I think that showrunners and studios are seeing that it is now possible to create shows that perhaps in the past were not financially feasible. So, we are developing the same relationships [with the streamers] that we have had with the film studios, seeing what we can offer them to help tell their stories,” Stables states. “Relationships can be reciprocal, or they can be transactional,” Zeiher observes. “In VFX, we very much operate in a reciprocal relationship with the studios and their production teams; it’s a partnership at every level. Our success is based on their success and theirs on ours.”
GLOBAL COOPERATION
Streaming is enhancing global cooperation among VFX studios by creating a greater need for diverse talent and resources. Clément says, “As streaming platforms produce more content, studios around the world are teaming up to manage the growing amount and complexity of VFX work. Advances in remote work technology and cloud tools make it easier for teams from different regions to collaborate smoothly and effectively.” Zeiher explains, “RSP’s work on Knuckles is a great example of global, inter-company collaboration. Instead of using a single vendor, the work was split between several, mostly mid-size, vendors. The assets were built to a specification and shared using Universal Scene Description, allowing asset updates to be rolled out simultaneously across vendors and providing a consistent look across the characters. Paramount’s approach to Knuckles was very smart and could be indicative of future workflows.”

TOP: Sakamoto Days. Coming in 2025. (Image courtesy of Netflix)

BOTTOM: A Knight of the Seven Kingdoms: The Hedge Knight. Coming in 2025. (Image courtesy of HBO)
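For readers curious what the multi-vendor sharing Zeiher describes looks like in practice, here is a minimal, hypothetical Universal Scene Description (`.usda`) layer sketching the idea: a vendor’s shot file references a shared, spec-built asset, and local overrides sit on top of it. The file names and prim paths are invented for illustration, not taken from the Knuckles production.

```usda
#usda 1.0
(
    defaultPrim = "Shot010"
)

def Xform "Shot010"
{
    # Reference the shared character asset (hypothetical path).
    # Republishing knuckles_asset.usd rolls the update out to every
    # vendor's shots simultaneously, keeping the look consistent.
    def Xform "Knuckles" (
        prepend references = @./shared/knuckles_asset.usd@</Knuckles>
    )
    {
        # Vendor-local override: opinions authored in this layer are
        # stronger than the referenced asset, so per-shot tweaks never
        # touch the shared file.
        double3 xformOp:translate = (0, 0, 12.5)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }
}
```

Because USD composes the shared reference underneath each vendor’s local opinions, an asset update and a shot-specific tweak can coexist without either side overwriting the other.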
“VFX is a tumultuous industry and, off the back of the WGA and SAG-AFTRA strikes, we’ve entered a time of consolidation,” says Zeiher. “Studios, often backed by private equity, are acquiring small to mid-size studios. This is helping them to distribute work globally across many jurisdictions. Dream Machine is an example of this new collaborative model with its recent acquisition of Important Looking Pirates and Cumulus VFX, joining Zero, Mavericks and Fin Design. Likewise, RSP has its sister studios FuseFX, FOLKS and El Ranchito under its parent company Pitch Black; it’s a new form of global collaboration, mid-size studios, with different offerings across brands and locations who can collaborate under one banner.”
“I think that the streaming distribution model was the first disruption, and that distribution continues to evolve,” Zeiher comments. “The production model may now be disrupted through the use of GAI. Combining the distribution evolution, audience consumer changes and using GAI in production, we’re in for lots more changes in the year(s) to come.” Clément states, “As streaming platforms experiment with new content formats and distribution methods, VFX studios will adapt to different types of media and storytelling approaches.”
TOP: Knuckles (Image courtesy of Paramount+ and Nickelodeon Network)
BOTTOM: The Witcher: Sirens of the Deep. Coming in 2025. (Image courtesy of Netflix)
WICKED IS A COLLAGE OF DIFFERENT LENSES AND TALENTS
By TREVOR HOGG
In recent years, exploring the backstories of iconic villains has become more in vogue with the release of Maleficent, Joker and now Wicked, a Universal Pictures production that brings to the big screen the Broadway musical adapted from Gregory Maguire’s novel Wicked: The Life and Times of the Wicked Witch of the West. No stranger to musicals, filmmaker Jon M. Chu has been making them ever since he was a USC film school student, but this time around, the scale is a throwback to Hollywood classics such as The Wizard of Oz, with the added benefit of a visual effects industry that didn’t exist back then.
“There is this grandiose nature to Wicked, but from the beginning, we always wanted it to feel touchable and immersive,” director Jon M. Chu explains. “We wanted to break the matte painting of Oz that we have in our mind. What happens if you could live in it? What happens if you can touch the dirt and textures? Visual effects are extremely powerful to be able to do that. Of course, we worked hand in hand with building as well by planting nine million tulips and having a real train and Wizard’s head, but we’re all in it together.”
Massive sets were built. “I firmly believe you’ve got to build as much set as you possibly physically can, or do as much for real as you possibly physically can, because the real photography on that set informs visual effects on how everything should look,” states Production Designer Nathan Crowley. “That is fundamental. You can’t just put a bluescreen up because you’re going to get enough of that anyway. You’ve got to try to balance it.” The act of physical construction is extremely informative. Crowley says, “The thing is, if you do a concept and don’t build it, then you miss out on the art direction of making it.” Doing concept art in 3D was imperative. “We will build, paint and finish a 3D model and will deliver it rendered to Pablo Helman [Visual Effects Supervisor]. Pablo has to rebuild it because visual effects have to go into a lot more specific areas, but at least he knows what it should look like. We also go scout places, and even if we don’t film that place, we’ll say to Pablo and Framestore, which does a lot of the environments, ‘That’s what we need it to look like. We need to go to the south coast down to Poole and Bournemouth and get that set of cliffs, and that becomes Shiz.’ Emerald City is a hard one because you’re going much higher [digitally]. I would try to build enough below 50 feet so he would have textures.”

Images courtesy of Universal Studios.

TOP: Elphaba (Cynthia Erivo) and Glinda (Ariana Grande) get a private tour of the Royal Palace of Oz.
Unreal Engine influenced the shot design. “Emerald City was the last set that was built and was behind schedule,” states Cinematographer Alice Brooks. “We had this idea for when Elphaba and Glinda get off of the train, and we start to push down the stairs, and it all becomes this one long Steadicam shot that ends on a crane that lifts up. We had been working on this for months but couldn’t get into the set to start rehearsals because all of the construction cranes and painters were in there. What we did do was take the game controller, go into Unreal Engine and start designing the shot. When walking the set in Unreal Engine, we realized that this big stepping-onto-crane move didn’t show off the city in any spectacular way; that being low was the way you saw the city in this amazing way. Then, we threw out our amazing Steadicam idea, which our ‘A’ camera operator was bummed out about, and we created something new in Unreal Engine that was perfect.”

TOP: Jonathan Bailey as Prince Fiyero performs in front of the rotating university library set, which was constructed by the special effects team led by Paul Corbould.

BOTTOM: Cinematographer Alice Brooks stands behind director Jon M. Chu as he discusses a shot she captured with custom-made lenses by Panavision.

TOP: Cynthia Erivo decided to go with practical makeup for the green skin of Elphaba, which was then finessed digitally in post-production.

BOTTOM: Special Effects Supervisor Paul Corbould built The Emerald Express, which was designed by Production Designer Nathan Crowley to be like a personal motorized carriage for the Wizard of Oz.
Numerous production meetings were held to discuss how to deal with the green skin of the future Wicked Witch of the West, Elphaba, portrayed by Cynthia Erivo. “We wanted to have all of the options on the table then work with Cynthia herself to know what she needed as an actor,” Chu explains. “We did a lot of tests with a double to show Cynthia real makeup, semi-makeup where you only do the main areas, and completely non-green makeup because we knew that makeup every day for that long of a shoot could be grueling and would also take away time from actually shooting. Cynthia was like, ‘I need the makeup.’ Of course, there is some cleanup that we needed to do because sometimes her hands were thinner on certain days than others.” The green skin had to look believable and work in any lighting condition. “David Stoneman, who is a chemist who makes products for our industry, took my green design, which was from products called Creamy Air and Illustrator, and the discontinued product that I had found, and put three drops of yellow neon into the base,” explains Hair Designer/Makeup Designer/Prosthetics Designer Frances Hannon. “It reflected off the dark skin tone and made it look like it was her skin, not like it was green-painted on the surface, and more than that, it worked in every light.”

TOP: Glinda, portrayed by Ariana Grande, makes use of her bubble wand.

BOTTOM: This aerial shot of Munchkinland showcases the nine million tulips that were planted.
Prosthetic makeup was required to show the characters of Boq (Ethan Slater) and Fiyero (Jonathan Bailey) being transformed into Tin Man and Scarecrow. “One of my most important things was working with Mark Coulier [Prosthetic Makeup Designer] again,” Hannon remarks. “For Tin Man, we wanted to achieve something sympathetic because it should have never happened to Boq. In our story, Elphaba’s spell goes wrong in Nessarose [Marissa Bode]’s office, and everything metal in that room attaches to Boq; his breast plate would be the tray on the table, and his hands become the thimbles, salt and peppers. Then, the visual effects took over because all the joints were blue. With Scarecrow, Jon and Mark particularly wanted to keep Jonathan Bailey’s face shape. We also kept his nice teeth and natural eye color for Scarecrow. I used contact lenses on Jonathan for Fiyero, so we had a nice change there. Then, for his head element, I put masses of gold blonde through his look as Fiyero, which carried onto Scarecrow in a straw-colored wig; that kept Fiyero attractive because Elphaba and he fall in love.”

TOP: A lens flare, rainbow and the Yellow Brick Road are incorporated into an establishing shot of the Emerald City.

BOTTOM: The head of the Wizard of Oz was a massive animatronic puppet hung from the ceiling of the studio.
Most of the 2,500 visual effects shots were divided between ILM and Framestore, with other contributors being OPSIS, Lola VFX, Outpost VFX and BOT VFX. “The CG creatures were difficult because they also talk, but they are mainly animals,” Helman remarks. “They don’t walk on two legs. If it’s a goat that talks and is a teacher, it’s basically a goat if you look at it, then he talks. It was a fine line stylizing the talking so that it doesn’t feel like a completely stylized character, but also finding the expression, the eyebrows, eyes and mouth, the phonemes, and how articulate those creatures are. We had an animal unit of about 10 people or so that would play animals on set, and we would shoot a take or a few takes with them. We had a transformation scene where the monkey transforms and gets wings, so we had the whole animal unit performing and being directed by Jon. Sometimes, the second unit would stay there to shoot plates. Besides the music, dancers, choreography and huge sets, then there were the animals.”
Magic was always treated in a grounded manner. “It’s not a cutesy, glowing, sparkling thing,” Helman notes. “There is nothing wrong with those kinds of things; it’s just that this version of Oz is not magical. You have to remember, when you go back to the original story, the Wizard of Oz is not really a wizard.” Creative solutions had to be applied to achieve the desired effect. “How do you make a book glow without making it look completely fantastical and cartoony?” Helman explains, “Maybe what you do is provide a language inside of the book with words that may become golden that wasn’t golden in the beginning. So, you see a transition between a word that is on a black ink parchment to something golden that produces a glow and is completely grounded.” Broomsticks are a form of aerial transportation. “We worked with the stunt department to get the center of gravity correct and to be able to move the actors around. Cynthia Erivo wanted to do her own stunts, so she did. All of that wirework was closely planned. There are two things: There’s the center of gravity and what the body is doing in the air, and the lighting. If we get those two things right then we’re fine,” Helman says.

TOP TO BOTTOM: The mandate was to capture as much in-camera as possible, which gave Nathan Crowley the freedom to construct massive sets.

Water was seen as the key method of transportation to Shiz.

Elphaba (Cynthia Erivo) begins to master the art of flying a broom.
A major accomplishment was the practical realization of the Emerald City Express, which is the personal train of the Wizard of Oz. “It was Nathan Crowley’s vision,” states Special Effects Supervisor Paul Corbould. “We built the running gear and the track and motorized it. It’s hydraulically driven. Construction clad it with all of the fiberglass panels.” The motion was repeatable. “The train could be programmed to drive to a particular spot, run down and stop at a position, and when told to start again, make its next move and run to the end of the track. You can take it back to the beginning and keep on doing that,” remarks Special Effects Design Supervisor Jason Leinster. Equally impressive was the construction of the Wizard’s head. “Jason reverse-engineered the scale model and changed the electric servos to hydraulic rams and a whole control system,” Corbould explains. “It progressed from that.” The head was suspended from the ceiling of the stage. “It was a civil engineering project to have something like that floating in the middle of space,” Leinster notes. “It was 22 axes and puppeteered by one person. Most of it was done live and was often changing.”
Other complicated rigs included the rotating library. “Because it was first up in the schedule, we built one wheel and ladder, and the dancers with Chris Scott [Choreographer] rehearsed on that one,” Corbould states. “As we built another one, they were rehearsing on that with more dancers, and we built a third one. It took three months.” An amazing prop was the fountains. “The petals opened up, and they wanted water to come out,” Corbould remarks. “We’ve got these hydraulic motion bases, and in the middle is a slip ring that allows you to turn a rig round and round without winding the cable up. We had to take a slip ring off, which you normally run hydraulic oil through, and put that on the fountain. It ruined it because we were running water through it; that was quite a challenge.” A bricklaying machine gets pulled by a bison. “There was no bison to pull it, so the machine was self-driven,” Leinster reveals. “You could sit back and steer it. We had a roadway of foam bricks rolled up inside, and as the machine drove forward, it unrolled the Yellow Brick Road. Eventually, it drove off into the sunset, being pulled by the bison. You probably won’t realize that is an effect.”

TOP: An establishing shot of Shiz University.

MIDDLE AND BOTTOM: Elphaba (Cynthia Erivo) and Glinda (Ariana Grande) decide to take a more relaxed approach to flying by taking off in a hot air balloon.
To convey the impression of a floating castle, the concept of anti-gravity architecture was developed. “Kiamo Ko isn’t just a castle,” Crowley observes. “It’s a defiant emblem of a bygone era, a testament to the forgotten magic that once pulsed through Oz. Its architecture, though ancient, utilizes lost principles of levitation, defying gravity yet remaining grounded in a sense of order and purpose. The key to Kiamo Ko’s defiance lies not in defying gravity entirely but in manipulating it subtly. Imagine a series of inverted arches, their points reaching skyward. These arches wouldn’t be perfect mirrors of one another; instead, they possess a slight asymmetry, a calculated tilt that interacts with the forgotten magic of the land, generating a gentle, constant lift. This subtle slant would also provide a visual cue, hinting at the castle’s orientation even from a distance. By incorporating these design elements, Kiamo Ko transcends the trope of a generic floating castle. It becomes a character itself, a silent testament to a forgotten age and a beacon of hope for Elphaba and Fiyero’s new beginning.”
TOP: Anti-gravity architecture serves as the basis for Kiamo Ko, which is a castle located on the peak of Knobblehead Pike.
BOTTOM: Madame Morrible (Michelle Yeoh) has the ability to control the weather, so there is a cloud motif to her hairstyle.
Lenses were developed specifically for the production (which evolved into the new series of Ultra Panatar II) that were paired with the ARRI ALEXA 65 cameras. “Jon told me that he wanted Wicked to be unlike anything anyone had ever seen before, and the photography needed to represent that,” Brooks states. “I was on the movie so early I was able to design them with Dan Sasaki at Panavision in Woodland Hills. We called them the ‘Unlimiteds’ after Elphaba singing ‘Unlimited’ in Wicked because at the time they didn’t have a name. Those lenses capture all of the pictures that Nathan, Jon and I put together for so many months, and they wrap the light beautifully on our actors. Usually, you’re matching close-ups on the same lens, but on Elphaba, we shot her on a 65mm lens and Glinda on a 75mm lens, and we matched the size, but those two lenses did different things to their faces. Oz is a different place, and something is a little bit off everywhere. Our A and B 65mm lenses were not the same. It was a collage of lenses. Each one had such a different characteristic, and that made our movie feel different. Elphaba even has one line in the movie that goes, ‘Some of us are just different.’ That’s what we want our Oz to be.”
Apple Vision Pro is an essential part of the editorial process. “I am overseeing the edit in the Vision Pro,” Chu explains. “Instead of being trapped in a monitor on a desk, which isn’t the most creative, I can be like I am in the room with Myron Kerstein [Editor] where I’m walking around or sitting on the couch. We can do visual effects approvals there too. I can bring it on and draw with my finger where certain areas need to be improved or whatnot.” Hannon looks forward to seeing everything being brought together. “For me, it’s seeing those finishing touches. The sets were 60 feet high, then we would have bluescreen. I do believe Paul Tazewell [Costume Designer] and myself, to the best of our abilities, gave Jon the spectacular, extraordinary and timeless look that he was after.”
TOP TO BOTTOM: Skies played a major role in setting the proper tone for scenes.
Throughout the whole movie, there is this idea that the sun is always rising for Glinda (Ariana Grande) and setting for Elphaba (Cynthia Erivo).
Jonathan Bailey plays the role of Fiyero, who goes on to be transformed into the iconic Scarecrow.
Wicked spans two movies, with the first one centered around the song “Defying Gravity” and the second around “For Good.” “It’s in two parts, but we shot the whole movie in one lifetime!” Helman laughs. “I look at every project as a traumatic project where you develop these scars and learn from those scars, but you wear them proudly.” Teamwork reigns supreme for Chu. “Each department can make everything, but the reality is that we need to work together to make the thing that none of us can make alone. I feel lucky to be working with a team at the highest level, with the bar at the highest place for us to cross. It has been an amazing journey.”
TOP: Musical numbers were as complicated to plan and execute as action sequences.
BOTTOM: Various animals are part of the faculty at Shiz University, with Peter Dinklage doing facial capture and the voice of Dr. Dillamond.
LAURA PEDRO EXCELS AT THE MAGIC OF MAKING EFFECTS INVISIBLE
By TREVOR HOGG
Even though Society of the Snow thrust Laura Pedro into the international spotlight and garnered a nomination at the VES Awards as well as trophies from the European Film Awards and Goya Awards, she has been amassing an impressive portfolio that includes A Monster Calls, Way Down (aka The Vault) and Superlópez. Born and raised in Montgat, Spain, Pedro currently resides in Barcelona where she is a visual effects supervisor for El Ranchito.
“I did not come from an artistic family, but I always liked photography. When I was little, my father gave me his camera, and I began trying to tell stories with it; that is when I figured out this is maybe something for me. When I was 16, our English teacher proposed that we make a project for the government that would give us money to learn English outside of Spain. My classmates and I decided to make a movie about the robbery of The Scream by Edvard Munch. We won the money to travel to Toronto and stayed there for a month to learn English and finish the movie.” The intervening years have strengthened her self-confidence. “The difference from then to now is I finally found my own voice.”
Photography teaches the fundamentals of making an image. “I know a lot of things about visual effects, but in the end, it’s all about light,” Pedro notes. “If you don’t know anything about light, it’s impossible to tell things with images. It’s incredible how photography connects with everything.” Originally, the plan was to study photography, not visual effects, at ESCAC (Escola Superior de Cinema i Audiovisuals de Catalunya), which is the alma mater of filmmaker J.A. Bayona. “I had an accident during the first year of school; I lost all of the exams and couldn’t specialize in photography. Because of that, I decided to go for my second selection, which was visual effects.” The schooling was beneficial as it provided an overall understanding of the filmmaking process. “When I was studying, every week we did a short film, and I was either producing or doing camera or directing. That caused me to work with different teams and learn various things for five years. When I finished school, it was easy for me to start as a visual effects supervisor and compositor, and know what the director wants and help them with visual effects.”
Images courtesy of Laura Pedro.
OPPOSITE TOP TO BOTTOM: Pedro on a set for The Vault (aka Way Down), which is part of the supporting chamber located under the Bank of Spain. (Photo: Jorge Fuembuena)
Pedro in the Andes in Chile while shooting the scenes where Nando and Canessa attempt to find help in Society of the Snow. (Photo: Quim Vives)
Robert Zemeckis has left a lasting cinematic impression. “There are a lot of movies that I can’t get out of my mind, like Death Becomes Her and Who Framed Roger Rabbit. In my career, I normally do invisible effects, but when I have the opportunity to do comedies with visual effects, it’s fun for me to be part of that. Of course, you know that it’s fake, but they tried to do it in the most realistic way.” Innovation can come from unlikely sources. “In Death Becomes Her, they developed how to do skin in CGI, then the same technology was used for Jurassic Park. For me, it’s interesting and cool that you are developing a technology not doing Titanic, but a comedy. It’s awesome to have the chance to create new technology. This has never happened in Spain because the industry is small, but we have the opportunity in the films we make to take the technology from the big productions and try to use it in our smaller ones, and teach the producers in Spain that they can trust these new technologies and ways of working with visual effects.”
Over the past decade, a better understanding of visual effects has taken root in the Spanish film industry where digital artists are
TOP: Laura Pedro, Visual Effects Supervisor
“It’s important to have role models. Now that we have the schools, maybe in 10 years I will work with more supervisors who are women. It’s not about gender. It’s more about the age you start doing visual effects or become a visual effects supervisor because I’m 34 and other supervisors I know are 40 or 50. We are not of the same age and have different ways of thinking.”
—Laura Pedro, Visual Effects Supervisor
brought into pre-production to properly plan for complex shots rather than being seen simply as post-production fixers. “Visual effects are in all of the productions, so it’s easy to work continually,” Pedro states. “There are more visual effects companies that do television commercials trying to upgrade to narrative fiction.”
Television and streaming have become a driving force in Spain.
“Here, the film productions are small, so television and streaming productions allow us to continue working through the year. Maybe you do a film and at the same time two TV shows.”
Filmic visual effects have made their way to the small screen.
“The difference when you do a project like Game of Thrones or Ripley is that there’s a lot of work in pre-production trying to find the perfect design with the cinematographer and production
designer in the most cinematic way,” Pedro remarks. “Other projects work faster.” One has to be willing to adapt. “In the end, every project, director and producer is different. It’s like a new adventure. When I begin working with a new client, I need to have a lot of patience, try to understand and be faster because I only have three months. Normally, I work with visual reference found on the Internet or pictures or videos taken with my camera. I have this capacity to find what is exactly in the mind of the filmmaker with the reference that I have imagined and later start working with our team doing the concept.”
In 2013, Pedro made her debut as a visual effects supervisor for Barcelona nit d’estiu (Barcelona Summer Night) by director Dani de la Orden, and she would reunite with him a decade later for Casa en flames (A House on Fire), which is currently the most viewed movie in Catalonia. A major career turning point occurred when the emerging talent was recruited by a prominent Spanish visual effects company. “I was doing a short film for a Spanish singer named David Bisbal, and El Ranchito called me to begin working with them on A Monster Calls,” Pedro recalls. “Félix Bergés [Founder and President, El Ranchito] is my mentor, and I learned from him it’s better to start with the real world and plates, and after that begin working with CGI because that mixture works better for the human eye. Also, he gave me the power to say, ‘No’ when I’m on set.”
TOP: For Society of the Snow, the art department built two fuselages and the previs department at El Ranchito helped director J.A. Bayona design each shot of the crash. (Photo: Quim Vives)
BOTTOM: Pedro participated as Visual Effects Supervisor in the Spanish interpretation of the superhero genre, which resulted in Superlópez (2018).
TOP RIGHT: Pedro after winning her first Goya Award in 2019 for Superlópez, which was shared with Special Effects Supervisor Lluís Rivera.
MIDDLE: Visual Effects Supervisor Félix Bergés, Special Effects Supervisor Pau Costa and Pedro celebrate Society of the Snow winning the Goya Award for Best Special Effects in 2024. (Photo: Ana Belén Fernández)
BOTTOM: Visual Effects Supervisor Félix Bergés and Pedro on the set of A Monster Calls (2016), which was their first project together. (Photo: Quim Vives)
Personal highlights include a children’s fantasy tale, a comic book adaptation and the recreation of a historical event. “It’s not common in Spain to do a movie about a superhero like Superlópez,” Pedro observes. “We had to build a robot that was 10 to 12 meters tall. Before Superlópez I worked on A Monster Calls where we needed to build a monster that was also 12 or 13 meters tall, so I knew how to film a movie about this difference of dimensions and create something really big. Society of the Snow is a movie that has entirely invisible visual effects. We achieved that by traveling to the Valley of Tears, doing all of the environments with real photography, managing all of this material and developing the tools to work with five vendors at the same time while maintaining consistency. It was a lot of work.”
Nowadays, the protégé has become a mentor. “It’s important to have role models,” Pedro states. “Now that we have the schools, maybe in 10 years I will work with more supervisors who are women. It’s not about gender. It’s more about the age you start doing visual effects or become a visual effects supervisor because I’m 34 and other supervisors I know are 40 or 50. We are not of the same age and have different ways of thinking.” Patience is a key trait. “The most important thing is to be yourself and talk about things. I continue to learn by reading books and watching films. I try to remain connected with the technology and new tools, but it’s completely impossible to know everything.” Real-time and machine learning have introduced new ways of working. “There is a good and bad way of using technology. We need to be calmer because we rely on each other in the end to do the things that we love, which in turn creates an emotional response from the audience.”
TOP LEFT: Pedro and Special Effects Supervisor Pau Costa with their Goya Award for Best Special Effects for Society of the Snow in 2024. (Photo: Papo Waisman)
VFX IN LIVE PERFORMANCE: A VAST, VISUAL FEAST
By CHRIS McGOWAN
OPPOSITE: Disguise worked on Stranger Things: The First Shadow, an immersive play that is a prequel to the hit Netflix series Stranger Things. (Image courtesy of Disguise. Photo: Manuel Harlan)
Now more than ever, VFX is on stage. From the Glastonbury music festival to the mega-format venue Sphere to the play Stranger Things: The First Shadow, a new generation of visual effects is enlivening live performances, often involving LED walls, sophisticated projectors, real-time engines and/or XR (extended reality).
“Shows are ever increasing in size and becoming more and more of a visual feast. Video effects play a significant role in that,” says Emily Malone, Head of Live Events for Disguise. “It’s now pretty unusual to see a large-scale concert that doesn’t feature video effects and content. People are always looking for new ways of visual storytelling. From the LED floor and flying cubes at Eurovision to the LED and projection blend of Stranger Things: The First Shadow, video is now seen as a far more integrated and intrinsic part of the overall design and experience.”
The 2023 Glastonbury festival is an example. Malone comments, “The pyramid stage present in Coldplay’s performance had its video delivered by Flint Studios and driven by Disguise. Also in Glastonbury, Block9’s famous IICON stage has, for several years, had Disguise power the video content on the famous block. Also [there] in Glastonbury, major music performances from artists like Camila Cabello and London Grammar were driven by Disguise.”
Disguise can power the video for theater shows as well, according to Malone. “Stranger Things: The First Shadow [which debuted December 2023 in London’s Phoenix Theatre] uses video content to tell the narrative – integrating it as an essential part of the set design. 59 Productions used Disguise’s 10-bit workflow,
TOP: Dead & Company takes us above Earth in its 30-day 2024 residency at Sphere. (Image courtesy of Alive Coverage, ILM and Sphere)
BOTTOM: Disguise and Evoke Studios powered visual content for the band London Grammar at the 2023 Glastonbury Festival. (Image courtesy of Evoke Studios and Disguise)
The Coachella Music Festival has offered immersive AR effects, including geo-specific experiences created together with Niantic. (Image courtesy of Unreal Engine)
featuring four GX 3 media servers and the OmniCal camera-based projector calibration system. These tools allowed them to deploy almost a dozen projectors and a large LED video surface cohesively while also keeping the large complement of projectors aligned.”
“It’s entirely possible to watch a musical theater show where the entire back wall is made of LED and spend the majority of the performance unable to easily distinguish between video content, lighting effects, stage effects and physical scenery elements, such is the cohesion and artistry of the integration,” continues Malone.
“Our range of hardware products means that we have something to fit every project. From the small and nimble Disguise EX 2 server to the powerhouse VX4+, our hardware is specified by engineers and system integrators according to the unique needs of a project. In terms of real-time engines, we’ve long been working with Notch real-time rendering natively within the Designer software, and in recent years have also had the ability to offload that render workload to a separate RX machine via our proprietary RenderStream protocol. We also see a lot of Unreal Engine usage, and integrations with TouchDesigner and Unity,” Malone notes.
“Digital Domain has been at the forefront of digital human and immersive content creation since the debut of Tupac Shakur’s performance at Coachella in 2012,” says Charles Bolwell, Digital Domain Senior VFX Producer. At that event, a CGI Shakur appeared on stage with the real Snoop Dogg, thus bringing together a digital person and a real person in a live event.
“Whether it’s the groundbreaking Tupac Coachella performance
or the next-level Vince Lombardi commercial and hologram, we are always excited to push our technology to new heights,” says Lala Gavgavian, Digital Domain Global President and COO. “Our relentless focus is on advancing our Autonomous Virtual Human (AVH) technology and crafting digital humans that are visually stunning, highly interactive and emotionally resonant. By harnessing the power of AI, machine learning and real-time rendering, we’re creating virtual humans that blur the lines between reality and imagination.” Gavgavian adds, “Looking ahead, we envision our technology evolving to support live hologram performances that are truly interactive and unique in every way.”
To create VFX for immersive concerts, theater and experiences, Digital Domain employs a variety of software tools, both proprietary and off-the-shelf, to meet their clients’ needs. “These include Nuke, Maya, Houdini, V-Ray, Unreal Engine and our own Charlatan and Masquerade tools, to name a few,” explains Matt Dougan, Digital Domain VFX Supervisor. “Most live performances are driven through Unreal for real-time content, and our teams utilize whatever tool in the toolbox delivers the best results. Often, the requirements for real-time delivery are closely specified with the client, usually including any specific hardware and redundancy measures to ensure smooth, error-free playback.”
For the 2024 NHL Draft, Digital Domain partnered with the Sphere team to create a never-before-seen live event. Bolwell explains, “We helped create the majority of the team logo-based backdrops which would appear on the 16K screen before each team would make their selections, make a trade or introduce an iconic Hall of Fame player. We also created several snow/ice-themed particle transitions that took the broadcast to and from commercial
TOP: ABBA Voyage is a concert experience in a purpose-built London arena that features the band’s de-aged avatars, created by ILM. (Image courtesy of ILM. Photo: Stufish Entertainment Architects)
MIDDLE AND BOTTOM: ABBA performed in motion capture suits as part of ILM’s creation of their 1970s virtual selves. (Images courtesy of ILM. Photo: Baillie Walsh)
breaks. For the event, Digital Domain also created over 155 individualized elements that ESPN and NHL Network could have at their disposal to make this the most exciting and immersive draft in major sports history.”
DNEG is working on immersive/interactive projects that integrate VFX into live events, according to Joshua Mandel, Managing Director of DNEG IXP. He comments, “DNEG is currently active in a number of projects in the virtual concert and location-based experience spaces – live events/experiences that rely heavily on VFX artists to create an experience where a participant feels a performer, environment or other emotionally resonant thing is present. We call these live experiences because they involve audiences in venues and concert halls and they involve recreations of emotionally-resonating people and places, using the same artistry and technological know-how that has driven our success in film VFX.” Mandel adds, “Since many of these experiences are bespoke, never-done-before projects, we use all of the tools at our fingertips to bring the experience to life, from VFX tools like Maya and Nuke to real-time engines, to applications of AI and machine learning – all as tools to help our artists to bring the visions of our partners to life.”
Opening on May 27, 2022, the ABBA Voyage residency has combined virtual and human performances in a purpose-built concert hall. ILM helped the Swedish group ABBA return to the stage digitally, after a long hiatus. ABBA’s 20-song virtual live show is a hybrid creation: pre-recorded avatars appear on stage with a physically present 10-piece band. The avatars meld the band’s current-day movements with their appearances in the 1970s. ILM utilized more than 1,000 total visual effects artists on the project. ILM Creative Director and Senior Visual Effects Supervisor Ben Morris oversaw the VFX of the show.
“I guess an important thing to say up front is that ABBA Voyage landed here at ILM as a total surprise,” Morris says. “That said, as soon as we heard what was being proposed we knew we had to be involved. There were so many creative and technological unknowns: would we be able to create totally believable digital human performances for a 90-minute concert? Would the audience’s perception of the ‘live performance’ be strong enough to allow them to let go and enjoy the experience in the same way they would a normal concert? Could we break through the inevitable screen barrier, merging the real and digital worlds into one coherent and highly innovative stage?”
Alongside refining many in-house performance-capture tools that ILM had developed for visual effects over the years, the firm had to undertake extensive research, analysis and replication of every real-world lighting instrument used in the main arena. Morris comments, “This allowed our director, Baillie Walsh, to design lighting and staging that worked seamlessly in both the virtual and real-world space. Our lighting designer never considered the concert as two separate spaces; all programming was done assuming the screen didn’t even exist. In addition to recreating the traditional lighting instruments, we also took great care to build digital versions of all the wonderful kinetic lighting installations and laser effects too.”
“[ABBA Voyage] proved that it is possible to create a seamless, dynamic human performance based entirely on digital avatars and staging. This project has opened so many creative floodgates for ILM, with inquiries coming from all areas of the music, theater, cinema, AR/VR/MR and even museum and architectural projects. It is so exciting to have taken on such a highly creative and innovative project and then see the creative potential of a new entertainment concept based on the success of this trailblazing project.”
—Ben Morris, Creative Director and Senior Visual Effects Supervisor, ILM
TOP AND BOTTOM: Disguise drove visuals in 2023 at Block9, an area at Glastonbury Festival known for its immersive environments, underground music and alternative performances. (Images courtesy of Disguise. Photo: Tom Marshak)
“There is no question that audiences are loving the ABBA Voyage show, with a full house for every performance, nearly two and a half years after opening night,” Morris says. “It proved that it is possible to create a seamless, dynamic human performance based entirely on digital avatars and staging. This project has opened so many creative floodgates for ILM, with inquiries coming from all areas of the music, theater, cinema, AR/VR/MR and even museum and architectural projects. It is so exciting to have taken on such a highly creative and innovative project and then see the creative potential of a new entertainment concept based on the success of this trailblazing project.”
Sphere, a home for VFX on a massive scale, opened on September 23, 2023 on the outskirts of the Las Vegas Strip. The giant orb has a wraparound 240-foot-tall interior display that covers 160,000 square feet, composed of 64,000 LED tiles at 16K x 16K resolution. It has a capacity of 17,600+ people. The facility opened with the five-month-plus residency of U2: UV Achtung Baby Live at Sphere, which featured elaborate immersive visuals for each live song by the band.
The visual piece that accompanied “Atomic City” was ILM’s first production for live entertainment at the Sphere, according to ILM VFX Supervisor Khatsho Orfali. “We had to learn things along the way as we were executing on the work. Certainly, we leaned on
TOP: Sphere’s 240-foot-tall screen, which has 160,000 square feet, towers over the 17,000+ audience in a scene with San Francisco’s Haight Ashbury neighborhood. (Image courtesy of Alive Coverage, ILM and Sphere)
BOTTOM: Unreal Engine powered the AR environment at the Coachella Music Festival. (Image courtesy of Unreal Engine)
our vast experience on past large-format projects, but the Sphere had its own challenges. The physical scale of the screen and the 16K square resolution we worked towards required us to account for multiple levels of visual detail to bring realism to the level that would allow the audience to believe that what they were looking at was the live Las Vegas vista in front of them.”
ILM also worked on VFX content for Dead & Company’s Sphere concert. “For this project, we generated close to 20 minutes worth of content that was split into two parts: the Haight and Ashbury San Francisco Intro that plays at the beginning of the concert and then takes us into the upper atmosphere – and then out into the Milky Way – as well as the return to 710 Ashbury during the 1965 outro segment that plays at the end of the concert,” Orfali explains.
Immersive live-entertainment venues have unique spaces, displays and playback requirements, “often being non-flat custom display walls that aim to immerse the audience within the experience,” Orfali says. “This requires composing scenes and visuals that would play best to the point-of-view of multitudes of seating positions. Perspective, distortion and horizon levels, and framing of subjects, are all things that need to be studied and appropriately executed on to guarantee the best experience possible. The large physical size puts its own requirements for how to animate cameras, whether in full computer graphics shots or plate.”
“We’ve been very inspired by the innovative thinking around leveraging the kind of magic we create at Industrial Light & Magic and how it can fit into and expand the kind of experiences bands and live shows can bring to their fans,” notes Rob Bredow, ILM Senior Vice President, Creative Innovation and Chief Creative Officer. “Based on the incredibly warm response to ABBA Voyage, along with both U2 and Dead & Company’s [presentations] at Sphere in Vegas, we’ve been in some exciting conversations about what happens next, and love partnering to push the state of the art of visual experiences with our creative teams around the world.
“New immersive venues are a fantastic opportunity to experience visual storytelling outside of the traditional projection we see in the movie theater,” Bredow continues. “Because the canvases are unique, it actually challenges most of what we know about traditional filmmaking from the standpoint of editorial, framing, transitions and all the visual storytelling tricks we know, while benefiting from things the artists at ILM do particularly well: highly complicated, detailed, photorealistic visuals that look amazing in these new immersive environments. There’s nothing
TOP: Disguise worked with 59 Productions on the immersive play Stranger Things: The First Shadow. (Image courtesy of Disguise. Photo: Manuel Harlan)
like seeing these experiences with an audience of 3,000 or 18,000 people who get to collectively experience something that would have been impossible just a few years ago.”
Hologram technology is a prime example of an evolving visual effect popping up in live events. “It’s improving rapidly and finding uses everywhere, from live performances to immersive abstract art,” says Digital Domain’s Dougan. “It also relies heavily on the underlying technology of how to create that hologram and what the limitations are from that. For example, most holograms are purely an additive effect that happens in a live space, so your bright areas are what are visible, and your dark areas become transparent. This seems obvious, but you’d be surprised at how much you need to modify your content for readability, especially in regards to 360° holograms where the effect can overlap the backside of another effect; all of these things have to be taken into consideration to make sure that it’s readable and effective.”
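The additive behavior Dougan describes can be sketched in a few lines of Python. This is a minimal illustration of additive compositing in general, not Digital Domain's actual pipeline, and the scene and content values are hypothetical: the display adds its light to whatever is behind it, so bright content reads as solid while pure-black content contributes nothing and appears transparent.

```python
def additive_composite(scene, content):
    """Per-channel additive blend of display content over the real scene.

    Values are in [0, 1]; the sum is clipped to the displayable range.
    """
    return [min(s + c, 1.0) for s, c in zip(scene, content)]

scene = [0.2, 0.2, 0.2]    # dim room behind the display (hypothetical values)
bright = [0.9, 0.9, 0.9]   # bright content: clearly visible
dark = [0.0, 0.0, 0.0]     # dark content: adds nothing, reads as transparent

print(additive_composite(scene, bright))  # -> [1.0, 1.0, 1.0]
print(additive_composite(scene, dark))    # -> [0.2, 0.2, 0.2]
```

This is why dark areas of hologram content must be reworked for readability: there is no way for an additive display to make a region darker than the scene behind it.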
“When it comes to augmented reality, it feels like a natural evolution of what we’ve always done,” Dougan explains. “The core of our work remains the same – creating stunning visuals – but now the challenge is figuring out which cutting-edge device or platform the client wants to use. AR is a bit of a chameleon in that sense, whether the content is pre-rendered or happening in real-time. The magic lies in how the viewer experiences it. We relish the challenge of adapting our work to fit the quirks and capabilities of each new piece of tech, making sure our content not only meets but exceeds expectations, no matter what reality it’s enhancing.” AR is everywhere at festivals, as part of the show, as decoration or as practicality. Snap signed a multi-year pact with Live Nation Entertainment for audiences to access AR experiences via Snapchat at Live Nation music festivals such as Lollapalooza. Snapchat users can locate their friends in the audience, find landmarks on the festival grounds and otherwise enhance their music experiences. And, in 2022, the Coachella Valley Music and Arts Festival’s Coachellaverse app offered various immersive AR effects, including geo-specific experiences created in partnership with Niantic.
Double Eye Studios stages immersive theatrical VR experiences such as Finding Pandora X. “The software currently involved in most productions includes Midjourney, Adobe Premiere, Blender and Unity. There is always a game engine involved powering the production,” says Kiira Benzing, Creative Director and Founder of Double Eye Studios. “Over the next decade we’ll continue to see an expansion of the arts and live performance in new spaces, ranging from virtual spaces like VR to physical spaces like domes. We can expect that VFX will play a large role in all of these spaces creating transformative experiences that will alter the audience’s sense of reality and the human experience.”
“The vision brought by VFX artists and technologists will increasingly be used across more and different types of screens to create both realistic and fantastical experiences for audiences across the world,” comments DNEG’s Mandel. Disguise’s Malone concludes, “The boundaries are always being pushed and the next project is always more ambitious than the last. The technologies involved are constantly evolving, which is both exciting and challenging. Whether it’s new aesthetics, new integrations, bigger canvases or higher-fidelity outputs, nothing stays still for long!”
TOP: Darren Aronofsky’s 50-minute, immersive Postcard from Earth was the first feature film to play at Sphere and utilized a bespoke 18K x 18K camera system. (Image courtesy of Darren Aronofsky and Sphere)
MIDDLE AND BOTTOM: Interact with the Greek gods in the VR theatrical experience Finding Pandora X, another example of the expansion of visual effects. (Images courtesy of Double Eye Studios)
CHARTING A NEW COURSE FOR FLOW
By TREVOR HOGG
TOP: A tool was developed by Simulation Artist Mārtiņš Upītis that could add water and submerge objects, and the lighting and physics would react according to the applied rules.
OPPOSITE TOP: The cat learns to trust and work with other animals. Director Gints Zilbalodis trusted the work of animation studios in Latvia, France and Belgium to make Flow and was rewarded with an award-winning film.
OPPOSITE BOTTOM: Some of the most hypnotic scenes occur when the cat goes underwater and swims among colorful schools of fish.
Art gets to imitate life when a solitary cat has to rely on a varied group of animals to survive a flood in Flow. Gints Zilbalodis, who, up until now, has been a one-man production crew, was given the budget to bring additional talent onboard for his sophomore feature.
While attending its North American Premiere at the 49th Toronto International Film Festival, Zilbalodis readily admitted that the resemblance between himself and the feline protagonist was not coincidental. “I’m telling a personal story through this cat. The premise is a cat overcoming its fear of water. We don’t have dialogue, so we have to find things that are visually clear and immediate. Also, I knew that I wanted to have a story about this character who was independent and learns how to trust, be collaborative and accepting of others.”
Making the shift from a solo to a team effort did not require an entirely different filmmaking process. “I never use storyboards in the short films that I’ve done and in Away,” Zilbalodis states. “Even though we had a budget and could get some storyboard artists, I was used to creating the environment myself in 3D, placing the characters in there and discovering shots. On this film, there are many long takes with the camera following the characters, turning 360 degrees and moving not just left to right and up and down but also in depth. When it’s moving in depth, it’s hard to draw because the perspective is always changing.” New skills were developed. Zilbalodis says, “I am credited as the director on the previous films, but Flow was the actual first film I directed because before if I had an idea, I could make it myself. This time, I had to articulate my thoughts. It’s almost like a different skill that I had to learn. When you articulate things, it is good because then everything has a purpose. However, you can go too far in that direction where everything is too direct, clear and predictable. I like to have some ambiguity as well. There were a few scenes where I had to ask people to trust me that it will work in the end when we add the music and everything is edited.”

Images courtesy of Janus Films.
Zilbalodis established Dream Well Studio in Latvia and partnered with Sacrebleu Productions in France and Take Five in Belgium. Zilbalodis remarks, “It’s quite common in Europe to have these co-productions between different countries because then you can get extra funding from European funding agencies. In Latvia, there is an animation industry, but it’s quite small. We don’t have that many 3D character animators. I thought that we should approach a studio in France because its animation industry is the third largest in the world. France and Belgium did all of the character animation and the sound design. Everything else was done in Latvia. It was quite easy to manage all of that, but I had to constantly travel back and forth to make sure that we all were on the same path.” Unlike Away, which was animated in Maya, Flow was created with Blender. Zilbalodis comments, “Blender has this great real-time render engine called Eevee that helped us to do the animatic where it was possible to have lights and effects, like fog, that affect how I frame shots because the way the shadows are cast can influence the layout and compositing. The budget was quite tight, so it helped that we didn’t have to pay for all of the licenses for the software. Also, Blender is great because it’s so customizable.”
Giant feline sculptures populate the landscape.

Concept art developed for the continuous shot, which was 6,606 frames long and required eight animators working together on different scenes that were regrouped into a single file.

While the animals are stylized, the environmental setting harkens more towards realism. “The approach for the style in a lot of 2D animated films is that the backgrounds are painted with a lot of detail using oil paint or watercolor, but the characters are flat shapes that are more stylized,” Zilbalodis observes. “If you make the characters too realistic, they can look cold and synthetic. You need to make the characters a little bit more graphic because they are more appealing that way.” All of the animals were chosen based on the theme of wanting to find a connection with others. Zilbalodis notes, “The lemur thinks he needs to have all of these shiny things so others will accept and love him. The cat learns to trust others and work together, but I wanted to have this counterpoint where the dog is almost too trusting and learns to be more independent.” The Secretary bird replaced the seagull from the original short film. Zilbalodis says, “We needed to have a more intimidating bird that can carry the cat. It’s big and strong. The Secretary bird also wants to be accepted by other birds. It is important that we can relate and understand each animal with all of their flaws. Each of them has their own character arc and learns something. The only exception is the capybara. It already knows what’s right from the beginning and is the mentor for the other animals.”
Research consisted of finding video references and photography to get the proper posing and expressions of the animals. “I would put it on a website to share with the whole team,” states Animation Director Léo Silly-Pélissier. “Everybody had to watch those videos to understand the behavior of each animal and what they do in different situations. We didn’t copy the movement exactly because all of the situations in the movie were quite specific and a bit fantastical. It was more naturalistic than realistic.” Even though there was no dialogue, a vocal performance still needed to be incorporated into the animation. Silly-Pélissier adds, “We didn’t work with animal sounds before the animation, so depending on the situation and what the animal is doing, we would open the mouth at the time we think is more accurate and then the sound designer would put the animal sound there.” Rules were devised for each animal to ensure consistency and believability. “Gints gave us a lot of space to imagine movement, and because there are long shots, we couldn’t check the movement frame by frame as it would take too long. Some shots are 6,000 frames, so I wrote some rules for animators, like how the cat moves its ears to check the sound, and if the sound is interesting, it can move its head to look at that. Also, sometimes the tail moves apart from the movement,” Silly-Pélissier says.

TOP TO BOTTOM: Rather than rely on dialogue, actual animal sounds were recorded for each of the characters.
Water is a character in its own right. “Water was a huge challenge,” Zilbalodis remarks. “It was one of the first things that we started and one of the last things we finished. Whether water is still or an actively moving flood, we needed to develop a completely different tool. But we also used the water as a storytelling tool as it represents the cat’s fear of others. As the cat opens up to others and starts to trust them, the water becomes more beautiful and peaceful. It’s a visual metaphor for that.” Creating the proper water and animal interaction was a two-step process. “Gints made the initial water effects as he wanted, then I went over the more important ones that we had to do before the animation,” explains Simulation Artist Mārtiņš Upītis. “After the animation was completed, I could add the final details, like these smaller splashes and ripples. It was a convenient ping-pong between the different parts of the workflow.”
Equally difficult to achieve were the long continuous shots. “It was important for me to create this sense that you are next to the cat, not observing it from a distance,” Zilbalodis states. “Usually, we are subjective, but at some points, we are also from a more distant objective point of view. Sometimes, that happens within the same shot where the camera starts out in a wide shot but then gradually glides into a close-up. With the long takes, you can get a sense of real-time. It’s a tool. The camera is sometimes expressing fear or curiosity. There are moments where it’s a little bit delayed as if the camera operator will need some time to react to things. Everything is on purpose. We’re trying to create this feeling of organic spontaneity, which I haven’t seen that much in animation, which tends to be clean and perfect. I like to have some grunginess and a little bit of a handmade feeling.”

TOP TO BOTTOM: The water reflects the emotional state of the cat throughout the movie.

The lemur hoards shiny objects in an attempt to gain social acceptance.

A dramatic environment is the half-submerged city.
A cinematic reference was The Revenant. “The main character gets into turbulent water, and the camera can go below or above the water or stay between the water and the air,” Upītis says. “That posed an interesting challenge. I had this idea that I should make a tool that allows you to place the camera at any angle, either above or below water, and you don’t have to switch between underwater and above-water modes or in-between modes. It does it automatically. I spent quite a lot of time investigating different methods in Blender and how to do that. At the same time, Blender announced that they would have geometry nodes. Based on that, I created a tool where you can add water and submerge objects; lighting and physics react according to the rules that you give.” Extra water shots were added to the original total of 160. Upītis adds, “I was going one by one through each of the shots and individually adapting them. The difficult part was probably the fact that every shot was a different file, and to add consistency between shots, the water shouldn’t change much from the previous shot and the next shot. It was quite a lot of looking at the edits that Gints made to see if the water was consistent, like the color and movement. Also, throughout the whole movie, the water changes color from green in the beginning, then through the heights of the story it’s dull and muddy, and at the end, it gets more bluish and oceanic.”
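Flow's actual tool was built with Blender's geometry nodes, so the details are not public in this article; purely as illustration, the core idea Upītis describes, deriving the water treatment from the camera position instead of switching modes by hand, can be sketched in a few lines of Python. The function name and threshold here are hypothetical, not from the production.

```python
# Hypothetical sketch of automatic above/below-water handling.
# A real implementation would drive shader and physics parameters;
# here we just classify the camera against the water surface.

NEAR_SURFACE = 0.05  # assumed band (scene units) treated as "at the surface"

def water_mode(camera_z: float, surface_z: float) -> str:
    """Return which water treatment to render for this frame, given the
    camera height and the water surface height. No manual switch needed."""
    offset = camera_z - surface_z
    if abs(offset) <= NEAR_SURFACE:
        return "split"  # lens straddles the surface: render both media
    return "above" if offset > 0 else "below"

# As the camera dives in one continuous take, the treatment follows it:
for z in (2.0, 0.03, -1.5):
    print(water_mode(z, surface_z=0.0))  # above, split, below
```

Evaluating this per frame is what lets a single long take glide from air to water without a cut, as in the Revenant-inspired shots described above.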
Accompanying the water were splashes. “We cut part of the water out and added the simulated splashes on top of it,” Upītis states. “The thing is, we had to work with different scales of simulations. The close-ups had a dense simulation for the foreground, and the geometry in the background was sparser because you don’t need as much detail. Gints said that the simulations made in the previsualization looked better. In the end, the movie is quite stylized, so the splashes could be of a lesser quality.” The biggest challenge for the animation team was a long, continuous shot featuring several animators. “It was 6,606 frames, and eight animators worked together on different scenes, then we had to regroup all of the files into one,” Silly-Pélissier notes. “It was a huge load for the computer, and it was a challenge to check that everything worked well. We started this shot at the beginning of the production and finished the shot near the end. This is the one where the cat looks at the fish, falls into the water, is drowning, the whale rescues him, and the bird takes him away and drops him.”

TOP: The animals were stylized rather than photorealistic to avoid them looking cold and synthetic.

BOTTOM: The animation was entirely created with Blender.
Flow won Best Original Music Award, Jury Award, Audience Award and Gan Foundation Award for Distribution at the 2024 Annecy International Animated Film Festival, and it was chosen to be Latvia’s official Oscar entry for Best International Feature. “In Cannes, it was my first time seeing it with an audience, and that was a big relief,” Zilbalodis remarks. “I’ve been traveling to festivals all over the world, and I’ve been asked how people are reacting differently, and to tell you the truth, they are reacting the same because it’s quite a universal story.”
TOP: Replacing the seagull in the original short is the Secretary bird, which was more intimidating and could carry the cat in the air.
BOTTOM: Serving as the mentor for the rest of the animals is the laid-back capybara.
What Is Storyvis?
By MARIANA ACUÑA ACOSTA
Edited for this publication by Jeffrey A. Okun, VES
Abstracted from The VES Handbook of Virtual Production
Edited by Susan Zwerman, VES, and Jeffrey A. Okun, VES
Storyvis vs. Pitchvis
Pitchvis has been used for over 20 years. It is intertwined with previsualization and is used by directors or producers to quickly create a proxy version of a scene from a script, making it easier to visualize the concept and sell it to investors, studios or stakeholders. Storyvis is how production designers, independent filmmakers or writers use 3D visualization to craft a story. They may not use it for funding, and the script may not even be finalized, but being able to use newer technologies to construct part of the world and/or characters of the story they are building makes the story easier to convey. It is also used as a teaser for stories with complex narratives, making it easier to understand where the theme of a story could go – for example, a “dream sequence.”
3D Storyboards
Traditionally, storyboards and animatics have been created in a 2D format. Using game engine technologies, one can potentially skip this process and go straight into building a 3D storyboard. Younger generations of content creators who are used to working this way opt for simply putting together a master sequence, kitbashing characters and environments, adding motion capture or animation clips, cameras and even physics; they can then use the camera views and the game engine timeline to take screenshots and export image sequences or videos. This not only reduces iteration time but also creates 3D storyboards/animatics that are already lit, with effects, simulations, audio tracks or sound effects. There are tools available that extend these capabilities to 2D storyboard artists, making the transition easier, such as the Epos plug-in for Unreal Engine. A storyboard artist does not need to know how to use Unreal to take advantage of the 2D and 3D offerings of the technology.
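The master-sequence workflow above can be sketched abstractly. This is not any engine's real API; the `Shot` structure and file-naming scheme are invented for illustration, but they mirror the idea of a sequence of kitbashed shots, each with its own camera and frame range, flattened into the per-frame images an export pass would write.

```python
# Illustrative model of a 3D storyboard "master sequence" export plan.
# Names and conventions are hypothetical, not from a specific engine.
from dataclasses import dataclass

@dataclass
class Shot:
    name: str    # e.g. "sh010"
    camera: str  # which placed camera renders this shot
    start: int   # first frame (inclusive)
    end: int     # last frame (inclusive)

def export_plan(sequence: list[Shot]) -> list[str]:
    """List the image files a screenshot/export pass would produce,
    one per frame, in sequence order."""
    return [
        f"{shot.name}/{shot.camera}.{frame:04d}.png"
        for shot in sequence
        for frame in range(shot.start, shot.end + 1)
    ]

board = [
    Shot("sh010", "cam_wide", start=1, end=3),
    Shot("sh020", "cam_close", start=1, end=2),
]
print(export_plan(board))  # five frames across the two shots
```

Because the shots live in one timeline, reordering or retiming a beat is a data edit followed by a re-export, which is where the iteration-time savings over redrawn 2D boards come from.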
Interactive Workflows
The creative process is not as linear as it used to be. Storyboarding, look development, environments and concept art can now happen concurrently, using a much more dynamic process. The beauty of incorporating real-time workflows into the pre-production, design, creation and production processes is the ability to reinvent, reimagine and reassess the process of linear content creation. Reducing iteration times, co-creating, collaborating, and being able to quickly use a game engine as a “playground” for interactive design have been key drivers for adoption.
Linear Content
Stories transfer information, an experience or a point of view. Stories have a beginning, a middle and an end. Stories have a teller and an audience. The way stories are told, produced, consumed and distributed has changed significantly over the past decade. There is immersive storytelling in virtual reality, Roblox, YouTube channels, VTubers, TikTokers, video game streamers, branching narratives, the Metaverse, etc. The technologies that enable us to view content on different platforms, the formats and how viewers can interact with the content are in constant flux. Gone are the days when newspapers, TV, cinemas, theaters and radio were the only means of connecting with an audience. It is only natural for the workflows by which humans create stories to evolve as well. Filmmaking workflows have not evolved much in the past century; however, the technology used on set has. LED walls and ICVFX (In-Camera Visual Effects) are part of that evolution, as are Augmented Reality, Virtual Reality, Volumetric Capture, etc.
Order yours today! https://bit.ly/VES_VPHandbook
Figure 5.1 MetaHumans in “Thrown.” (Image courtesy of Maya Singer)
VES Georgia Revisited: Vibrant VFX Community in the Heart of the South
By NAOMI GOLDMAN
The VES Georgia Section represents a diverse membership of more than 135 artists and educators who span Atlanta and Savannah, the dual hubs of the regional VES collective. Georgia is continuing to experience the economic and work impacts of the dual labor strikes, which have changed the VFX trajectory in the region, making the tightly-knit community of VES Georgia all the more important as a convenor and network for support and camaraderie.
Since the Section was established six years ago, Zach Bell, VES Georgia Co-Chair and Senior CG Artist at Warner Bros. Discovery, and Sean Thigpen, VES Georgia Treasurer and VFX Supervisor/Producer, have held these same leadership positions – affording them unique insights into the regional VFX industry, the Section’s growth and forward-looking vision.
“It has been a great source of pride being a part of the Board of Managers from the very beginning,” said Bell. “As Sean and I are preparing to pass the baton, we have been fostering the next wave of Section leaders through a shadow mentoring program to enable a seamless transition. I believe we have created a stable foundation with the connections and strategic planning to help them thrive and carry us into the future.”
“During COVID and the strikes, our membership numbers actually grew, largely due to our amazing members, who proved to be a highly unified group and a valuable resource to one another,” said Thigpen. “The landscape for work has been tough since the strikes, and everyone is curious and hopeful for things to pick up. Amidst all that uncertainty, we have created a strong, positive culture in the VFX industry, and our Section is an important player as an ambassador to people across our state and beyond.”
The Section has a strong presence in Atlanta and Savannah, and works to provide meaningful and exciting experiences, opportunities and benefits for members in both markets. Their goal of having parity across the Section’s sub-groups is supported by having Section Co-Chairs from both cities who orchestrate logistics for local events. While Bell is focused on Atlanta, VES Georgia Co-Chair David “DJ” Johnson has been at the center of the Savannah activities.
In terms of member recruitment, the Section is welcoming and inclusive, attracting prospective members ranging from seasoned practitioners to those just out of school and not yet eligible – offering the latter a glimpse at their potential future in the VFX industry. They host pub nights at local Atlanta brewery Monday Night Garage, along with a steady stream of screenings and panels, and educational and career development events for members and guests.
This past year, they held a number of events around Virtual Production, including one at The Lux Stage at Trilith Studios, where members and others experienced LED volume walls and the space firsthand. The Section hosted an Autodesk Showcase, in conjunction with Xencelabs, offering a review of the latest Autodesk solutions, workflows and tools. And this past Fall, they co-sponsored an event at the SCAD Savannah Film Festival, the largest university-run film festival in the world, in collaboration with Savannah Women in Film & Television (SWIFT).

TOP: VES Georgia members take part in the VES Awards nominations event, hosted at FotoKem Atlanta.

MIDDLE AND BOTTOM: VES Georgia members and guests enjoy a special Virtual Production event at The Lux Stage.
Leadership is very appreciative of long-time event sponsor Aurora Cineplex in Roswell, which has hosted almost all of the Section’s film screenings for the past six years. “More than that, during COVID, as protocols allowed, they held a number of private screenings for our members, in addition to our ongoing studio-sponsored screenings. They have been phenomenal partners,” said Thigpen.
“We also want to give special thanks to FotoKem Atlanta for hosting our first in-person VES Awards nomination event in 2024, and coming up again soon as the venue for our 2025 event,” said Bell. “The nominations event – the opportunity to review and give expert input into the outstanding VFX work of our peers from all around the world – is the secret gem for our VES members to be a part of.”
Moving forward, the Section is working to develop more educational workshops and career development opportunities around Virtual Production and technology demos, and tapping further into its cadre of SCAD professors in Savannah. The team is excited at the early prospects of bringing a Virtual Production Summit to Atlanta.
“As one of the original 50 members that formed our Section, I was inspired to push the new group forward and see how I could help,” said Thigpen. “We grew an early spark into a good blaze, and now I’m excited as we hit that 150-member milestone. We already span two cities, and with the next slate of leaders and more members, I know that we can achieve more, and we can do more in providing value to our members.”
“A big motivation and mindset for me is that Georgia is a newer established market for the creative filmmaking industry. It’s just come into its own in the last 15 years, so we can help shape the voice of what it is to work in VFX in Georgia,” said Bell. “Through VES Georgia, we are showing that you can make a career here, that you can meet and collaborate with people who represent the best of our industry. We try to showcase that to our members and the broader community, that ‘the better we are together, the further we all will go.’”
In their closing comments, both Section leaders extolled the unique culture of their state and the spirit that permeates the Section. “What you find here is an amazing group of talented artists and resources for all types of production – gaming, film, TV, broadcast or any way people make great content. It’s a phenomenal place to work, live and be creative,” said Bell.
Thigpen concurred. “We pride ourselves on being accessible and open to meeting new people. This is a great place to produce and create. If you are passing through, give us a visit! Everyone is welcome.”
Members and guests gather for their festive holiday party.

Members and prospects enjoy pub nights at Atlanta’s Monday Night Garage brewery.

The VES Georgia Section comes together in camaraderie and community. VES Georgia members and prospects have a fun night at Savannah’s Fall membership night.

TOP LEFT TO RIGHT: VES Georgia proudly hosts its annual gathering with Savannah Women in Film & Television at the SCAD Savannah Film Festival.
Visual Effects Society Recognizes 2024 Honorees
By NAOMI GOLDMAN
VES celebrated a distinguished group of VFX practitioners and industry leaders at its festive annual VES Honors event October 25th, held under the night sky at the Skirball Cultural Center in Los Angeles.
Producer and VFX industry leader Brooke Breton, VES was named recipient of the 2024 VES Founders Award. “Receiving this recognition means so much. I love this organization for its full representation of artists and practitioners and for how we support our expanding global community. Bringing talented people together to technically and artistically problem-solve and give them the opportunity to shine has been one of my greatest joys in this business.”
The Society designated VFX Producer/Supervisor Reid Paul and VFX Producer Ronald B. Moore with Lifetime VES memberships. Our newest Honorary VES members are award-winning Director and Animator Don Bluth; award-winning Visual Effects Supervisor and Filmmaker John Bruno; award-winning Director, Production Designer & Visual Effects Specialist Robert Stromberg; and acclaimed Animation and Optical Effects Supervisor Harry Walton.
This year’s venerated VES Fellows are: Head of Visual Effects for Wētā FX Matt Aitken, VES; Director of Creative Innovation at Netflix Girish Balakrishnan, VES; award-winning Titles Designer and Visual Effects Supervisor Randall Balsmeyer, VES; VES award-winning Artist and Supervisor Michael Conte, VES; Executive Producer and former VES Chair Lisa Cooke, VES; Senior Visual Effects Supervisor at Scanline Bryan Grill, VES; Executive Producer Thomas Knop, VES; Senior Vice President, Production Visual Effects at Sony Pictures Entertainment Arnon Manor, VES; and Visual Effects Supervisor Susan Rowe, VES.
The 2024 class of VES Hall of Fame inductees includes Oscar-winning VFX Supervisor and VES Founders Award recipient Tim McGovern, VES; Visual Effects Supervisor Thad Beier; experimental filmmaker Maya Deren; and film Director-Producer Dorothy Davenport.
And for their long service as leaders of their Section Board of Managers, the Society recognized: Zach Bell, VES Georgia; Tony Botella, VES France; Kay Delventhal, VES Germany; Kay Hoddy, VES New Zealand; Andy Romine, VES Washington; Marc A. Rousseau, VES Montreal; and Sean Thigpen, VES Georgia.
“Our honorees represent a group of exemplary artists, innovators and thought leaders who have had a profound impact on the field of visual effects and our global Society,” said VES Board Chair Kim Davidson. “We are proud to recognize those who helped shape our shared legacy and continue to inspire future generations of VFX practitioners.”
New VES Fellows celebrate at VES Honors. From left to right: Arnon Manor, VES; Randall Balsmeyer, VES; Sue Rowe, VES; Matt Aitken, VES; Thomas Knop, VES; Lisa Cooke, VES; and Michael Conte, VES.
New VES Honorary Members gather to celebrate the VES community. From left to right: Harry Walton, John Bruno and Robert Stromberg
New VES Lifetime Member Reid Paul beams at the VES Honors event.
THIS PAGE, TOP LEFT: Jessica McGovern accepts the VES Hall of Fame induction honor on behalf of her father Tim McGovern, VES, with VES Chair Emeritus Mike Chambers, VES.
TOP RIGHT: VES Executive Director Nancy Ward and VES Chair Kim Davidson thank VES Washington Section leader Andy Romine for his volunteer service.
Visual Effects Society Celebrates World VFX Day
The VES was proud to be a Supporting Event Partner for the 2nd Annual World VFX Day in December. Held on the birthday of French illusionist, film director and visual effects pioneer Georges Méliès, the global livestreamed event put visual effects center stage worldwide, and highlighted artistry and innovation in the field. The 2024 two-day event (December 6th and 8th) featured more than 100 speakers and 60+ talks across 20 hours on visual effects, animation, gaming and immersive media. Congratulations to World VFX Day Founder Hayley Miller and everyone involved in making this celebration of our VFX craft and community a rousing success.
The Society hosted two dynamic webcasts during the event shared across VES’ 16 global Sections: A panel of venerated VES Award winners, and a conversation with Women Who Lead VFX, featuring Chrysta Marie Burton, Senior Vice President Physical Production & Visual Effects at Paramount Pictures; Kathy Chasen Hay, Senior Vice President, Visual Effects at Skydance Media; and Janet Lewin, Senior Vice President of Lucasfilm Visual Effects and General Manager of Industrial Light & Magic; moderated by Lisa Cooke, VES, Executive Producer at Tippett Studio and former VES Chair of the Board of Directors.
Thank you for tuning in! Watch the replays at worldvfxday.com.
OPPOSITE PAGE, TOP TO BOTTOM: Brooke Breton, VES, recipient of the 2024 Founders Award with VES Board Chair Kim Davidson.
VES members enjoy the festive 2024 VES Honors event.
The Practical Brilliance of Spartacus
Younger moviegoers are probably familiar with Gladiator, the 2000 historical epic directed by Ridley Scott and starring Russell Crowe. The film won five Academy Awards including Best Picture, Best Actor and Best Visual Effects, and was nominated in 12 categories. It also grossed $460 million worldwide. A sequel, Gladiator II, recently arrived (see article, page 38). But perhaps less known is the granddaddy of all gladiator movies – 1960’s Spartacus, starring Kirk Douglas and directed by the legendary Stanley Kubrick. It was an epic made before CGI, relying on extensive practical methods. Many believe the movie influenced epic filmmaking for years afterward.
Spartacus was known for its massive sets of ancient Rome and very elaborately crafted costumes and props such as Roman legionnaire outfits and ancient weapons. In some scenes, 10,000 extras were used. For the last battle scene, 8,000 soldiers from the Spanish army played Roman soldiers and rebel slaves. Major portions of the movie were filmed in Spain.
Many matte paintings were used to extend physical sets and create expansive landscapes and cityscapes. Matte paintings were also combined with live-action shots. In addition, rear projection simulated distant backgrounds while many miniature models were used for large-scale scenes.
Significantly, instead of relying on visual effects, many dramatic sequences were created with practical stunt work during battle scenes and gladiator contests. One scene showing a gladiator fight between Kirk Douglas and Woody Strode was carefully choreographed with real weapons. Cinematographer Russell Metty used several in-camera effects such as creative lighting and forced-perspective techniques to create larger backdrops. The large-scale battle scenes included a profusion of pyrotechnics for explosions and flaming weapons. There was also an abundance of fake blood and prosthetics. Even by modern standards, the practical visual and special effects in Spartacus remain spectacular.